From ztdepyahoo at 163.com  Thu Aug  1 05:11:39 2013
From: ztdepyahoo at 163.com (丁老师)
Date: Thu, 1 Aug 2013 18:11:39 +0800 (CST)
Subject: [petsc-users] confusion about VecGetArray on a sequential vec
Message-ID: <28aec5c5.114b0.140395c8513.Coremail.ztdepyahoo@163.com>

I create a sequential vec seqx, and I get its value with VecGetArray. Since seqx is created only in process 0, why can every process output the value of seqx?

VecCreateSeq(PETSC_COMM_SELF,5,&seqx);
VecView(seqx,PETSC_VIEWER_STDOUT_WORLD);
double *seqarry;
VecGetArray(seqx,&seqarry);
for(int i=0;i<5;i++) cout<<seqarry[i]<<endl;

From dave.mayhem23 at gmail.com  Thu Aug  1 05:36:09 2013
From: dave.mayhem23 at gmail.com (Dave May)
Date: Thu, 1 Aug 2013 12:36:09 +0200
Subject: [petsc-users] confusion about VecGetArray on a sequential vec
In-Reply-To: <28aec5c5.114b0.140395c8513.Coremail.ztdepyahoo@163.com>
References: <28aec5c5.114b0.140395c8513.Coremail.ztdepyahoo@163.com>
Message-ID:

When you called

  VecCreateSeq(PETSC_COMM_SELF,5,&seqx);

you created a sequential vector on each process - not just on rank 0.

On 1 August 2013 12:11, 丁老师 wrote:
> I create a sequential vec seqx, and I get its value with VecGetArray.
> Since seqx is created only in process 0, why can every process output
> the value of seqx?
> [...]

From ztdepyahoo at 163.com  Thu Aug  1 06:08:25 2013
From: ztdepyahoo at 163.com (丁老师)
Date: Thu, 1 Aug 2013 19:08:25 +0800 (CST)
Subject: [petsc-users] confusion about VecGetArray on a sequential vec
In-Reply-To:
References: <28aec5c5.114b0.140395c8513.Coremail.ztdepyahoo@163.com>
Message-ID: <5c5bd170.11f4e.14039907f75.Coremail.ztdepyahoo@163.com>

But when we use VecView to view seqx, it gives values only with 1 process.

At 2013-08-01 18:36:09, "Dave May" wrote:
> When you called
>   VecCreateSeq(PETSC_COMM_SELF,5,&seqx);
> you created a sequential vector on each process - not just on rank 0.
> [...]

From dave.mayhem23 at gmail.com  Thu Aug  1 06:18:46 2013
From: dave.mayhem23 at gmail.com (Dave May)
Date: Thu, 1 Aug 2013 13:18:46 +0200
Subject: [petsc-users] confusion about VecGetArray on a sequential vec
In-Reply-To: <5c5bd170.11f4e.14039907f75.Coremail.ztdepyahoo@163.com>
References: <28aec5c5.114b0.140395c8513.Coremail.ztdepyahoo@163.com> <5c5bd170.11f4e.14039907f75.Coremail.ztdepyahoo@163.com>
Message-ID:

You should pass the same communicator into VecView() as was used when you created the vector. Do this:

  ierr = VecView(seqx,PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);

On 1 August 2013 13:08, 丁老师 wrote:
> But when we use VecView to view seqx, it gives values only with 1 process.
> [...]
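A self-contained sketch of the point Dave makes above. This is illustrative code built around the thread's snippets, not a program from either poster: every rank that calls VecCreateSeq() with PETSC_COMM_SELF owns its own private vector, and viewing it with the matching PETSC_VIEWER_STDOUT_SELF prints each rank's copy separately.

  #include <petscvec.h>

  int main(int argc, char **argv)
  {
    Vec         seqx;
    PetscScalar *seqarry;
    PetscMPIInt rank;
    PetscInt    i;

    PetscInitialize(&argc, &argv, NULL, NULL);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    /* PETSC_COMM_SELF: each rank creates its own 5-entry sequential vector */
    VecCreateSeq(PETSC_COMM_SELF, 5, &seqx);
    VecSet(seqx, (PetscScalar)rank);  /* make each rank's copy distinguishable */

    /* view with the same communicator the vector was created on */
    VecView(seqx, PETSC_VIEWER_STDOUT_SELF);

    VecGetArray(seqx, &seqarry);
    for (i = 0; i < 5; i++) {
      PetscPrintf(PETSC_COMM_SELF, "rank %d: seqarry[%d] = %g\n",
                  rank, (int)i, (double)PetscRealPart(seqarry[i]));
    }
    VecRestoreArray(seqx, &seqarry);

    VecDestroy(&seqx);
    PetscFinalize();
    return 0;
  }

Running it as, e.g., mpiexec -n 2 ./a.out shows both ranks printing a full 5-entry vector, which is exactly the behavior the original post found confusing.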
From knepley at gmail.com  Thu Aug  1 07:46:36 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 1 Aug 2013 20:46:36 +0800
Subject: [petsc-users] Use kspsolve repeatly
In-Reply-To: <2fc48037.8f45.140383da796.Coremail.ztdepyahoo@163.com>
References: <2fc48037.8f45.140383da796.Coremail.ztdepyahoo@163.com>
Message-ID:

On Thu, Aug 1, 2013 at 12:58 PM, 丁老师 wrote:
> I need to use KSPSolve(A,b,x) repeatedly in my code in the following style,
> but I have noticed from the system load monitor that during the run the
> code allocates new memory every step.
>
> I use the default setting for the PC.
> Could you please tell me how to resolve this problem?

1) There is not enough here to give us an idea what you are doing

2) Are you using the latest release? If so, it prevents you from adding new nonzeros to the matrix, which is my first guess.

   Matt

> for (int i=0;i<...;i++)
> {
>   MatSetValues( )....
>   KSPSolve(A,b,x)
> }

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

From ztdepyahoo at 163.com  Thu Aug  1 08:29:34 2013
From: ztdepyahoo at 163.com (丁老师)
Date: Thu, 1 Aug 2013 21:29:34 +0800 (CST)
Subject: [petsc-users] Use kspsolve repeatly
In-Reply-To:
References: <2fc48037.8f45.140383da796.Coremail.ztdepyahoo@163.com>
Message-ID: <28294a88.134fb.1403a11b96f.Coremail.ztdepyahoo@163.com>

I use the latest version of PETSc, and the matrix has the same nonzero pattern during each outer iteration.
I want to know whether KSPSolve allocates memory for the PC during each call.

At 2013-08-01 20:46:36, "Matthew Knepley" wrote:
> [...]
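A hedged sketch of the reuse pattern being discussed, written against the 3.4-era API (KSPSetOperators still takes a MatStructure flag in that release). The helper name, the loop bound nsteps, and the assembly details are placeholders rather than code from the original post; the point is that the KSP is created once and each pass only overwrites already-allocated nonzeros.

  #include <petscksp.h>

  /* Assumes A, b, x were created earlier and A's sparsity never changes. */
  PetscErrorCode SolveRepeatedly(Mat A, Vec b, Vec x, PetscInt nsteps)
  {
    KSP            ksp;
    PetscErrorCode ierr;
    PetscInt       step;

    PetscFunctionBegin;
    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    /* SAME_NONZERO_PATTERN lets the preconditioner reuse its symbolic work */
    ierr = KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

    for (step = 0; step < nsteps; step++) {
      /* refill A here with MatSetValues() touching only existing nonzeros;
         inserting a brand-new nonzero forces fresh allocation, which is one
         way memory can grow on every step */
      ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    }
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }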
From dave.mayhem23 at gmail.com  Thu Aug  1 09:16:48 2013
From: dave.mayhem23 at gmail.com (Dave May)
Date: Thu, 1 Aug 2013 16:16:48 +0200
Subject: [petsc-users] Use kspsolve repeatly
In-Reply-To: <28294a88.134fb.1403a11b96f.Coremail.ztdepyahoo@163.com>
References: <2fc48037.8f45.140383da796.Coremail.ztdepyahoo@163.com> <28294a88.134fb.1403a11b96f.Coremail.ztdepyahoo@163.com>
Message-ID:

If you want a meaningful answer to your questions, it would be much faster and simpler if you just sent your source code.

On 1 August 2013 15:29, 丁老师 wrote:
> I use the latest version of PETSc, and the matrix has the same nonzero
> pattern during each outer iteration.
> I want to know whether KSPSolve allocates memory for the PC during each call.
> [...]

From Wadud.Miah at awe.co.uk  Thu Aug  1 10:08:30 2013
From: Wadud.Miah at awe.co.uk (Wadud.Miah at awe.co.uk)
Date: Thu, 1 Aug 2013 15:08:30 +0000
Subject: [petsc-users] Matrix assembly error in PETSc
Message-ID: <201308011508.r71F8bgq024836@msw2.awe.co.uk>

Hello,

I am running an application code which works with 4, 8 and 18 processes but crashes with 16 processes. I have used MPICH2 and MVAPICH2 (both adhering to the MPI 3.0 standard) and both cause the same problem. I get the following error message:

[12]PETSC ERROR: MatSetValues_MPIAIJ() line 564 in src/mat/impls/aij/mpi/mpiaij.c
[12]PETSC ERROR: MatAssemblyEnd_MPIAIJ() line 680 in src/mat/impls/aij/mpi/mpiaij.c
[12]PETSC ERROR: MatAssemblyEnd() line 4879 in src/mat/interface/matrix.c
[12] --> Error in "MatAssemblyEnd()".
[12] --> Code: 63

However, I do not get this using the Intel MPI (which adheres to the MPI 2.0 standard) library. Any help will be greatly appreciated.

Regards,

--------------------------
Wadud Miah
HPC, Design and Theoretical Physics
Direct: 0118 98 56220
AWE, Aldermaston, Reading, RG7 4PR

___________________________________________________
The information in this email and in any attachment(s) is commercial in confidence. If you are not the named addressee(s) or if you receive this email in error then any distribution, copying or use of this communication or the information in it is strictly prohibited. Please notify us immediately by email at admin.internet(at)awe.co.uk, and then delete this message from your computer. While attachments are virus checked, AWE plc does not accept any liability in respect of any virus which is not detected.
AWE Plc
Registered in England and Wales
Registration No 02763902
AWE, Aldermaston, Reading, RG7 4PR
From bsmith at mcs.anl.gov  Thu Aug  1 13:02:05 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 1 Aug 2013 13:02:05 -0500
Subject: [petsc-users] Matrix assembly error in PETSc
In-Reply-To: <201308011508.r71F8bgq024836@msw2.awe.co.uk>
References: <201308011508.r71F8bgq024836@msw2.awe.co.uk>
Message-ID: <94E6CE15-F08C-4961-A184-FEC03FA241F3@mcs.anl.gov>

   Please always send the ENTIRE error message; it makes it much easier for us to deduce what is going on.

   Error code 63 is PETSC_ERR_ARG_OUTOFRANGE, which presumably is generated in MatSetValues_MPIAIJ(), meaning a row or column index is out of range. But since this is called within MatAssemblyEnd_MPIAIJ(), it should never be out of range. The most likely cause is data corruption in the values passed between processes with MPI. It is possible the error is due to bugs in the MPI implementation or due to memory corruption elsewhere. I would first recommend running the code with valgrind (an enormously powerful tool) to eliminate the chance of memory corruption: http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

   Let us know what happens,

   Barry

   MPI 2.0 vs MPI 3.0 is likely not the issue.

On Aug 1, 2013, at 10:08 AM, Wadud.Miah at awe.co.uk wrote:
> Hello,
>
> I am running an application code which works with 4, 8 and 18 processes but crashes with 16 processes.
> [...]

From mrosso at uci.edu  Thu Aug  1 13:14:05 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Thu, 01 Aug 2013 11:14:05 -0700
Subject: [petsc-users] GAMG speed
Message-ID: <51FAA56D.60106@uci.edu>

Hi,

I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG, as I was suggested to do in a previous thread. So far I am using GAMG with the default settings, i.e.
-pc_type gamg -pc_gamg_agg_nsmooths 1 The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? Finally, I did not try geometric multigrid: do you think it is worth a shot? Here are my current settings: I run with -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left and the output is: KSP Object: 4 MPI processes type: cg maximum iterations=10000 tolerances: relative=1e-08, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using UNPRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: gamg MG: type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 4 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 4 MPI processes type: bjacobi block Jacobi: number of blocks = 4 Local solve info for each block is in the following KSP and PC objects: [0] number of local blocks = 1, first local block number = 0 [0] local block number 0 KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 KSP Object: (mg_coarse_sub_) left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: preonly 1 MPI processes type: lu maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 LU: out-of-place factorization left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: lu tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 0 Factored matrix follows: factor fill ratio given 5, needed 4.13207 Factored matrix follows: Matrix Object: Matrix Object: 1 MPI processes type: seqaij rows=395, cols=395 package used to perform factorization: petsc total: nonzeros=132379, allocated nonzeros=132379 total number of mallocs used during MatSetValues calls =0 not using I-node routines 1 MPI processes type: seqaij linear system matrix = precond matrix: rows=0, cols=0 package used to perform factorization: petsc total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij Matrix Object:KSP Object: 1 MPI processes type: seqaij rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 not using I-node routines rows=395, cols=395 total: nonzeros=32037, allocated nonzeros=32037 total number of mallocs used during MatSetValues calls =0 not using I-node routines - - - - - - - - - - - - - - - - - - KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum 
iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 0 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 package used to perform factorization: petsc total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 not using I-node routines (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 0 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 package used to perform factorization: petsc total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 not using I-node routines [1] number of local blocks = 1, first local block number = 1 [1] local block number 0 - - - - - - - - - - - - - - - - - - [2] number of local blocks = 1, first local block number = 2 [2] local block number 0 - - - - - - - - - - - - - - - - - - [3] number of local blocks = 1, first local block number = 3 [3] local block number 0 - - - - - - - - - - - - - - - - - - linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=395, cols=395 total: nonzeros=32037, allocated nonzeros=32037 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_1_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=23918, cols=23918 total: nonzeros=818732, allocated nonzeros=818732 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_2_) 4 MPI processes type: 
jacobi
      linear system matrix = precond matrix:
      Matrix Object:   4 MPI processes
        type: mpiaij
        rows=262144, cols=262144
        total: nonzeros=1835008, allocated nonzeros=1835008
        total number of mallocs used during MatSetValues calls =0
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Matrix Object:   4 MPI processes
    type: mpiaij
    rows=262144, cols=262144
    total: nonzeros=1835008, allocated nonzeros=1835008
    total number of mallocs used during MatSetValues calls =0
#PETSc Option Table entries:
-ksp_view
-options_left
-pc_gamg_agg_nsmooths 1
-pc_type gamg
#End of PETSc Option Table entries
There are no unused options.

Thank you,
Michele

From bsmith at mcs.anl.gov  Thu Aug  1 13:21:22 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 1 Aug 2013 13:21:22 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <51FAA56D.60106@uci.edu>
References: <51FAA56D.60106@uci.edu>
Message-ID:

   What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid, and geometric multigrid should be a bit faster.

   For example, on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels.

   Barry

On Aug 1, 2013, at 1:14 PM, Michele Rosso wrote:

> Hi,
>
> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG, as I was suggested to do in a previous thread.
> So far I am using GAMG with the default settings, i.e.
>
> -pc_type gamg -pc_gamg_agg_nsmooths 1
>
> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly
> if there are any parameters worth looking into to achieve an even faster solution, for example number of levels and so on.
> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings?
> Finally, I did not try geometric multigrid: do you think it is worth a shot?
> [...]
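For the GAMG half of the question quoted above, the usual tuning knobs live on the command line. These are standard options in this generation of PETSc, but the values shown are illustrative starting points only, not recommendations taken from the thread:

  -pc_type gamg -pc_gamg_agg_nsmooths 1   # smoothed aggregation, as already used
  -pc_gamg_threshold 0.05                 # drop tolerance used when coarsening
  -mg_levels_ksp_type chebyshev           # smoother KSP on each level
  -mg_levels_pc_type jacobi               # smoother PC on each level
  -pc_mg_cycle_type v                     # V- or W-cycles
  -ksp_monitor_true_residual              # watch convergence while tuning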
From mrosso at uci.edu  Thu Aug  1 13:35:11 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Thu, 01 Aug 2013 11:35:11 -0700
Subject: [petsc-users] GAMG speed
In-Reply-To:
References: <51FAA56D.60106@uci.edu>
Message-ID: <51FAAA5F.20805@uci.edu>

Barry,

I am using a finite difference Cartesian uniform grid and DMDA, and so far it has not given me any problem. I am using a KSP solver (not SNES). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid - is that correct?
I tried to run my case with

-pc_type mg -da_refine 4

but it does not seem to use the -da_refine option:

mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left

KSP Object: 4 MPI processes
  type: cg
  maximum iterations=10000
  tolerances:  relative=1e-08, absolute=1e-50, divergence=10000
  left preconditioning
  using nonzero initial guess
  using UNPRECONDITIONED norm type for convergence test
PC Object: 4 MPI processes
  type: mg
    MG: type is MULTIPLICATIVE, levels=1 cycles=v
      Cycles per PCApply=1
      Not using Galerkin computed coarse grid matrices
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_levels_0_) 4 MPI processes
      type: chebyshev
        Chebyshev: eigenvalue estimates:  min = 0.134543, max = 1.47998
        Chebyshev: estimated using:  [0 0.1; 0 1.1]
        KSP Object: (mg_levels_0_est_) 4 MPI processes
          type: gmres
            GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
            GMRES: happy breakdown tolerance 1e-30
          maximum iterations=10, initial guess is zero
          tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
          left preconditioning
          using NONE norm type for convergence test
        PC Object: (mg_levels_0_) 4 MPI processes
          type: sor
            SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
          linear system matrix = precond matrix:
          Matrix Object:   4 MPI processes
            type: mpiaij
            rows=262144, cols=262144
            total: nonzeros=1835008, allocated nonzeros=1835008
            total number of mallocs used during MatSetValues calls =0
      maximum iterations=1, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_0_) 4 MPI processes
      type: sor
        SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1
      linear system matrix = precond matrix:
      Matrix Object:   4 MPI processes
        type: mpiaij
        rows=262144, cols=262144
        total: nonzeros=1835008, allocated nonzeros=1835008
        total number of mallocs used during MatSetValues calls =0
  linear system matrix = precond matrix:
  Matrix Object:   4 MPI processes
    type: mpiaij
    rows=262144, cols=262144
    total: nonzeros=1835008, allocated nonzeros=1835008
    total number of mallocs used during MatSetValues calls =0
 Solution       =    1.53600013 sec
#PETSc Option Table entries:
-da_refine 4
-ksp_view
-options_left
-pc_type mg
#End of PETSc Option Table entries
There is one unused database option. It is:
Option left: name:-da_refine value: 4

Michele

On 08/01/2013 11:21 AM, Barry Smith wrote:
>    What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid, and geometric multigrid should be a bit faster.
>
>    For example, on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels.
>
>    Barry
> [...]

From bsmith at mcs.anl.gov  Thu Aug  1 13:48:33 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 1 Aug 2013 13:48:33 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <51FAAA5F.20805@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu>
Message-ID:

   Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c

   Barry

On Aug 1, 2013, at 1:35 PM, Michele Rosso wrote:

> Barry,
>
> I am using a finite difference Cartesian uniform grid and DMDA, and so far it has not given me any problem. I am using a KSP solver (not SNES). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid - is that correct?
> [...]
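What Barry's suggestion looks like in outline - a sketch with made-up grid sizes and callback names, not Michele's actual code. Once the KSP owns the DMDA, options like -da_refine and -pc_type mg can build the whole grid hierarchy:

  #include <petscdmda.h>
  #include <petscksp.h>

  /* user callbacks; 3.4-era signatures, sketched at the end of this thread */
  extern PetscErrorCode ComputeRHS(KSP, Vec, void*);
  extern PetscErrorCode ComputeMatrix(KSP, Mat, Mat, MatStructure*, void*);

  PetscErrorCode SolveWithGeometricMG(void)
  {
    DM             da;
    KSP            ksp;
    Vec            x;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    /* negative sizes mean the value can be overridden from the command line */
    ierr = DMDACreate3d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                        DMDA_BOUNDARY_NONE, DMDA_STENCIL_STAR, -17, -17, -17,
                        PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1,
                        NULL, NULL, NULL, &da);CHKERRQ(ierr);
    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    ierr = KSPSetDM(ksp, da);CHKERRQ(ierr);          /* lets -da_refine take effect */
    ierr = KSPSetComputeOperators(ksp, ComputeMatrix, NULL);CHKERRQ(ierr);
    ierr = KSPSetComputeRHS(ksp, ComputeRHS, NULL);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, NULL, NULL);CHKERRQ(ierr);  /* the DM supplies the vectors */
    ierr = KSPGetSolution(ksp, &x);CHKERRQ(ierr);
    /* ... use x here, before destroying the KSP that owns it ... */
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    ierr = DMDestroy(&da);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

With this structure, a run such as ./app -pc_type mg -da_refine 4 -ksp_view should report several multigrid levels instead of the single level seen in the output above.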
From knepley at gmail.com  Thu Aug  1 14:45:33 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 2 Aug 2013 03:45:33 +0800
Subject: [petsc-users] Use kspsolve repeatly
In-Reply-To: <28294a88.134fb.1403a11b96f.Coremail.ztdepyahoo@163.com>
References: <2fc48037.8f45.140383da796.Coremail.ztdepyahoo@163.com> <28294a88.134fb.1403a11b96f.Coremail.ztdepyahoo@163.com>
Message-ID:

On Thu, Aug 1, 2013 at 9:29 PM, 丁老师 wrote:
> I use the latest version of PETSc, and the matrix has the same nonzero
> pattern during each outer iteration.
> I want to know whether KSPSolve allocates memory for the PC during each call.

It's possible if you are asking for a refactorization every time. Without the output of -ksp_view, I have no idea what solver you are using. It's easy to check whether memory is being leaked using -malloc_dump.

   Matt

> [...]

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
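A concrete way to act on that suggestion (the executable name here is a placeholder): run a small case to completion as

  ./myapp -ksp_view -malloc_dump

-ksp_view reports which solver and preconditioner are actually being used, and -malloc_dump prints, at PetscFinalize(), any memory PETSc allocated but never freed, which separates a real per-step leak from the one-time setup cost of the first KSPSolve().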
From mrosso at uci.edu Thu Aug 1 14:47:39 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Thu, 01 Aug 2013 12:47:39 -0700
Subject: [petsc-users] GAMG speed
In-Reply-To: 
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu>
Message-ID: <51FABB5B.50708@uci.edu>

Barry,
you are correct, I did not use it. I think I now see where the problem is. Correct me if I am wrong, but for the
geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through
KSPSetComputeOperators and KSPSetComputeRHS.
I do not do that, I simply build a rhs vector and a matrix and then I solve the system.
If you confirm what I just wrote, I will try to modify my code accordingly and get back to you.
Thank you,
Michele

On 08/01/2013 11:48 AM, Barry Smith wrote: > Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c > > Barry > > On Aug 1, 2013, at 1:35 PM, Michele Rosso wrote: > >> Barry, >> >> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >> I tried to run my case with >> >> >> -pc_type mg -da_refine 4 >> >> >> >> but it does not seem to use the -da_refine option: >> >> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >> >> >> KSP Object: 4 MPI processes >> type: cg >> maximum iterations=10000 >> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >> left preconditioning >> using nonzero initial guess >> using UNPRECONDITIONED norm type for convergence test >> PC Object: 4 MPI processes >> type: mg >> MG: type is MULTIPLICATIVE, levels=1 cycles=v >> Cycles per PCApply=1 >> Not using Galerkin computed coarse grid matrices >> Coarse grid solver -- level ------------------------------- >> KSP Object: (mg_levels_0_) 4 MPI processes >> type: chebyshev >> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >> Chebyshev: estimated using: [0 0.1; 0 1.1] >> KSP Object: (mg_levels_0_est_) 4 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (mg_levels_0_) 4 MPI processes >> type: sor >> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >> linear system matrix = precond matrix: >> Matrix Object: 4 MPI processes >> type: mpiaij >> rows=262144, cols=262144 >> total: nonzeros=1835008, allocated nonzeros=1835008 >> total number of mallocs used during MatSetValues calls =0 >> maximum iterations=1, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (mg_levels_0_) 4 MPI processes >> type: sor >> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >> linear system matrix = precond matrix: >> Matrix Object: 4 MPI processes >> type: mpiaij >> rows=262144, cols=262144 >> total: nonzeros=1835008, allocated nonzeros=1835008 >> total number of mallocs used during MatSetValues calls =0 >> linear system matrix = precond
matrix: >> Matrix Object: 4 MPI processes >> type: mpiaij >> rows=262144, cols=262144 >> total: nonzeros=1835008, allocated nonzeros=1835008 >> total number of mallocs used during MatSetValues calls =0 >> Solution = 1.53600013 sec >> #PETSc Option Table entries: >> -da_refine 4 >> -ksp_view >> -options_left >> -pc_type mg >> #End of PETSc Option Table entries >> There is one unused database option. It is: >> Option left: name:-da_refine value: 4 >> >> Michele >> >> On 08/01/2013 11:21 AM, Barry Smith wrote: >>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>> >>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>> >>> >>> Barry >>> >>> >>> On Aug 1, 2013, at 1:14 PM, Michele Rosso wrote: >>> >>>> Hi, >>>> >>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>> So far I am using GAMG with the default settings, i.e. >>>> >>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>> >>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
>>>> >>>> Here are my current settings: >>>> >>>> I run with >>>> >>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>> >>>> and the output is: >>>> >>>> KSP Object: 4 MPI processes >>>> type: cg >>>> maximum iterations=10000 >>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> using nonzero initial guess >>>> using UNPRECONDITIONED norm type for convergence test >>>> PC Object: 4 MPI processes >>>> type: gamg >>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>> Cycles per PCApply=1 >>>> Using Galerkin computed coarse grid matrices >>>> Coarse grid solver -- level ------------------------------- >>>> KSP Object: (mg_coarse_) 4 MPI processes >>>> type: preonly >>>> maximum iterations=1, initial guess is zero >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> using NONE norm type for convergence test >>>> PC Object: (mg_coarse_) 4 MPI processes >>>> type: bjacobi >>>> block Jacobi: number of blocks = 4 >>>> Local solve info for each block is in the following KSP and PC objects: >>>> [0] number of local blocks = 1, first local block number = 0 >>>> [0] local block number 0 >>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>> type: preonly >>>> maximum iterations=1, initial guess is zero >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>> using NONE norm type for convergence test >>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>> type: preonly >>>> 1 MPI processes >>>> type: lu >>>> maximum iterations=1, initial guess is zero >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> LU: out-of-place factorization >>>> left preconditioning >>>> using NONE norm type for convergence test >>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>> type: lu >>>> tolerance for zero pivot 2.22045e-14 >>>> using diagonal shift on blocks to prevent zero pivot >>>> matrix ordering: nd >>>> LU: out-of-place factorization >>>> tolerance for zero pivot 2.22045e-14 >>>> using diagonal shift on blocks to prevent zero pivot >>>> matrix ordering: nd >>>> factor fill ratio given 5, needed 0 >>>> Factored matrix follows: >>>> factor fill ratio given 5, needed 4.13207 >>>> Factored matrix follows: >>>> Matrix Object: Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=395, cols=395 >>>> package used to perform factorization: petsc >>>> total: nonzeros=132379, allocated nonzeros=132379 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> 1 MPI processes >>>> type: seqaij >>>> linear system matrix = precond matrix: >>>> rows=0, cols=0 >>>> package used to perform factorization: petsc >>>> total: nonzeros=1, allocated nonzeros=1 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> linear system matrix = precond matrix: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> Matrix Object:KSP Object: 1 MPI processes >>>> type: seqaij >>>> rows=0, cols=0 >>>> total: nonzeros=0, allocated nonzeros=0 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> rows=395, cols=395 >>>> total: nonzeros=32037, allocated nonzeros=32037 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> - - - - - - - - - - - - - - - - - - >>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>> type: preonly >>>> maximum iterations=1, initial guess is zero >>>> 
tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> using NONE norm type for convergence test >>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>> type: lu >>>> LU: out-of-place factorization >>>> tolerance for zero pivot 2.22045e-14 >>>> using diagonal shift on blocks to prevent zero pivot >>>> matrix ordering: nd >>>> factor fill ratio given 5, needed 0 >>>> Factored matrix follows: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=0, cols=0 >>>> package used to perform factorization: petsc >>>> total: nonzeros=1, allocated nonzeros=1 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> linear system matrix = precond matrix: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=0, cols=0 >>>> total: nonzeros=0, allocated nonzeros=0 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> (mg_coarse_sub_) 1 MPI processes >>>> type: preonly >>>> maximum iterations=1, initial guess is zero >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> using NONE norm type for convergence test >>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>> type: lu >>>> LU: out-of-place factorization >>>> tolerance for zero pivot 2.22045e-14 >>>> using diagonal shift on blocks to prevent zero pivot >>>> matrix ordering: nd >>>> factor fill ratio given 5, needed 0 >>>> Factored matrix follows: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=0, cols=0 >>>> package used to perform factorization: petsc >>>> total: nonzeros=1, allocated nonzeros=1 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> linear system matrix = precond matrix: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=0, cols=0 >>>> total: nonzeros=0, allocated nonzeros=0 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> [1] number of local blocks = 1, first local block number = 1 >>>> [1] local block number 0 >>>> - - - - - - - - - - - - - - - - - - >>>> [2] number of local blocks = 1, first local block number = 2 >>>> [2] local block number 0 >>>> - - - - - - - - - - - - - - - - - - >>>> [3] number of local blocks = 1, first local block number = 3 >>>> [3] local block number 0 >>>> - - - - - - - - - - - - - - - - - - >>>> linear system matrix = precond matrix: >>>> Matrix Object: 4 MPI processes >>>> type: mpiaij >>>> rows=395, cols=395 >>>> total: nonzeros=32037, allocated nonzeros=32037 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node (on process 0) routines >>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>> type: chebyshev >>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>> maximum iterations=2 >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> using nonzero initial guess >>>> using NONE norm type for convergence test >>>> PC Object: (mg_levels_1_) 4 MPI processes >>>> type: jacobi >>>> linear system matrix = precond matrix: >>>> Matrix Object: 4 MPI processes >>>> type: mpiaij >>>> rows=23918, cols=23918 >>>> total: nonzeros=818732, allocated nonzeros=818732 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node (on process 0) routines >>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>> Down solver 
(pre-smoother) on level 2 ------------------------------- >>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>> type: chebyshev >>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>> maximum iterations=2 >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> using nonzero initial guess >>>> using NONE norm type for convergence test >>>> PC Object: (mg_levels_2_) 4 MPI processes >>>> type: jacobi >>>> linear system matrix = precond matrix: >>>> Matrix Object: 4 MPI processes >>>> type: mpiaij >>>> rows=262144, cols=262144 >>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>> total number of mallocs used during MatSetValues calls =0 >>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>> linear system matrix = precond matrix: >>>> Matrix Object: 4 MPI processes >>>> type: mpiaij >>>> rows=262144, cols=262144 >>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>> total number of mallocs used during MatSetValues calls =0 >>>> #PETSc Option Table entries: >>>> -ksp_view >>>> -options_left >>>> -pc_gamg_agg_nsmooths 1 >>>> -pc_type gamg >>>> #End of PETSc Option Table entries >>>> There are no unused options. >>>> >>>> >>>> Thank you, >>>> Michele >>>> >> >
From bsmith at mcs.anl.gov Thu Aug 1 15:04:59 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 1 Aug 2013 15:04:59 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <51FABB5B.50708@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu>
	<51FABB5B.50708@uci.edu>
Message-ID: 

You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix-matrix-matrix product, so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working.

If it doesn't work, send us all the output.

Barry

On Aug 1, 2013, at 2:47 PM, Michele Rosso wrote: > Barry, > you are correct, I did not use it. I think I now see where the problem is. Correct me if I am wrong, but for the > geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through > KSPSetComputeOperators and KSPSetComputeRHS. > I do not do that, I simply build a rhs vector and a matrix and then I solve the system. > If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. > Thank you, > Michele > > On 08/01/2013 11:48 AM, Barry Smith wrote: >> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >> >> Barry >> >> On Aug 1, 2013, at 1:35 PM, Michele Rosso wrote: >> >>> Barry, >>> >>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct?
>>> I tried to run my case with >>> >>> >>> -pc_type mg -da_refine 4 >>> >>> >>> >>> but it does not seem to use the -da_refine option: >>> >>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>> >>> >>> KSP Object: 4 MPI processes >>> type: cg >>> maximum iterations=10000 >>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>> left preconditioning >>> using nonzero initial guess >>> using UNPRECONDITIONED norm type for convergence test >>> PC Object: 4 MPI processes >>> type: mg >>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>> Cycles per PCApply=1 >>> Not using Galerkin computed coarse grid matrices >>> Coarse grid solver -- level ------------------------------- >>> KSP Object: (mg_levels_0_) 4 MPI processes >>> type: chebyshev >>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>> left preconditioning >>> using NONE norm type for convergence test >>> PC Object: (mg_levels_0_) 4 MPI processes >>> type: sor >>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>> linear system matrix = precond matrix: >>> Matrix Object: 4 MPI processes >>> type: mpiaij >>> rows=262144, cols=262144 >>> total: nonzeros=1835008, allocated nonzeros=1835008 >>> total number of mallocs used during MatSetValues calls =0 >>> maximum iterations=1, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>> left preconditioning >>> using NONE norm type for convergence test >>> PC Object: (mg_levels_0_) 4 MPI processes >>> type: sor >>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>> linear system matrix = precond matrix: >>> Matrix Object: 4 MPI processes >>> type: mpiaij >>> rows=262144, cols=262144 >>> total: nonzeros=1835008, allocated nonzeros=1835008 >>> total number of mallocs used during MatSetValues calls =0 >>> linear system matrix = precond matrix: >>> Matrix Object: 4 MPI processes >>> type: mpiaij >>> rows=262144, cols=262144 >>> total: nonzeros=1835008, allocated nonzeros=1835008 >>> total number of mallocs used during MatSetValues calls =0 >>> Solution = 1.53600013 sec >>> #PETSc Option Table entries: >>> -da_refine 4 >>> -ksp_view >>> -options_left >>> -pc_type mg >>> #End of PETSc Option Table entries >>> There is one unused database option. It is: >>> Option left: name:-da_refine value: 4 >>> >>> Michele >>> >>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>> >>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>> >>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>> >>>> >>>> Barry >>>> >>>> >>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>> >>>> wrote: >>>> >>>> >>>>> Hi, >>>>> >>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. 
>>>>> So far I am using GAMG with the default settings, i.e. >>>>> >>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>> >>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? >>>>> >>>>> Here are my current settings: >>>>> >>>>> I run with >>>>> >>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>> >>>>> and the output is: >>>>> >>>>> KSP Object: 4 MPI processes >>>>> type: cg >>>>> maximum iterations=10000 >>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> using nonzero initial guess >>>>> using UNPRECONDITIONED norm type for convergence test >>>>> PC Object: 4 MPI processes >>>>> type: gamg >>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>> Cycles per PCApply=1 >>>>> Using Galerkin computed coarse grid matrices >>>>> Coarse grid solver -- level ------------------------------- >>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>> type: preonly >>>>> maximum iterations=1, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> using NONE norm type for convergence test >>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>> type: bjacobi >>>>> block Jacobi: number of blocks = 4 >>>>> Local solve info for each block is in the following KSP and PC objects: >>>>> [0] number of local blocks = 1, first local block number = 0 >>>>> [0] local block number 0 >>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>> type: preonly >>>>> maximum iterations=1, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>> using NONE norm type for convergence test >>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>> type: preonly >>>>> 1 MPI processes >>>>> type: lu >>>>> maximum iterations=1, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> LU: out-of-place factorization >>>>> left preconditioning >>>>> using NONE norm type for convergence test >>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>> type: lu >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: nd >>>>> LU: out-of-place factorization >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: nd >>>>> factor fill ratio given 5, needed 0 >>>>> Factored matrix follows: >>>>> factor fill ratio given 5, needed 4.13207 >>>>> Factored matrix follows: >>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=395, cols=395 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> 1 MPI processes >>>>> type: seqaij >>>>> linear system matrix = precond matrix: >>>>> rows=0, cols=0 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=1, allocated nonzeros=1 >>>>> total number of mallocs used during MatSetValues calls 
=0 >>>>> not using I-node routines >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> Matrix Object:KSP Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=0, cols=0 >>>>> total: nonzeros=0, allocated nonzeros=0 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> rows=395, cols=395 >>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> - - - - - - - - - - - - - - - - - - >>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>> type: preonly >>>>> maximum iterations=1, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> using NONE norm type for convergence test >>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>> type: lu >>>>> LU: out-of-place factorization >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: nd >>>>> factor fill ratio given 5, needed 0 >>>>> Factored matrix follows: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=0, cols=0 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=1, allocated nonzeros=1 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=0, cols=0 >>>>> total: nonzeros=0, allocated nonzeros=0 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> (mg_coarse_sub_) 1 MPI processes >>>>> type: preonly >>>>> maximum iterations=1, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> using NONE norm type for convergence test >>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>> type: lu >>>>> LU: out-of-place factorization >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: nd >>>>> factor fill ratio given 5, needed 0 >>>>> Factored matrix follows: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=0, cols=0 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=1, allocated nonzeros=1 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=0, cols=0 >>>>> total: nonzeros=0, allocated nonzeros=0 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> [1] number of local blocks = 1, first local block number = 1 >>>>> [1] local block number 0 >>>>> - - - - - - - - - - - - - - - - - - >>>>> [2] number of local blocks = 1, first local block number = 2 >>>>> [2] local block number 0 >>>>> - - - - - - - - - - - - - - - - - - >>>>> [3] number of local blocks = 1, first local block number = 3 >>>>> [3] local block number 0 >>>>> - - - - - - - - - - - - - - - - - - >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 4 MPI processes >>>>> type: mpiaij >>>>> rows=395, cols=395 >>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node (on process 0) routines >>>>> Down solver (pre-smoother) on level 1 
------------------------------- >>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>> type: chebyshev >>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>> maximum iterations=2 >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> using nonzero initial guess >>>>> using NONE norm type for convergence test >>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>> type: jacobi >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 4 MPI processes >>>>> type: mpiaij >>>>> rows=23918, cols=23918 >>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node (on process 0) routines >>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>> type: chebyshev >>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>> maximum iterations=2 >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> using nonzero initial guess >>>>> using NONE norm type for convergence test >>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>> type: jacobi >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 4 MPI processes >>>>> type: mpiaij >>>>> rows=262144, cols=262144 >>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 4 MPI processes >>>>> type: mpiaij >>>>> rows=262144, cols=262144 >>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> #PETSc Option Table entries: >>>>> -ksp_view >>>>> -options_left >>>>> -pc_gamg_agg_nsmooths 1 >>>>> -pc_type gamg >>>>> #End of PETSc Option Table entries >>>>> There are no unused options. >>>>> >>>>> >>>>> Thank you, >>>>> Michele >>>>> >> > From mrosso at uci.edu Thu Aug 1 16:47:48 2013 From: mrosso at uci.edu (Michele Rosso) Date: Thu, 01 Aug 2013 14:47:48 -0700 Subject: [petsc-users] GAMG speed In-Reply-To: References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> Message-ID: <51FAD784.6080302@uci.edu> Barry, I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left For the test I use a 64^3 grid and 4 processors. The output is: [2]PETSC ERROR: --------------------- Error Message ------------------------------------ [2]PETSC ERROR: Arguments are incompatible! [2]PETSC ERROR: Zero diagonal on row 0! [2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [2]PETSC ERROR: See docs/changes/index.html for recent updates. [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [2]PETSC ERROR: See docs/index.html for manual pages. 
[2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 [2]PETSC ERROR: Configure options [2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c --------------------- Error Message ------------------------------------ [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h Arguments are incompatible! [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c Zero diagonal on row 0! [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c ------------------------------------------------------------------------ [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [3]PETSC ERROR: Arguments are incompatible! [3]PETSC ERROR: Zero diagonal on row 0! [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [3]PETSC ERROR: See docs/changes/index.html for recent updates. [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [3]PETSC ERROR: See docs/index.html for manual pages. [3]PETSC ERROR: ------------------------------------------------------------------------ See docs/index.html for manual pages. 
[3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 [3]PETSC ERROR: Configure options [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: --------------------- Error Message ------------------------------------ MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c [1]PETSC ERROR: Arguments are incompatible! [1]PETSC ERROR: Zero diagonal on row 0! [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [1]PETSC ERROR: See docs/changes/index.html for recent updates. [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [1]PETSC ERROR: See docs/index.html for manual pages. [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 [1]PETSC ERROR: Configure options [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c [1]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in 
src/ksp/ksp/impls/cheby/cheby.c [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c ------------------------------------------------------------------------ [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 [0]PETSC ERROR: Configure options [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c #PETSc Option Table entries: -da_refine 4 -ksp_view -options_left -pc_mg_galerkin -pc_type mg #End of PETSc Option Table entries There is one unused database option. It is: Option left: name:-da_refine value: 4

Here is the code I use to setup DMDA and KSP:

call DMDACreate3d( PETSC_COMM_WORLD , &
 & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, &
 & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, &
 & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , &
 & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr)

! Create Global Vectors
call DMCreateGlobalVector(da,b,ierr)
call VecDuplicate(b,x,ierr)

! Set initial guess for first use of the module to 0
call VecSet(x,0.0_rp,ierr)

! Create matrix
call DMCreateMatrix(da,MATMPIAIJ,A,ierr)

! Create solver
call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
call KSPSetDM(ksp,da,ierr)
call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr)
! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr)
call KSPSetType(ksp,KSPCG,ierr)
call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual
call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr)
call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,&
 & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr)
! To allow using option from command line
call KSPSetFromOptions(ksp,ierr)

Michele
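[One plausible reading of the failure, anticipating the reply further down; this note and the sketch are editorial, not part of the thread. Once KSPSetDM() is called with the DM left active, the KSP builds its matrix from the DM and expects a callback to fill it, so values assembled into a separately created A may never reach the solver, which then sees an operator with an empty diagonal - consistent with "Zero diagonal on row 0" inside the SOR smoother. The callback pattern of src/ksp/ksp/examples/tutorials/ex34.c assembles on whatever grid the solver hands back. A C sketch against the assumed 3.4 signatures, with a unit-spacing 7-point Laplacian standing in for the actual operator:]

#include <petscksp.h>
#include <petscdmda.h>

static PetscErrorCode ComputeMatrix(KSP ksp, Mat J, Mat A, MatStructure *str, void *ctx)
{
  DM             da;
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm;
  MatStencil     row, col[7];
  PetscScalar    v[7];
  PetscErrorCode ierr;

  ierr = KSPGetDM(ksp, &da);CHKERRQ(ierr);   /* the grid the solver is actually using */
  ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  for (k = zs; k < zs + zm; k++) {
    for (j = ys; j < ys + ym; j++) {
      for (i = xs; i < xs + xm; i++) {
        row.i = i; row.j = j; row.k = k;
        /* 7-point Laplacian with unit spacing; the periodic wrap-around
           is handled by the DMDA, so no boundary special cases here */
        v[0] =  6.0; col[0] = row;
        v[1] = -1.0; col[1].i = i-1; col[1].j = j;   col[1].k = k;
        v[2] = -1.0; col[2].i = i+1; col[2].j = j;   col[2].k = k;
        v[3] = -1.0; col[3].i = i;   col[3].j = j-1; col[3].k = k;
        v[4] = -1.0; col[4].i = i;   col[4].j = j+1; col[4].k = k;
        v[5] = -1.0; col[5].i = i;   col[5].j = j;   col[5].k = k-1;
        v[6] = -1.0; col[6].i = i;   col[6].j = j;   col[6].k = k+1;
        ierr = MatSetValuesStencil(A, 1, &row, 7, col, v, INSERT_VALUES);CHKERRQ(ierr);
      }
    }
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  *str = SAME_NONZERO_PATTERN;
  return 0;
}

[In the setup above, the pair KSPSetComputeOperators(ksp, ComputeMatrix, NULL) and KSPSetComputeRHS(ksp, ComputeRHS, NULL) would then stand in for the explicit KSPSetOperators() call, so that every level of the hierarchy should get a properly assembled operator.]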
On 08/01/2013 01:04 PM, Barry Smith wrote: > You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix-matrix-matrix product, so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working. > > If it doesn't work, send us all the output. > > Barry > > > On Aug 1, 2013, at 2:47 PM, Michele Rosso wrote: > >> Barry, >> you are correct, I did not use it. I think I now see where the problem is. Correct me if I am wrong, but for the >> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >> KSPSetComputeOperators and KSPSetComputeRHS. >> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >> Thank you, >> Michele >> >> On 08/01/2013 11:48 AM, Barry Smith wrote: >>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>> >>> Barry >>> >>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>> >>> wrote: >>> >>> >>>> Barry, >>>> >>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >>>> I tried to run my case with >>>> >>>> >>>> -pc_type mg -da_refine 4 >>>> >>>> >>>> >>>> but it does not seem to use the -da_refine option: >>>> >>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>> >>>> >>>> KSP Object: 4 MPI processes >>>> type: cg >>>> maximum iterations=10000 >>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000
>>>> left preconditioning >>>> using NONE norm type for convergence test >>>> PC Object: (mg_levels_0_) 4 MPI processes >>>> type: sor >>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>> linear system matrix = precond matrix: >>>> Matrix Object: 4 MPI processes >>>> type: mpiaij >>>> rows=262144, cols=262144 >>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>> total number of mallocs used during MatSetValues calls =0 >>>> linear system matrix = precond matrix: >>>> Matrix Object: 4 MPI processes >>>> type: mpiaij >>>> rows=262144, cols=262144 >>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>> total number of mallocs used during MatSetValues calls =0 >>>> Solution = 1.53600013 sec >>>> #PETSc Option Table entries: >>>> -da_refine 4 >>>> -ksp_view >>>> -options_left >>>> -pc_type mg >>>> #End of PETSc Option Table entries >>>> There is one unused database option. It is: >>>> Option left: name:-da_refine value: 4 >>>> >>>> Michele >>>> >>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>> >>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>> >>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>> >>>>> >>>>> Barry >>>>> >>>>> >>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>> >>>>> wrote: >>>>> >>>>> >>>>>> Hi, >>>>>> >>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>> So far I am using GAMG with the default settings, i.e. >>>>>> >>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>> >>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
>>>>>> >>>>>> Here are my current settings: >>>>>> >>>>>> I run with >>>>>> >>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>> >>>>>> and the output is: >>>>>> >>>>>> KSP Object: 4 MPI processes >>>>>> type: cg >>>>>> maximum iterations=10000 >>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> using nonzero initial guess >>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>> PC Object: 4 MPI processes >>>>>> type: gamg >>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>> Cycles per PCApply=1 >>>>>> Using Galerkin computed coarse grid matrices >>>>>> Coarse grid solver -- level ------------------------------- >>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>> type: preonly >>>>>> maximum iterations=1, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> using NONE norm type for convergence test >>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>> type: bjacobi >>>>>> block Jacobi: number of blocks = 4 >>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>> [0] local block number 0 >>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>> type: preonly >>>>>> maximum iterations=1, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>> using NONE norm type for convergence test >>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>> type: preonly >>>>>> 1 MPI processes >>>>>> type: lu >>>>>> maximum iterations=1, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> LU: out-of-place factorization >>>>>> left preconditioning >>>>>> using NONE norm type for convergence test >>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>> type: lu >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: nd >>>>>> LU: out-of-place factorization >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: nd >>>>>> factor fill ratio given 5, needed 0 >>>>>> Factored matrix follows: >>>>>> factor fill ratio given 5, needed 4.13207 >>>>>> Factored matrix follows: >>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=395, cols=395 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> 1 MPI processes >>>>>> type: seqaij >>>>>> linear system matrix = precond matrix: >>>>>> rows=0, cols=0 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=0, cols=0 >>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> rows=395, cols=395 >>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using 
I-node routines >>>>>> - - - - - - - - - - - - - - - - - - >>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>> type: preonly >>>>>> maximum iterations=1, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> using NONE norm type for convergence test >>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>> type: lu >>>>>> LU: out-of-place factorization >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: nd >>>>>> factor fill ratio given 5, needed 0 >>>>>> Factored matrix follows: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=0, cols=0 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=0, cols=0 >>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>> type: preonly >>>>>> maximum iterations=1, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> using NONE norm type for convergence test >>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>> type: lu >>>>>> LU: out-of-place factorization >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: nd >>>>>> factor fill ratio given 5, needed 0 >>>>>> Factored matrix follows: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=0, cols=0 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=0, cols=0 >>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>> [1] local block number 0 >>>>>> - - - - - - - - - - - - - - - - - - >>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>> [2] local block number 0 >>>>>> - - - - - - - - - - - - - - - - - - >>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>> [3] local block number 0 >>>>>> - - - - - - - - - - - - - - - - - - >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 4 MPI processes >>>>>> type: mpiaij >>>>>> rows=395, cols=395 >>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node (on process 0) routines >>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>> type: chebyshev >>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>> maximum iterations=2 >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> using nonzero initial guess >>>>>> using NONE norm type for convergence test >>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>> type: jacobi >>>>>> linear system 
matrix = precond matrix: >>>>>> Matrix Object: 4 MPI processes >>>>>> type: mpiaij >>>>>> rows=23918, cols=23918 >>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node (on process 0) routines >>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>> type: chebyshev >>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>> maximum iterations=2 >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> using nonzero initial guess >>>>>> using NONE norm type for convergence test >>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>> type: jacobi >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 4 MPI processes >>>>>> type: mpiaij >>>>>> rows=262144, cols=262144 >>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 4 MPI processes >>>>>> type: mpiaij >>>>>> rows=262144, cols=262144 >>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> #PETSc Option Table entries: >>>>>> -ksp_view >>>>>> -options_left >>>>>> -pc_gamg_agg_nsmooths 1 >>>>>> -pc_type gamg >>>>>> #End of PETSc Option Table entries >>>>>> There are no unused options. >>>>>> >>>>>> >>>>>> Thank you, >>>>>> Michele >>>>>> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Aug 1 17:11:02 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 1 Aug 2013 17:11:02 -0500 Subject: [petsc-users] GAMG speed In-Reply-To: <51FAD784.6080302@uci.edu> References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> Message-ID: <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. Barry On Aug 1, 2013, at 4:47 PM, Michele Rosso wrote: > Barry, > > I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left > > For the test I use a 64^3 grid and 4 processors. > > The output is: > > [2]PETSC ERROR: --------------------- Error Message ------------------------------------ > [2]PETSC ERROR: Arguments are incompatible! > [2]PETSC ERROR: Zero diagonal on row 0! > [2]PETSC ERROR: ------------------------------------------------------------------------ > [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [2]PETSC ERROR: See docs/changes/index.html for recent updates. > [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [2]PETSC ERROR: See docs/index.html for manual pages. 
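For reference, the -pc_mg_galerkin option used in the failing run makes PCMG
build every coarse-level operator algebraically from the fine-level one:
with P the interpolation from a coarse grid to the next finer one, each
coarse matrix is the Galerkin product

    A_coarse = P^T A_fine P

(computed internally with MatPtAP). A consequence worth noting: if the
fine-grid matrix has empty rows or zero diagonals, the coarse levels
inherit them, so the smoother can fail on every level, not just the finest.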
From mrosso at uci.edu  Thu Aug  1 17:21:09 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Thu, 01 Aug 2013 15:21:09 -0700
Subject: [petsc-users] GAMG speed
In-Reply-To: <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov>
Message-ID: <51FADF55.50003@uci.edu>

Barry,

here is the fragment of code where I set the rhs term and the matrix:

    ! Create matrix
    call form_matrix( A , qrho, lsf, head )
    call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
    call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)

    ! Create rhs term
    call form_rhs(work, qrho, lsf, b , head)

    ! Solve system
    call KSPSetFromOptions(ksp,ierr)
    call KSPSetUp(ksp,ierr)
    call KSPSolve(ksp,b,x,ierr)
    call KSPGetIterationNumber(ksp, iiter ,ierr)

The subroutine form_matrix returns the Mat object A, which is filled by
using MatSetValuesStencil. qrho, lsf and head are additional arguments that
are needed to compute the matrix values.

Michele

On 08/01/2013 03:11 PM, Barry Smith wrote:
>> Where are you putting the values into the matrix? It seems the matrix has no values in it. The code is stopping because the Gauss-Seidel smoother has detected zero diagonals.
>>
>> Barry
>> [...]
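form_matrix itself is not shown, so as a point of reference here is a
minimal sketch of what a MatSetValuesStencil fill can look like for one
interior row of a 7-point Laplacian on a dof=1 DMDA. The routine name, the
unit-spacing coefficients and the include style are illustrative
assumptions, not Michele's actual code:

    ! Illustrative sketch only: insert one interior row of a 7-point
    ! Laplacian (unit spacing) at global grid indices (i,j,k).
    subroutine fill_interior_row(A, i, j, k, ierr)
      implicit none
#include <finclude/petscsys.h>
#include <finclude/petscmat.h>
      Mat            A
      PetscInt       i, j, k
      PetscErrorCode ierr
      MatStencil     row(4), col(4,7)
      PetscScalar    v(7)
      PetscInt       ione, iseven, n

      ione = 1
      iseven = 7

      row(MatStencil_i) = i
      row(MatStencil_j) = j
      row(MatStencil_k) = k

      ! Center point first: this is the diagonal entry the smoother
      ! divides by, so every row must set it to a nonzero value.
      col(MatStencil_i,1) = i
      col(MatStencil_j,1) = j
      col(MatStencil_k,1) = k
      v(1) = 6.0

      ! The six neighbors of the DMDA_STENCIL_STAR stencil.
      col(MatStencil_i,2) = i-1; col(MatStencil_j,2) = j;   col(MatStencil_k,2) = k
      col(MatStencil_i,3) = i+1; col(MatStencil_j,3) = j;   col(MatStencil_k,3) = k
      col(MatStencil_i,4) = i;   col(MatStencil_j,4) = j-1; col(MatStencil_k,4) = k
      col(MatStencil_i,5) = i;   col(MatStencil_j,5) = j+1; col(MatStencil_k,5) = k
      col(MatStencil_i,6) = i;   col(MatStencil_j,6) = j;   col(MatStencil_k,6) = k-1
      col(MatStencil_i,7) = i;   col(MatStencil_j,7) = j;   col(MatStencil_k,7) = k+1
      do n = 2, 7
        v(n) = -1.0
      end do

      call MatSetValuesStencil(A, ione, row, iseven, col, v, INSERT_VALUES, ierr)
    end subroutine fill_interior_row

Boundary rows need the same care: a Dirichlet row that is meant to act as
the identity must actually receive its diagonal entry, otherwise exactly
this "Zero diagonal on row 0" error shows up in the smoother.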
From bsmith at mcs.anl.gov  Thu Aug  1 17:27:23 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 1 Aug 2013 17:27:23 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <51FADF55.50003@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu>
Message-ID: 

   Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0?

   If the matrix has zeros on the diagonal you cannot use Gauss-Seidel as the smoother. You can start with

     -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues

   Is the matrix a Stokes-like matrix? If so then different preconditioners are in order.

   Barry

On Aug 1, 2013, at 5:21 PM, Michele Rosso wrote:

> Barry,
>
> here is the fragment of code where I set the rhs term and the matrix:
> [...]
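A small sketch of the check Barry suggests, assuming as above that x is a
vector with the same row layout as A; MatGetDiagonal plus VecMin locates an
exactly-zero diagonal without reading through the whole MatView dump:

    ! Illustrative sketch: view the assembled operator, then locate the
    ! smallest-magnitude diagonal entry before calling KSPSolve.
    subroutine check_diagonal(A, x, ierr)
      implicit none
#include <finclude/petscsys.h>
#include <finclude/petscvec.h>
#include <finclude/petscmat.h>
      Mat            A
      Vec            x, diag
      PetscReal      dmin
      PetscInt       loc
      PetscErrorCode ierr

      call MatView(A, PETSC_VIEWER_STDOUT_WORLD, ierr)

      call VecDuplicate(x, diag, ierr)
      call MatGetDiagonal(A, diag, ierr)
      call VecAbs(diag, ierr)              ! work with |a_ii|
      call VecMin(diag, loc, dmin, ierr)   ! smallest |a_ii| and its global row
      if (dmin == 0.0) then
        print *, 'zero diagonal at global row ', loc
      end if
      call VecDestroy(diag, ierr)
    end subroutine check_diagonal

VecMin reports the global minimum and the global row where it occurs, so a
single call answers Barry's "is the 0,0 entry 0?" question for every row at
once.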
I think I get now where is the problem. Correct me if I am wrong, but for the >>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>> Thank you, >>>>> Michele >>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>> >>>>>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>>>>> >>>>>> Barry >>>>>> >>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>> >>>>>> >>>>>> >>>>>> wrote: >>>>>> >>>>>> >>>>>> >>>>>>> Barry, >>>>>>> >>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >>>>>>> I tried to run my case with >>>>>>> >>>>>>> >>>>>>> -pc_type mg -da_refine 4 >>>>>>> >>>>>>> >>>>>>> >>>>>>> but it does not seem to use the -da_refine option: >>>>>>> >>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>> >>>>>>> >>>>>>> KSP Object: 4 MPI processes >>>>>>> type: cg >>>>>>> maximum iterations=10000 >>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>> left preconditioning >>>>>>> using nonzero initial guess >>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>> PC Object: 4 MPI processes >>>>>>> type: mg >>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>> Cycles per PCApply=1 >>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>> type: chebyshev >>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>> type: gmres >>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>> maximum iterations=10, initial guess is zero >>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>> left preconditioning >>>>>>> using NONE norm type for convergence test >>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>> type: sor >>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>> linear system matrix = precond matrix: >>>>>>> Matrix Object: 4 MPI processes >>>>>>> type: mpiaij >>>>>>> rows=262144, cols=262144 >>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> maximum iterations=1, initial guess is zero >>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>> left preconditioning >>>>>>> using NONE norm type for convergence test >>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>> type: sor >>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>> linear system matrix = precond matrix: >>>>>>> Matrix Object: 4 MPI processes >>>>>>> type: mpiaij >>>>>>> rows=262144, cols=262144 >>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> linear 
system matrix = precond matrix: >>>>>>> Matrix Object: 4 MPI processes >>>>>>> type: mpiaij >>>>>>> rows=262144, cols=262144 >>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> Solution = 1.53600013 sec >>>>>>> #PETSc Option Table entries: >>>>>>> -da_refine 4 >>>>>>> -ksp_view >>>>>>> -options_left >>>>>>> -pc_type mg >>>>>>> #End of PETSc Option Table entries >>>>>>> There is one unused database option. It is: >>>>>>> Option left: name:-da_refine value: 4 >>>>>>> >>>>>>> Michele >>>>>>> >>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>> >>>>>>> >>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>> >>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>> >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> >>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Hi, >>>>>>>>> >>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>> >>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>> >>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
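[Editor's note: the KSPSetDM() / -da_refine / -pc_mg_galerkin recipe Barry describes elsewhere in this thread reduces to the pattern below. This is a minimal, self-contained C sketch against the petsc-3.4 API (modeled loosely on src/ksp/ksp/examples/tutorials/ex45.c), not code from the thread: ComputeRHS and ComputeMatrix are illustrative names, and the operator is set to the identity only so the sketch runs end to end; a real code would assemble the Poisson stencil here. With -pc_mg_galerkin the coarse operators come from the Galerkin product R A P, so only the finest-level matrix has to be filled. Error checking is omitted for brevity.

#include <petscksp.h>

/* petsc-3.4 callback signatures; 3.5 and later drop the MatStructure argument */
static PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
{
  return VecSet(b, 1.0);                 /* placeholder right-hand side */
}

static PetscErrorCode ComputeMatrix(KSP ksp, Mat J, Mat B, MatStructure *str, void *ctx)
{
  MatShift(B, 1.0);                      /* identity placeholder; assemble the real stencil here */
  *str = SAME_NONZERO_PATTERN;
  return 0;
}

int main(int argc, char **argv)
{
  DM  da;
  KSP ksp;

  PetscInitialize(&argc, &argv, NULL, NULL);
  /* coarse 5^3 periodic grid, as in Michele's setup; -da_refine n adds the finer levels */
  DMDACreate3d(PETSC_COMM_WORLD, DMDA_BOUNDARY_PERIODIC, DMDA_BOUNDARY_PERIODIC,
               DMDA_BOUNDARY_PERIODIC, DMDA_STENCIL_STAR, 5, 5, 5,
               PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL, NULL, &da);
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetDM(ksp, da);                     /* PCMG builds its level hierarchy from the DM */
  KSPSetComputeRHS(ksp, ComputeRHS, NULL);
  KSPSetComputeOperators(ksp, ComputeMatrix, NULL);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, NULL, NULL);             /* the DM supplies the work vectors */
  KSPDestroy(&ksp);
  DMDestroy(&da);
  PetscFinalize();
  return 0;
}

Run with, e.g., mpiexec -np 4 ./sketch -pc_type mg -da_refine 4 -pc_mg_galerkin -ksp_view to get the five-level geometric multigrid Barry mentions without writing any per-level code.]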
>>>>>>>>> >>>>>>>>> Here are my current settings: >>>>>>>>> >>>>>>>>> I run with >>>>>>>>> >>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>> >>>>>>>>> and the output is: >>>>>>>>> >>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>> type: cg >>>>>>>>> maximum iterations=10000 >>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using nonzero initial guess >>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: 4 MPI processes >>>>>>>>> type: gamg >>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>> Cycles per PCApply=1 >>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>> type: preonly >>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>> type: bjacobi >>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>> [0] local block number 0 >>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>> type: preonly >>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>> type: preonly >>>>>>>>> 1 MPI processes >>>>>>>>> type: lu >>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> LU: out-of-place factorization >>>>>>>>> left preconditioning >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>> type: lu >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: nd >>>>>>>>> LU: out-of-place factorization >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: nd >>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>> Factored matrix follows: >>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=395, cols=395 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> rows=0, cols=0 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=0, cols=0 >>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>> 
total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> rows=395, cols=395 >>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>> type: preonly >>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>> type: lu >>>>>>>>> LU: out-of-place factorization >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: nd >>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=0, cols=0 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=0, cols=0 >>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>> type: preonly >>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>> type: lu >>>>>>>>> LU: out-of-place factorization >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: nd >>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=0, cols=0 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=0, cols=0 >>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>> [1] local block number 0 >>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>> [2] local block number 0 >>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>> [3] local block number 0 >>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>> type: mpiaij >>>>>>>>> rows=395, cols=395 >>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node (on process 0) routines >>>>>>>>> Down solver (pre-smoother) 
on level 1 ------------------------------- >>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>> type: chebyshev >>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>> maximum iterations=2 >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using nonzero initial guess >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>> type: jacobi >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>> type: mpiaij >>>>>>>>> rows=23918, cols=23918 >>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node (on process 0) routines >>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>> type: chebyshev >>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>> maximum iterations=2 >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using nonzero initial guess >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>> type: jacobi >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>> type: mpiaij >>>>>>>>> rows=262144, cols=262144 >>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>> type: mpiaij >>>>>>>>> rows=262144, cols=262144 >>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> #PETSc Option Table entries: >>>>>>>>> -ksp_view >>>>>>>>> -options_left >>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>> -pc_type gamg >>>>>>>>> #End of PETSc Option Table entries >>>>>>>>> There are no unused options. >>>>>>>>> >>>>>>>>> >>>>>>>>> Thank you, >>>>>>>>> Michele >>>>>>>>> >>>>>>>>> >> >
From mrosso at uci.edu Thu Aug 1 17:52:14 2013 From: mrosso at uci.edu (Michele Rosso) Date: Thu, 01 Aug 2013 15:52:14 -0700 Subject: [petsc-users] GAMG speed In-Reply-To: References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> Message-ID: <51FAE69E.4020108@uci.edu> Barry, I checked the matrix: the element (0,0) is not zero, nor is any other diagonal element. The matrix is symmetric positive definite (i.e. the standard Poisson matrix). Also, -da_refine is never used (see previous output). I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left and now the error is different: [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ [1]PETSC ERROR: Floating point exception! [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2!
[1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [1]PETSC ERROR: See docs/changes/index.html for recent updates. [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [2]PETSC ERROR: --------------------- Error Message ------------------------------------ [2]PETSC ERROR: Floating point exception! [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! [2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [2]PETSC ERROR: See docs/changes/index.html for recent updates. [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ [3]PETSC ERROR: Floating point exception! [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [3]PETSC ERROR: See docs/changes/index.html for recent updates. [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [3]PETSC ERROR: See docs/index.html for manual pages. [3]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: See docs/index.html for manual pages. [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib [1]PETSC ERROR: See docs/index.html for manual pages. 
[2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 [3]PETSC ERROR: Configure options Configure run at Thu Aug 1 12:01:44 2013 [1]PETSC ERROR: Configure options [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c Configure options [2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h MatMult() line 2174 in src/mat/interface/matrix.c [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [3]PETSC ERROR: KSPSolve() line 441 in 
src/ksp/ksp/interface/itfunc.c --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Floating point exception! [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 [0]PETSC ERROR: Configure options [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c #PETSc Option Table entries: -ksp_view -mg_levels_ksp_chebyshev_estimate_eigenvalues -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi -options_left -pc_mg_galerkin -pc_type mg #End of PETSc Option Table entries There are no unused options. Michele On 08/01/2013 03:27 PM, Barry Smith wrote: > Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the (0,0) entry 0? If the matrix has zero on the diagonals you cannot use Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues > > Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. > > Barry > > On Aug 1, 2013, at 5:21 PM, Michele Rosso wrote: >> Barry, >> >> here is the fraction of code where I set the rhs term and the matrix. >> >> ! Create matrix >> call form_matrix( A , qrho, lsf, head ) >> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >> >> ! Create rhs term >> call form_rhs(work, qrho, lsf, b , head) >> >> ! Solve system >> call KSPSetFromOptions(ksp,ierr) >> call KSPSetUp(ksp,ierr) >> call KSPSolve(ksp,b,x,ierr) >> call KSPGetIterationNumber(ksp, iiter ,ierr) >> >> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >> >> >> Michele >> >> >> >> On 08/01/2013 03:11 PM, Barry Smith wrote: >>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>> >>> Barry
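[Editor's note: form_matrix is user code that never appears in the thread. For readers unfamiliar with MatSetValuesStencil, the following hedged C sketch shows the generic pattern such a routine follows for the SPD 7-point Poisson operator on a periodic DMDA; the name FillPoisson7pt and the spacings hx, hy, hz are illustrative, not taken from Michele's code. Error checking is omitted.

#include <petscdmda.h>

PetscErrorCode FillPoisson7pt(DM da, Mat A, PetscReal hx, PetscReal hy, PetscReal hz)
{
  PetscInt    i, j, k, n, xs, ys, zs, xm, ym, zm;
  MatStencil  row, col[7];
  PetscScalar v[7];

  DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);   /* locally owned box */
  for (k = zs; k < zs+zm; k++) {
    for (j = ys; j < ys+ym; j++) {
      for (i = xs; i < xs+xm; i++) {
        row.i = i; row.j = j; row.k = k; row.c = 0;   /* one dof per node */
        col[0].i = i-1; col[0].j = j;   col[0].k = k;   v[0] = -1.0/(hx*hx);
        col[1].i = i+1; col[1].j = j;   col[1].k = k;   v[1] = -1.0/(hx*hx);
        col[2].i = i;   col[2].j = j-1; col[2].k = k;   v[2] = -1.0/(hy*hy);
        col[3].i = i;   col[3].j = j+1; col[3].k = k;   v[3] = -1.0/(hy*hy);
        col[4].i = i;   col[4].j = j;   col[4].k = k-1; v[4] = -1.0/(hz*hz);
        col[5].i = i;   col[5].j = j;   col[5].k = k+1; v[5] = -1.0/(hz*hz);
        /* the diagonal entry is what the SOR/Jacobi smoothers invert; leaving
           it zero or unset produces exactly the "Zero diagonal on row 0" error */
        col[6].i = i;   col[6].j = j;   col[6].k = k;
        v[6] = 2.0/(hx*hx) + 2.0/(hy*hy) + 2.0/(hz*hz);
        for (n = 0; n < 7; n++) col[n].c = 0;
        MatSetValuesStencil(A, 1, &row, 7, col, v, INSERT_VALUES);
      }
    }
  }
  /* MatAssemblyBegin/End are left to the caller, as in the snippet quoted above */
  return 0;
}

With periodic boundaries the i-1/i+1 (and j, k) neighbors always exist, so no boundary special-casing is needed; Dirichlet boundaries would require the border rows to be handled separately.]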
>>> >>> On Aug 1, 2013, at 4:47 PM, Michele Rosso wrote: >>>> Barry, >>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>> For the test I use a 64^3 grid and 4 processors. >>>> [... remainder of the quoted exchange trimmed: it repeats verbatim the zero-diagonal error log, the DMDA/KSP setup code, and the -ksp_view output already shown earlier in the thread ...]
From bsmith at mcs.anl.gov Thu Aug 1 18:19:12 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 1 Aug 2013 18:19:12 -0500 Subject: [petsc-users] GAMG speed In-Reply-To: <51FAE69E.4020108@uci.edu> References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> Message-ID: Run on one process until this is debugged. You can try the option -start_in_debugger noxterm and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? Barry On Aug 1, 2013, at 5:52 PM, Michele Rosso wrote: > Barry, > > I checked the matrix: the element (0,0) is not zero, nor is any other diagonal element. > The matrix is symmetric positive definite (i.e. the standard Poisson matrix). > Also, -da_refine is never used (see previous output). > I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left > and now the error is different: > [... quoted floating-point-exception log trimmed; identical to Michele's message above ...]
src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h > [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c > [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c > [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c > KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c > [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c > [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c > --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Floating point exception! > [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 > [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib > [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 > [0]PETSC ERROR: Configure options > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c > [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c > [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h > [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c > [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h > [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c > [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c > > #PETSc Option Table entries: > -ksp_view > -mg_levels_ksp_chebyshev_estimate_eigenvalues > -mg_levels_ksp_type chebyshev > -mg_levels_pc_type jacobi > -options_left > -pc_mg_galerkin > -pc_type mg > #End of PETSc Option Table entries > There are no unused options. > > Michele > > > On 08/01/2013 03:27 PM, Barry Smith wrote: >> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot us Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebychev -mg_levels_ksp_chebyshev_estimate_eigenvalues >> >> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. >> >> Barry >> >> On Aug 1, 2013, at 5:21 PM, Michele Rosso >> >> wrote: >> >> >>> Barry, >>> >>> here it is the fraction of code where I set the rhs term and the matrix. >>> >>> ! Create matrix >>> call form_matrix( A , qrho, lsf, head ) >>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>> >>> ! 
Create rhs term >>> call form_rhs(work, qrho, lsf, b , head) >>> >>> ! Solve system >>> call KSPSetFromOptions(ksp,ierr) >>> call KSPSetUp(ksp,ierr) >>> call KSPSolve(ksp,b,x,ierr) >>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>> >>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>> >>> >>> Michele >>> >>> >>> >>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>> >>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>> >>>> Barry >>>> >>>> >>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>> >>>> wrote: >>>> >>>> >>>>> Barry, >>>>> >>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>> >>>>> For the test I use a 64^3 grid and 4 processors. >>>>> >>>>> The output is: >>>>> >>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>> [2]PETSC ERROR: See docs/index.html for manual pages. >>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>> [2]PETSC ERROR: Configure options >>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>> --------------------- Error Message ------------------------------------ >>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>> Arguments are incompatible! >>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>> Zero diagonal on row 0! 
>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>> ------------------------------------------------------------------------ >>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>> See docs/index.html for manual pages. >>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>> [3]PETSC ERROR: Configure options >>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>> [1]PETSC ERROR: Configure options >>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 
>>>>> [0]PETSC ERROR: Configure options >>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>> #PETSc Option Table entries: >>>>> -da_refine 4 >>>>> -ksp_view >>>>> -options_left >>>>> -pc_mg_galerkin >>>>> -pc_type mg >>>>> #End of PETSc Option Table entries >>>>> There is one unused database option. It is: >>>>> Option left: name:-da_refine value: 4 >>>>> >>>>> >>>>> Here is the code I use to setup DMDA and KSP: >>>>> >>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>> ! Create Global Vectors >>>>> call DMCreateGlobalVector(da,b,ierr) >>>>> call VecDuplicate(b,x,ierr) >>>>> ! Set initial guess for first use of the module to 0 >>>>> call VecSet(x,0.0_rp,ierr) >>>>> ! Create matrix >>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>> ! Create solver >>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>> call KSPSetDM(ksp,da,ierr) >>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>> >>>>> ! To allow using option from command line >>>>> call KSPSetFromOptions(ksp,ierr) >>>>> >>>>> >>>>> Michele >>>>> >>>>> >>>>> >>>>> >>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>> >>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. 
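(For reference, a minimal sketch of the setup being described here — hypothetical code in the spirit of the KSP tutorial examples, not Michele's actual solver — showing a KSP attached to a DMDA so that -pc_type mg -pc_mg_galerkin -da_refine n can take effect; ComputeMatrix and ComputeRHS stand in for the user's own stencil routines, and the callback signatures follow the petsc-3.4 convention:)

  #include <petscksp.h>
  #include <petscdmda.h>

  /* User-supplied routines that fill the operator and right-hand side on
     whatever grid level the KSP hands them (hypothetical names). */
  extern PetscErrorCode ComputeMatrix(KSP,Mat,Mat,MatStructure*,void*);
  extern PetscErrorCode ComputeRHS(KSP,Vec,void*);

  int main(int argc,char **argv)
  {
    KSP            ksp;
    DM             da;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc,&argv,NULL,NULL);CHKERRQ(ierr);
    /* Coarse grid only; -da_refine n refines it n times to build the MG hierarchy */
    ierr = DMDACreate3d(PETSC_COMM_WORLD,DMDA_BOUNDARY_PERIODIC,DMDA_BOUNDARY_PERIODIC,
                        DMDA_BOUNDARY_PERIODIC,DMDA_STENCIL_STAR,9,9,9,
                        PETSC_DECIDE,PETSC_DECIDE,PETSC_DECIDE,1,1,
                        NULL,NULL,NULL,&da);CHKERRQ(ierr);
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetDM(ksp,da);CHKERRQ(ierr);                               /* KSP now owns the grid hierarchy */
    ierr = KSPSetComputeOperators(ksp,ComputeMatrix,NULL);CHKERRQ(ierr); /* called on every level */
    ierr = KSPSetComputeRHS(ksp,ComputeRHS,NULL);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* picks up -pc_type mg -pc_mg_galerkin -da_refine n */
    ierr = KSPSolve(ksp,NULL,NULL);CHKERRQ(ierr);  /* with a DM set, work vectors are created internally */
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    ierr = DMDestroy(&da);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return 0;
  }

Running with, e.g., mpiexec -np 4 ./solver -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view should then report a multi-level MG hierarchy instead of levels=1.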
You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>> >>>>>> If it doesn't work, send us all the output. >>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>> >>>>>> >>>>>> >>>>>> wrote: >>>>>> >>>>>> >>>>>> >>>>>>> Barry, >>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>> Thank you, >>>>>>> Michele >>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>> >>>>>>> >>>>>>>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Barry, >>>>>>>>> >>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >>>>>>>>> I tried to run my case with >>>>>>>>> >>>>>>>>> >>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>> >>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>> >>>>>>>>> >>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>> type: cg >>>>>>>>> maximum iterations=10000 >>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using nonzero initial guess >>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: 4 MPI processes >>>>>>>>> type: mg >>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>> Cycles per PCApply=1 >>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>> type: chebyshev >>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>> type: sor >>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>> type: mpiaij >>>>>>>>> rows=262144, cols=262144 >>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 
>>>>>>>>> left preconditioning >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>> type: sor >>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>> type: mpiaij >>>>>>>>> rows=262144, cols=262144 >>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>> type: mpiaij >>>>>>>>> rows=262144, cols=262144 >>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> Solution = 1.53600013 sec >>>>>>>>> #PETSc Option Table entries: >>>>>>>>> -da_refine 4 >>>>>>>>> -ksp_view >>>>>>>>> -options_left >>>>>>>>> -pc_type mg >>>>>>>>> #End of PETSc Option Table entries >>>>>>>>> There is one unused database option. It is: >>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>> >>>>>>>>> Michele >>>>>>>>> >>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>> >>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Hi, >>>>>>>>>>> >>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>> >>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>> >>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
>>>>>>>>>>> >>>>>>>>>>> Here are my current settings: >>>>>>>>>>> >>>>>>>>>>> I run with >>>>>>>>>>> >>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>> >>>>>>>>>>> and the output is: >>>>>>>>>>> >>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>> type: cg >>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using nonzero initial guess >>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>> type: gamg >>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>> type: preonly >>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>> type: bjacobi >>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>> [0] local block number 0 >>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>> type: preonly >>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>> type: preonly >>>>>>>>>>> 1 MPI processes >>>>>>>>>>> type: lu >>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>> left preconditioning >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>> type: lu >>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>> matrix ordering: nd >>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>> matrix ordering: nd >>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>> Factored matrix follows: >>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>> Factored matrix follows: >>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij 
>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>> type: preonly >>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>> type: lu >>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>> matrix ordering: nd >>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>> Factored matrix follows: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>> type: preonly >>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>> type: lu >>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>> matrix ordering: nd >>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>> Factored matrix follows: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>> [1] local block number 0 >>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>> [2] local block number 0 >>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>> [3] local block number 0 >>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>> linear system matrix = precond 
matrix: >>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>> type: mpiaij >>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>> type: chebyshev >>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>> maximum iterations=2 >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using nonzero initial guess >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>> type: jacobi >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>> type: mpiaij >>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>> type: chebyshev >>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>> maximum iterations=2 >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using nonzero initial guess >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>> type: jacobi >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>> type: mpiaij >>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>> type: mpiaij >>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>> -ksp_view >>>>>>>>>>> -options_left >>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>> -pc_type gamg >>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>> There are no unused options. >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Thank you, >>>>>>>>>>> Michele >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >> > From olivier.bonnefon at avignon.inra.fr Fri Aug 2 09:30:19 2013 From: olivier.bonnefon at avignon.inra.fr (Olivier Bonnefon) Date: Fri, 02 Aug 2013 16:30:19 +0200 Subject: [petsc-users] FEM on 2D poisson equation In-Reply-To: <87bo5i68g0.fsf@mcs.anl.gov> References: <51E7BEAF.4090901@avignon.inra.fr> <51E7EADD.6010401@avignon.inra.fr> <51E7F011.2020408@avignon.inra.fr> <51F7C860.3030704@avignon.inra.fr> <51F92D2F.1080106@avignon.inra.fr> <87bo5i68g0.fsf@mcs.anl.gov> Message-ID: <51FBC27B.9070807@avignon.inra.fr> Hello, Just a mail to thank you for your help, I successfully simulate linear and non linear system (0=-\Delta u + u, and 0=-\Delta u - u(1-u)). 
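(For concreteness, a schematic version of the pointwise kernels for the nonlinear case 0 = -\Delta u - u(1-u), whose weak form is \int f_0(u, grad u, x) v + f_1(u, grad u, x) . grad v = 0. The argument lists below are simplified for illustration and are not the exact PetscFEM callback signatures:)

  /* Schematic 2D kernels (illustrative only; the real PetscFEM callbacks
     have different argument lists). Residual: f0 = -u(1-u), f1 = grad u. */
  void f0(const double u[], const double gradU[], const double x[], double f0v[])
  {
    f0v[0] = -u[0]*(1.0 - u[0]);                 /* reaction part of -Delta u - u(1-u) */
  }
  void f1(const double u[], const double gradU[], const double x[], double f1v[])
  {
    int d;
    for (d = 0; d < 2; ++d) f1v[d] = gradU[d];   /* grad u: the -Delta u term in weak form */
  }
  /* g0 = d f0 / d u: the derivative referred to next */
  void g0(const double u[], const double gradU[], const double x[], double g0v[])
  {
    g0v[0] = 2.0*u[0] - 1.0;                     /* d/du of -u(1-u) */
  }
  /* g3 = d f1 / d grad(u) = identity: the Laplacian block of the Jacobian */
  void g3(const double u[], const double gradU[], const double x[], double g3v[])
  {
    g3v[0] = 1.0; g3v[1] = 0.0;                  /* 2x2 identity in 2D */
    g3v[2] = 0.0; g3v[3] = 1.0;
  }

Here g1 and g2 (the derivatives of f0 with respect to grad u and of f1 with respect to u) are identically zero for this problem and can be omitted.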
The point was to know that "g0" is the derivative of f0 with respect to u, in the PetscFEM struct.

Olivier B.

On 07/31/2013 05:43 PM, Jed Brown wrote:
> Olivier Bonnefon writes:
>
>> Hello,
>>
>> You are right. I have to define the Jacobian function of the variational
>> formulation. I'm using snes and the PetscFEM struct (like in ex12).
>>
>> I need some information about the PetscFEM struct. I didn't find any
>> document about that, is there one?
>>
>> The field f0Funcs is used for the term \int f_0(u,gradu,x)*v
>> The field f1Funcs is used for the term \int f_1(u,gradu,x).grad v
>>
>> Are f0 and f1 used for the rhs of the linearized problem?
> We think about these problems as being nonlinear whether they are or
> not. For a linear problem, you can apply one iteration of Newton's
> method using '-snes_type ksponly'. The Jacobian consists of the
> derivatives of f_0 and f_1 with respect to u.
>
>> But what about the g0, g1, g2, and g3 functions? I guess I have to use them to
>> define the Jacobian?
> Those are the derivatives of the f_0 and f_1.
>
> For example, see the notation in Eq. 3 and 5 of this paper:
>
> http://59A2.org/na/Brown-EfficientNonlinearSolversNodalHighOrder3D-2010.pdf

--
Olivier Bonnefon
INRA PACA-Avignon, Unité BioSP
Tel: +33 (0)4 32 72 21 58

From jyawney123 at gmail.com Fri Aug 2 12:27:10 2013
From: jyawney123 at gmail.com (John Yawney)
Date: Fri, 2 Aug 2013 13:27:10 -0400
Subject: [petsc-users] Question about KSPSolve
Message-ID: 

Good Afternoon Everyone,

I'm using PETSc to solve some linear systems in my ocean model. Currently I'm using the KSPSolve environment with 4 MPI processors. I've established the matrix A and the RHS b and confirmed that everything looks correct using VecView and MatView. Here are some code snippets that show the basic steps I took.

-------------------------------------------------------------------------
*Assembling A:*
MatConvert(matLaplacianX, MATSAME, MAT_INITIAL_MATRIX, &matLaplacian);
MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY);

MatAXPY(matLaplacian, 1.0, matLaplacianY, DIFFERENT_NONZERO_PATTERN);
MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY);

MatAXPY(matLaplacian, 1.0, matLaplacianZ, DIFFERENT_NONZERO_PATTERN);
MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY);

*Defining KSP environment:*
KSPCreate(MPI_COMM_WORLD, &m_inksp);
KSPSetOperators(m_inksp, matLaplacian, matLaplacian, DIFFERENT_NONZERO_PATTERN);
KSPSetType(m_inksp, KSPGMRES);
KSPSetInitialGuessNonzero(m_inksp,PETSC_TRUE);
KSPSetFromOptions(m_inksp);
KSPSetUp(m_inksp);

*Defining RHS vector:*
VecCreateMPI(MPI_COMM_WORLD, nLocalElements, nGlobalElements, &m_vecRHS);

*Solving the linear system:*
VecAssemblyBegin(m_vecRHS);
VecAssemblyEnd(m_vecRHS);
KSPSolve(m_inksp, m_vecRHS, m_vecPressure);
-------------------------------------------------------------------------

If I modify my problem to consider a 2D (x-z) domain with flat bottom topography, set the initial velocity fields to 0, and use a constant density of 1025 throughout, then after a number of time steps I get computational artifacts at the beginning and end locations of each block. I should also mention that I'm currently only splitting the domain into sub-blocks in the x direction. After about 10 time steps, the min density is off by about 1E-8, but only at these locations. I've attached a figure to demonstrate the errors. Are there ways for me to remove these errors?
Should I be looking at the DM manual pages? Thanks for any help and suggestions. All the best, John -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MPICommError - TimeStep10.jpg Type: image/jpeg Size: 74936 bytes Desc: not available URL: From Shuangshuang.Jin at pnnl.gov Fri Aug 2 15:22:01 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Fri, 2 Aug 2013 13:22:01 -0700 Subject: [petsc-users] DIVERGED_NONLINEAR_SOLVE error Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov> Hello, My code solves a linear system AX=B using superlu_dist in PETSc, and use some of X's data to solve a DAE problem. I get a very wild error: When I use less than 8 processors to run the code, it runs just fine with correct results. When I use greater than 8 processors, such as 16 or 32 processors, I'll get an error and a lot of generated core.##### files. [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: ! [0]PETSC ERROR: TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Development GIT revision: a0a914e661bf6402b8edabe0f5a2dad46323f69f GIT Date: 2013-06-05 14:18:39 -0500 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: dynSim on a arch-complex named node0055.local by d3m956 Fri Aug 2 11:56:10 2013 [0]PETSC ERROR: Libraries linked from /pic/projects/ds/petsc-dev.6.06.13/arch-complex/lib [0]PETSC ERROR: Configure run at Fri Jul 26 14:32:37 2013 [0]PETSC ERROR: Configure options --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic --download-superlu_dist --download-mumps --download-scalapack --download-parmetis --download-metis --download-elemental --with-debugging=0 [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: TSStep() line 2515 in /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c [0]PETSC ERROR: TSSolve() line 2632 in /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c [0]PETSC ERROR: simu() line 566 in "unknowndirectory/"simulation.C [0]PETSC ERROR: runSimulation() line 99 in "unknowndirectory/"dynSim.h [node0055:32539] *** Process received signal *** [node0055:32535] *** Process received signal *** [node0055:32535] Signal: Aborted (6) [node0055:32535] Signal code: (24153104) [node0055:32534] *** Process received signal *** [node0055:32534] Signal: Aborted (6) [node0055:32534] Signal code: (24199552) [node0055:32539] Signal: Aborted (6) [node0055:32539] Signal code: (24157648) [node0055:32537] *** Process received signal *** [node0055:32537] Signal: Aborted (6) [node0055:32537] Signal code: (24546704) [node0055:32538] *** Process received signal *** The Error Message from PETSc pointed out that "TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery!", but I think it's because the superlu_dist computed an all "nan" X as I printed it out. 
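(A sketch of the kind of guard that would surface this right at the linear solve, instead of later inside TS — hypothetical code, not from this simulation, along the lines of the check PETSc's own VecValidValues() performs:)

  #include <petscksp.h>

  /* Hypothetical guard (not from the reported code): solve AX=B and fail
     loudly if the solution contains NaN/Inf, before it reaches the DAE
     residual. A NaN anywhere in X makes the 2-norm NaN. */
  PetscErrorCode SolveAndCheck(KSP ksp, Vec B, Vec X)
  {
    PetscReal      nrm;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = KSPSolve(ksp,B,X);CHKERRQ(ierr);
    ierr = VecNorm(X,NORM_2,&nrm);CHKERRQ(ierr);
    if (PetscIsInfOrNanReal(nrm)) {
      SETERRQ(PetscObjectComm((PetscObject)ksp),PETSC_ERR_FP,"Linear solve produced NaN/Inf");
    }
    PetscFunctionReturn(0);
  }

A debug build enables more of this checking inside PETSc itself, which is what the "turn on debugging" advice that follows is about.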
However, I don't understand why using 8 or 16 processors should make such a difference.

Can anyone give me some help for the troubleshooting?

Thanks,
Shuangshuang

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com Fri Aug 2 15:24:24 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 3 Aug 2013 04:24:24 +0800
Subject: [petsc-users] FEM on 2D poisson equation
In-Reply-To: <51FBC27B.9070807@avignon.inra.fr>
References: <51E7BEAF.4090901@avignon.inra.fr> <51E7EADD.6010401@avignon.inra.fr> <51E7F011.2020408@avignon.inra.fr> <51F7C860.3030704@avignon.inra.fr> <51F92D2F.1080106@avignon.inra.fr> <87bo5i68g0.fsf@mcs.anl.gov> <51FBC27B.9070807@avignon.inra.fr>
Message-ID: 

On Fri, Aug 2, 2013 at 10:30 PM, Olivier Bonnefon < olivier.bonnefon at avignon.inra.fr> wrote:
> Hello,
>
> Just a mail to thank you for your help, I successfully simulate linear and
> non linear system (0=-\Delta u + u, and 0=-\Delta u - u(1-u)). The point
> was to know that "g0" is the derivative of f0 with respect to u, in the PetscFEM
> struct.

Great! Thanks for your patience. Let us know if we can give any more help.

Matt

> Olivier B.
>
> On 07/31/2013 05:43 PM, Jed Brown wrote:
>> Olivier Bonnefon> >> writes:
>>
>>> Hello,
>>>
>>> You are right. I have to define the Jacobian function of the variational
>>> formulation. I'm using snes and the PetscFEM struct (like in ex12).
>>>
>>> I need some information about the PetscFEM struct. I didn't find any
>>> document about that, is there one?
>>>
>>> The field f0Funcs is used for the term \int f_0(u,gradu,x)*v
>>> The field f1Funcs is used for the term \int f_1(u,gradu,x).grad v
>>>
>>> Are f0 and f1 used for the rhs of the linearized problem?
>>
>> We think about these problems as being nonlinear whether they are or
>> not. For a linear problem, you can apply one iteration of Newton's
>> method using '-snes_type ksponly'. The Jacobian consists of the
>> derivatives of f_0 and f_1 with respect to u.
>>
>>> But what about the g0, g1, g2, and g3 functions? I guess I have to use them to
>>> define the Jacobian?
>>
>> Those are the derivatives of the f_0 and f_1.
>>
>> For example, see the notation in Eq. 3 and 5 of this paper:
>>
>> http://59A2.org/na/Brown-EfficientNonlinearSolversNodalHighOrder3D-2010.pdf

> --
> Olivier Bonnefon
> INRA PACA-Avignon, Unité BioSP
> Tel: +33 (0)4 32 72 21 58

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com Fri Aug 2 15:28:36 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 3 Aug 2013 04:28:36 +0800
Subject: [petsc-users] Question about KSPSolve
In-Reply-To: 
References: 
Message-ID: 

On Sat, Aug 3, 2013 at 1:27 AM, John Yawney wrote:
> Good Afternoon Everyone,
>
> I'm using PETSc to solve some linear systems in my ocean model. Currently
> I'm using the KSPSolve environment with 4 MPI processors. I've established
> the matrix A and the RHS b and confirmed that everything looks correct
> using VecView and MatView. Here are some code snippets that show the basic
> steps I took.
> > ------------------------------------------------------------------------- > *Assembling A:* > MatConvert(matLaplacianX, MATSAME, MAT_INITIAL_MATRIX, &matLaplacian); > MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY); > MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY); > > MatAXPY(matLaplacian, 1.0, matLaplacianY, DIFFERENT_NONZERO_PATTERN); > MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY); > MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY); > > MatAXPY(matLaplacian, 1.0, matLaplacianZ, DIFFERENT_NONZERO_PATTERN); > MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY); > MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY); > > *Defining KSP environment:* > KSPCreate(MPI_COMM_WORLD, &m_inksp); > KSPSetOperators(m_inksp, matLaplacian, matLaplacian, > DIFFERENT_NONZERO_PATTERN); > KSPSetType(m_inksp, KSPGMRES); > KSPSetInitialGuessNonzero(m_inksp,PETSC_TRUE); > KSPSetFromOptions(m_inksp); > KSPSetUp(m_inksp); > > *Defining RHS vector:* > VecCreateMPI(MPI_COMM_WORLD, nLocalElements, nGlobalElements, &m_vecRHS); > > *Solving the linear system:* > VecAssemblyBegin(m_vecRHS); > VecAssemblyEnd(m_vecRHS); > KSPSolve(m_inksp, m_vecRHS, m_vecPressure); > ------------------------------------------------------------------------- > > If I modify my problem to consider a 2D (x-z) domain with flat bottom > topography and I set the initial velocity fields to 0 and a constant > density of 1025 throughout, then if I compute a number of time steps I get > computational artifacts at the beginning and end locations of each block. I > should also mention I'm only splitting up the domain into sub-blocks in the > x direction currently. After about 10 time steps, the min density is off by > about 1E-8 but only at these locations. I've attached a figure to > demonstrate the errors. > > Are there ways for me to remove these errors? Should I be looking at the > DM manual pages? > The above looks correct, so I assume there is a problem with the definition of the system. I would try putting in an exact solution, or comparing a serial and parallel run. Thanks, Matt > Thanks for any help and suggestions. > > All the best, > John > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 2 15:33:04 2013 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 3 Aug 2013 04:33:04 +0800 Subject: [petsc-users] DIVERGED_NONLINEAR_SOLVE error In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov> Message-ID: On Sat, Aug 3, 2013 at 4:22 AM, Jin, Shuangshuang wrote: > Hello, > > My code solves a linear system AX=B using superlu_dist in PETSc, and > use some of X?s data to solve a DAE problem. I get a very wild error: > > When I use less than 8 processors to run the code, it runs just fine > with correct results. When I use greater than 8 processors, such as 16 or > 32 processors, I?ll get an error and a lot of generated core.##### files. > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: ! > [0]PETSC ERROR: TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, > increase -ts_max_snes_failures or make negative to attempt recovery! 
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Development GIT revision: > a0a914e661bf6402b8edabe0f5a2dad46323f69f GIT Date: 2013-06-05 14:18:39 > -0500 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: dynSim on a arch-complex named node0055.local by d3m956 > Fri Aug 2 11:56:10 2013 > [0]PETSC ERROR: Libraries linked from > /pic/projects/ds/petsc-dev.6.06.13/arch-complex/lib > [0]PETSC ERROR: Configure run at Fri Jul 26 14:32:37 2013 > [0]PETSC ERROR: Configure options --with-scalar-type=complex > --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic > --download-superlu_dist --download-mumps --download-scalapack > --download-parmetis --download-metis --download-elemental --with-debugging=0 > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: TSStep() line 2515 in > /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c > [0]PETSC ERROR: TSSolve() line 2632 in > /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c > [0]PETSC ERROR: simu() line 566 in "unknowndirectory/"simulation.C > [0]PETSC ERROR: runSimulation() line 99 in "unknowndirectory/"dynSim.h > [node0055:32539] *** Process received signal *** > [node0055:32535] *** Process received signal *** > [node0055:32535] Signal: Aborted (6) > [node0055:32535] Signal code: (24153104) > [node0055:32534] *** Process received signal *** > [node0055:32534] Signal: Aborted (6) > [node0055:32534] Signal code: (24199552) > [node0055:32539] Signal: Aborted (6) > [node0055:32539] Signal code: (24157648) > [node0055:32537] *** Process received signal *** > [node0055:32537] Signal: Aborted (6) > [node0055:32537] Signal code: (24546704) > [node0055:32538] *** Process received signal *** > > The Error Message from PETSc pointed out that ?TSStep has failed due to > DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative > to attempt recovery!?, but I think it?s because the superlu_dist computed > an all ?nan? X as I printed it out. > > However, I don?t understand why using 8 or 16 processors should make such > a difference. > It sounds like you are computing a NaN somewhere, possibly your residual evaluation. However, we should catch this when we evaluate the norm. Please turn on debugging in your build. Matt > Can anyone give me some help for the trouble shooting? > > Thanks, > Shuangshuang > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From Shuangshuang.Jin at pnnl.gov Fri Aug 2 15:41:41 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Fri, 2 Aug 2013 13:41:41 -0700 Subject: [petsc-users] DIVERGED_NONLINEAR_SOLVE error In-Reply-To: References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov> Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F1B@EMAIL04.pnl.gov> Is there a quick way to turn on the debugging in my build, or I have to do the following again? 
Configure options --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic --download-superlu_dist --download-mumps --download-scalapack --download-parmetis --download-metis --download-elemental It usually takes over an hour to reconfigure PETSc on my machine... Thanks, Shuangshuang From: Matthew Knepley [mailto:knepley at gmail.com] Sent: Friday, August 02, 2013 1:33 PM To: Jin, Shuangshuang Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] DIVERGED_NONLINEAR_SOLVE error On Sat, Aug 3, 2013 at 4:22 AM, Jin, Shuangshuang > wrote: Hello, My code solves a linear system AX=B using superlu_dist in PETSc, and use some of X's data to solve a DAE problem. I get a very wild error: When I use less than 8 processors to run the code, it runs just fine with correct results. When I use greater than 8 processors, such as 16 or 32 processors, I'll get an error and a lot of generated core.##### files. [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: ! [0]PETSC ERROR: TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Development GIT revision: a0a914e661bf6402b8edabe0f5a2dad46323f69f GIT Date: 2013-06-05 14:18:39 -0500 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: dynSim on a arch-complex named node0055.local by d3m956 Fri Aug 2 11:56:10 2013 [0]PETSC ERROR: Libraries linked from /pic/projects/ds/petsc-dev.6.06.13/arch-complex/lib [0]PETSC ERROR: Configure run at Fri Jul 26 14:32:37 2013 [0]PETSC ERROR: Configure options --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic --download-superlu_dist --download-mumps --download-scalapack --download-parmetis --download-metis --download-elemental --with-debugging=0 [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: TSStep() line 2515 in /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c [0]PETSC ERROR: TSSolve() line 2632 in /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c [0]PETSC ERROR: simu() line 566 in "unknowndirectory/"simulation.C [0]PETSC ERROR: runSimulation() line 99 in "unknowndirectory/"dynSim.h [node0055:32539] *** Process received signal *** [node0055:32535] *** Process received signal *** [node0055:32535] Signal: Aborted (6) [node0055:32535] Signal code: (24153104) [node0055:32534] *** Process received signal *** [node0055:32534] Signal: Aborted (6) [node0055:32534] Signal code: (24199552) [node0055:32539] Signal: Aborted (6) [node0055:32539] Signal code: (24157648) [node0055:32537] *** Process received signal *** [node0055:32537] Signal: Aborted (6) [node0055:32537] Signal code: (24546704) [node0055:32538] *** Process received signal *** The Error Message from PETSc pointed out that "TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery!", but I think it's because the superlu_dist computed an all "nan" X as I printed it out. However, I don't understand why using 8 or 16 processors should make such a difference. 
It sounds like you are computing a NaN somewhere, possibly your residual
evaluation. However, we should catch this when we evaluate the norm. Please
turn on debugging in your build.

   Matt

Can anyone give me some help with the troubleshooting?

Thanks,
Shuangshuang

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Fri Aug 2 15:43:17 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 3 Aug 2013 04:43:17 +0800
Subject: [petsc-users] DIVERGED_NONLINEAR_SOLVE error
In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F1B@EMAIL04.pnl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F1B@EMAIL04.pnl.gov>
Message-ID: 

On Sat, Aug 3, 2013 at 4:41 AM, Jin, Shuangshuang wrote:

> Is there a quick way to turn on the debugging in my build, or do I have to
> do the following again?
>
> Configure options --with-scalar-type=complex --with-clanguage=C++
> PETSC_ARCH=arch-complex --with-fortran-kernels=generic
> --download-superlu_dist --download-mumps --download-scalapack
> --download-parmetis --download-metis --download-elemental
>
> It usually takes over an hour to reconfigure PETSc on my machine...

That is the way. I think it's time to upgrade your Commodore 64 :)

   Matt

> Thanks,
> Shuangshuang
>
> [...]
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov  Fri Aug 2 16:03:30 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 2 Aug 2013 16:03:30 -0500
Subject: [petsc-users] DIVERGED_NONLINEAR_SOLVE error
In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F1B@EMAIL04.pnl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F1B@EMAIL04.pnl.gov>
Message-ID: 

   Use two PETSC_ARCH values, PETSC_ARCH=arch-complex-debug and
PETSC_ARCH=arch-complex-opt; then you can switch back and forth between
them without rebuilding.

   Barry
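In practice this means configuring the same source tree twice, once per PETSC_ARCH. A sketch of what that might look like from the top of PETSC_DIR, with the long option list quoted earlier in the thread abbreviated as [...]; the debug arch simply replaces --with-debugging=0 with --with-debugging=1:

    ./configure PETSC_ARCH=arch-complex-debug --with-debugging=1 [...]
    make PETSC_ARCH=arch-complex-debug all
    ./configure PETSC_ARCH=arch-complex-opt --with-debugging=0 [...]
    make PETSC_ARCH=arch-complex-opt all

Each arch builds into its own subdirectory, so selecting a build afterwards is just a matter of setting PETSC_ARCH in the environment or makefile when compiling and running the application.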
On Aug 2, 2013, at 3:41 PM, "Jin, Shuangshuang" wrote:

> Is there a quick way to turn on the debugging in my build, or do I have to do the following again?
>
> Configure options --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic --download-superlu_dist --download-mumps --download-scalapack --download-parmetis --download-metis --download-elemental
>
> It usually takes over an hour to reconfigure PETSc on my machine...
>
> [...]

From bsmith at mcs.anl.gov  Fri Aug 2 16:38:03 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 2 Aug 2013 16:38:03 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <51FAEF43.2050207@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu>
Message-ID: <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov>

   Finally got it. My failing memory. I had to add the line

      call KSPSetDMActive(ksp,PETSC_FALSE,ierr)

   immediately after KSPSetDM() and change

      call DMCreateMatrix(da,MATMPIAIJ,A,ierr)

   to

      call DMCreateMatrix(da,MATAIJ,A,ierr)

   so it will work both in parallel and sequentially; then

      -ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2

   works great with 2 levels.

   Barry
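Pulling the two fixes together, the setup Barry describes looks roughly like this in the C API of that release (a sketch only; Michele's code is Fortran and the calls map one-to-one, and the ierr/CHKERRQ error checking is omitted):

    /* Attach the DMDA so PCMG can build the grid hierarchy, but keep the
       user-assembled operator instead of letting KSP compute one from the DM. */
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetDM(ksp, da);
    KSPSetDMActive(ksp, PETSC_FALSE);   /* first fix: use our own matrix   */
    DMCreateMatrix(da, MATAIJ, &A);     /* second fix: MATAIJ picks the
                                           seq or mpi format automatically */
    /* ... fill A with MatSetValuesStencil() and assemble ... */
    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
    KSPSetFromOptions(ksp);

With -pc_mg_galerkin the coarse-level operators are then generated algebraically from A, so no per-level matrix or right-hand-side callbacks are needed.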
On Aug 1, 2013, at 6:29 PM, Michele Rosso wrote:

> Barry,
>
> no problem. I attached the full code in test_poisson_solver.tar.gz.
> My test code is a very reduced version of my production code (an
> incompressible DNS code), so fftw3 and the library 2decomp&fft are needed
> to run it.
> I attached the 2decomp&fft version I used: it is a matter of minutes to
> install it, so you should not have any problem.
> Please contact me with any question/suggestion.
> In the meantime I will try to debug it.
>
> Michele
>
> On 08/01/2013 04:19 PM, Barry Smith wrote:
>> Run on one process until this is debugged. You can try the option
>>
>> -start_in_debugger noxterm
>>
>> and then call VecView(vec,0) in the debugger when it gives the error
>> below. It seems like some objects are not getting their initial values
>> set properly. Are you able to email the code so we can run it and figure
>> out what is going on?
>>
>> Barry
>>
>> On Aug 1, 2013, at 5:52 PM, Michele Rosso wrote:
>>
>>> Barry,
>>>
>>> I checked the matrix: the element (0,0) is not zero, nor is any other
>>> diagonal element.
>>> The matrix is symmetric positive definite (i.e. the standard Poisson
>>> matrix).
>>> Also, -da_refine is never used (see previous output).
>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left
>>>
>>> and now the error is different:
>>> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------
>>> [1]PETSC ERROR: Floating point exception!
>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2!
>>> [1]PETSC ERROR: ------------------------------------------------------------------------
>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------
>>> [2]PETSC ERROR: Floating point exception!
>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2!
>>> [2]PETSC ERROR: ------------------------------------------------------------------------
>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------
>>> [3]PETSC ERROR: Floating point exception!
>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2!
>>> [3]PETSC ERROR: ------------------------------------------------------------------------
>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [3]PETSC ERROR: See docs/index.html for manual pages.
>>> [3]PETSC ERROR: ------------------------------------------------------------------------
>>> [1]PETSC ERROR: See docs/index.html for manual pages.
>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>> [1]PETSC ERROR: See docs/index.html for manual pages. >>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>> [3]PETSC ERROR: Configure options >>> Configure run at Thu Aug 1 12:01:44 2013 >>> [1]PETSC ERROR: Configure options >>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>> Configure options >>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>> MatMult() line 2174 in src/mat/interface/matrix.c >>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>> 
[2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>> --------------------- Error Message ------------------------------------ >>> [0]PETSC ERROR: Floating point exception! >>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. >>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>> [0]PETSC ERROR: Configure options >>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>> >>> #PETSc Option Table entries: >>> -ksp_view >>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>> -mg_levels_ksp_type chebyshev >>> -mg_levels_pc_type jacobi >>> -options_left >>> -pc_mg_galerkin >>> -pc_type mg >>> #End of PETSc Option Table entries >>> There are no unused options. >>> >>> Michele >>> >>> >>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>> >>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot us Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebychev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>> >>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. >>>> >>>> Barry >>>> >>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>> >>>> >>>> >>>> wrote: >>>> >>>> >>>> >>>>> Barry, >>>>> >>>>> here it is the fraction of code where I set the rhs term and the matrix. >>>>> >>>>> ! 
Create matrix >>>>> call form_matrix( A , qrho, lsf, head ) >>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>> >>>>> ! Create rhs term >>>>> call form_rhs(work, qrho, lsf, b , head) >>>>> >>>>> ! Solve system >>>>> call KSPSetFromOptions(ksp,ierr) >>>>> call KSPSetUp(ksp,ierr) >>>>> call KSPSolve(ksp,b,x,ierr) >>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>> >>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>> >>>>> >>>>> Michele >>>>> >>>>> >>>>> >>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>> >>>>> >>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>> >>>>>> >>>>>> >>>>>> wrote: >>>>>> >>>>>> >>>>>> >>>>>>> Barry, >>>>>>> >>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>> >>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>> >>>>>>> The output is: >>>>>>> >>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [2]PETSC ERROR: Configure options >>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>> --------------------- Error Message ------------------------------------ >>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> Arguments are incompatible! 
>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> Zero diagonal on row 0! >>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> See docs/index.html for manual pages. >>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [3]PETSC ERROR: Configure options >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [1]PETSC ERROR: Configure options >>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>> [0]PETSC ERROR: Libraries linked from 
/opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [0]PETSC ERROR: Configure options >>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> #PETSc Option Table entries: >>>>>>> -da_refine 4 >>>>>>> -ksp_view >>>>>>> -options_left >>>>>>> -pc_mg_galerkin >>>>>>> -pc_type mg >>>>>>> #End of PETSc Option Table entries >>>>>>> There is one unused database option. It is: >>>>>>> Option left: name:-da_refine value: 4 >>>>>>> >>>>>>> >>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>> >>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>> ! Create Global Vectors >>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>> call VecDuplicate(b,x,ierr) >>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>> ! Create matrix >>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>> ! Create solver >>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>> >>>>>>> ! 
To allow using option from command line >>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>> >>>>>>> >>>>>>> Michele >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>> >>>>>>> >>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>> >>>>>>>> If it doesn't work, send us all the output. >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> >>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Barry, >>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>>>> Thank you, >>>>>>>>> Michele >>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Barry, >>>>>>>>>>> >>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? 
>>>>>>>>>>> I tried to run my case with >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>> >>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>> type: cg >>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using nonzero initial guess >>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>> type: mg >>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>> type: chebyshev >>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>> type: gmres >>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>> type: sor >>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>> type: mpiaij >>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>> type: sor >>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>> type: mpiaij >>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>> type: mpiaij >>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>> -da_refine 4 >>>>>>>>>>> -ksp_view >>>>>>>>>>> -options_left >>>>>>>>>>> -pc_type mg >>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>> >>>>>>>>>>> Michele >>>>>>>>>>> >>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? 
If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>> >>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Hi, >>>>>>>>>>>>> >>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>> >>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>> >>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? >>>>>>>>>>>>> >>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>> >>>>>>>>>>>>> I run with >>>>>>>>>>>>> >>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>> >>>>>>>>>>>>> and the output is: >>>>>>>>>>>>> >>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>> type: cg >>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>> type: gamg >>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>> type: preonly >>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>> type: preonly >>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>> type: preonly >>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>> type: lu >>>>>>>>>>>>> maximum 
iterations=1, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>> type: lu >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>> type: preonly >>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>> type: lu >>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node 
routines >>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>> type: preonly >>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>> type: lu >>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left 
preconditioning >>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>> -options_left >>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Thank you, >>>>>>>>>>>>> Michele >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >> > > <2decomp_fft-1.5.847-modified.tar.gz> From mrosso at uci.edu Fri Aug 2 16:52:56 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 02 Aug 2013 14:52:56 -0700 Subject: [petsc-users] GAMG speed In-Reply-To: <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> Message-ID: <51FC2A38.5000000@uci.edu> Barry, thank you very much for your help. I was trying to debug the error with no success! Now it works like a charm for me too! I have still two questions for you: 1) How did you choose the number of levels to use: trial and error? 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter? Again, thank you very much! Michele On 08/02/2013 02:38 PM, Barry Smith wrote: > Finally got it. My failing memory. I had to add the line > > call KSPSetDMActive(ksp,PETSC_FALSE,ierr) > > immediately after KSPSetDM() and > > change > > call DMCreateMatrix(da,MATMPIAIJ,A,ierr) > > to > > call DMCreateMatrix(da,MATAIJ,A,ierr) > > so it will work in both parallel and sequential then > > ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 > > works great with 2 levels. > > Barry > > > > > On Aug 1, 2013, at 6:29 PM, Michele Rosso wrote: > >> Barry, >> >> no problem. I attached the full code in test_poisson_solver.tar.gz. >> My test code is a very reduced version of my productive code (incompressible DNS code) thus fftw3 and the library 2decomp&fft are needed to run it. >> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem. >> Please, contact me for any question/suggestion. >> I the mean time I will try to debug it. >> >> Michele >> >> >> >> >> On 08/01/2013 04:19 PM, Barry Smith wrote: >>> Run on one process until this is debugged. You can try the option >>> >>> -start_in_debugger noxterm >>> >>> and then call VecView(vec,0) in the debugger when it gives the error below. 
It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? >>> >>> Barry >>> >>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>> >>> wrote: >>> >>> >>>> Barry, >>>> >>>> I checked the matrix: the element (0,0) is not zero, nor any other diagonal element is. >>>> The matrix is symmetric positive define (i.e. the standard Poisson matrix). >>>> Also, -da_refine is never used (see previous output). >>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left >>>> >>>> and now the error is different: >>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>> [1]PETSC ERROR: Floating point exception! >>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>> [2]PETSC ERROR: Floating point exception! >>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>> [3]PETSC ERROR: Floating point exception! >>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>> [3]PETSC ERROR: Configure options >>>> Configure run at Thu Aug 1 12:01:44 2013 >>>> [1]PETSC ERROR: Configure options >>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>> Configure options >>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>> [1]PETSC 
ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>> --------------------- Error Message ------------------------------------ >>>> [0]PETSC ERROR: Floating point exception! >>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>> [0]PETSC ERROR: Configure options >>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>> >>>> #PETSc Option Table entries: >>>> -ksp_view >>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>> -mg_levels_ksp_type chebyshev >>>> -mg_levels_pc_type jacobi >>>> -options_left >>>> -pc_mg_galerkin >>>> -pc_type mg >>>> #End of PETSc Option Table entries >>>> There are no unused options. >>>> >>>> Michele >>>> >>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot use Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>> >>>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. >>>>> >>>>> Barry >>>>> >>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso wrote: >>>>> >>>>>> Barry, >>>>>> >>>>>> here is the fraction of code where I set the rhs term and the matrix. >>>>>> >>>>>> ! Create matrix >>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>> >>>>>> !
Create rhs term >>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>> >>>>>> ! Solve system >>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>> call KSPSetUp(ksp,ierr) >>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>> >>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>>> >>>>>> >>>>>> Michele >>>>>> >>>>>> >>>>>> >>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>> >>>>>> >>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>> >>>>>>> >>>>>>> >>>>>>> wrote: >>>>>>> >>>>>>> >>>>>>> >>>>>>>> Barry, >>>>>>>> >>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>>> >>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>> >>>>>>>> The output is: >>>>>>>> >>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>> [2]PETSC ERROR: Configure options >>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> Arguments are incompatible! >>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>> Zero diagonal on row 0! 
>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> See docs/index.html for manual pages. >>>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>> [0]PETSC ERROR: 
Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>> #PETSc Option Table entries: >>>>>>>> -da_refine 4 >>>>>>>> -ksp_view >>>>>>>> -options_left >>>>>>>> -pc_mg_galerkin >>>>>>>> -pc_type mg >>>>>>>> #End of PETSc Option Table entries >>>>>>>> There is one unused database option. It is: >>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>> >>>>>>>> >>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>> >>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>> ! Create Global Vectors >>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>> ! Create matrix >>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>> ! Create solver >>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>> >>>>>>>> ! 
To allow using option from command line >>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>> >>>>>>>> >>>>>>>> Michele >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>> >>>>>>>> >>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>> >>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> >>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Barry, >>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>>>>> Thank you, >>>>>>>>>> Michele >>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Barry, >>>>>>>>>>>> >>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? 
>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>> >>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>> type: cg >>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>> left preconditioning >>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>> type: mg >>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>> type: chebyshev >>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>> type: gmres >>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>> left preconditioning >>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>> type: sor >>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>> type: mpiaij >>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>> left preconditioning >>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>> type: sor >>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>> type: mpiaij >>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>> type: mpiaij >>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>> -ksp_view >>>>>>>>>>>> -options_left >>>>>>>>>>>> -pc_type mg >>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>> There is one unused database option. 
It is: >>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>> >>>>>>>>>>>> Michele >>>>>>>>>>>> >>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>> >>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>> >>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>> >>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG, as was suggested to me in a previous thread. >>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>> >>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>> >>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>> if there are any parameters worth looking into to achieve an even faster solution, for example the number of levels and so on. >>>>>>>>>>>>>> So far I am using Dirichlet BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot?
>>>>>>>>>>>>>> >>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>> >>>>>>>>>>>>>> I run with >>>>>>>>>>>>>> >>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>> >>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>> >>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 
>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 
1 >>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>> There are no unused options. 
>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>> Michele >>>>>>>>>>>>>> >> <2decomp_fft-1.5.847-modified.tar.gz>

From bsmith at mcs.anl.gov  Fri Aug  2 17:11:14 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 2 Aug 2013 17:11:14 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <51FC2A38.5000000@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu>
Message-ID: <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov>

On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote:

> Barry,
>
> thank you very much for your help. I was trying to debug the error with no success!
> Now it works like a charm for me too!
> I still have two questions for you:
>
> 1) How did you choose the number of levels to use: trial and error?

   I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels.

> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter?

   I don't know of anything. But there is a possible problem with -pc_mg_galerkin: PETSc does not transfer the null space information from the fine mesh to the other meshes, and technically we really want the multigrid to remove the null space on all the levels, but usually it will work without worrying about that.

   Barry

> Again, thank you very much!
>
> Michele
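A minimal sketch of the null-space removal referred to in question 2), written against the petsc-3.4-era Fortran interface used throughout this thread; it is not code from the thread itself, and ksp/ierr are the variables from Michele's setup code quoted earlier:

      MatNullSpace nullsp

      ! The null space of the all-periodic Poisson operator is the
      ! constant vector, so let PETSc build it from the has-constant
      ! flag alone (no explicit basis vectors needed).
      call MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, &
     &                        PETSC_NULL_OBJECT, nullsp, ierr)
      ! In petsc-3.4 the null space is attached to the KSP (later
      ! releases moved this to MatSetNullSpace on the operator).
      call KSPSetNullSpace(ksp, nullsp, ierr)
      call MatNullSpaceDestroy(nullsp, ierr)

With this attached, the Krylov method projects the constant mode out at every iteration, which is what makes the singular periodic system solvable; the right-hand side must also be consistent (zero mean) for the discrete problem to have a solution.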
14:44:04 2013 >>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> #PETSc Option Table entries: >>>>>>>>> -da_refine 4 >>>>>>>>> -ksp_view >>>>>>>>> -options_left >>>>>>>>> -pc_mg_galerkin >>>>>>>>> -pc_type mg >>>>>>>>> #End of PETSc Option Table entries >>>>>>>>> There is one unused database option. It is: >>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>> >>>>>>>>> >>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>> >>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>> ! Create Global Vectors >>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>> ! Create matrix >>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>> ! Create solver >>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>> >>>>>>>>> ! 
To allow using options from the command line
>>>>>>>>> call KSPSetFromOptions(ksp,ierr)
>>>>>>>>>
>>>>>>>>> Michele
>>>>>>>>>
>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote:
>>>>>>>>>
>>>>>>>>>>    You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix-matrix-matrix product, so you should not need to change your code to try it out. You still need to use KSPSetDM() and -da_refine n to get it working.
>>>>>>>>>>
>>>>>>>>>>    If it doesn't work, send us all the output.
>>>>>>>>>>
>>>>>>>>>>    Barry
>>>>>>>>>>
>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso wrote:
>>>>>>>>>>
>>>>>>>>>>> Barry,
>>>>>>>>>>> you are correct, I did not use it. I think I now see where the problem is. Correct me if I am wrong, but for the
>>>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through
>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS.
>>>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system.
>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you.
>>>>>>>>>>> Thank you,
>>>>>>>>>>> Michele
>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote:
>>>>>>>>>>>
>>>>>>>>>>>>    Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c
>>>>>>>>>>>>
>>>>>>>>>>>>    Barry
>>>>>>>>>>>>
>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Barry,
>>>>>>>>>>>>>
>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problems.
>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct?
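[A note on the odd-number question just above: for a non-periodic DMDA, refining by a factor of 2 maps M points per direction to 2M-1, so grids with 2^k+1 points coarsen evenly, e.g. 65 -> 33 -> 17 -> 9 -> 5; with periodic boundaries the rule is M -> 2M, so powers of two such as 64 -> 32 -> 16 coarsen cleanly instead. This is presumably what the earlier advice about odd grid-point counts referred to.]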
>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>> >>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>> type: cg >>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>> type: mg >>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>> type: sor >>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>> type: sor >>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>> -options_left >>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>> There is one unused database option. 
It is:
>>>>>>>>>>>>> Option left: name:-da_refine value: 4
>>>>>>>>>>>>>
>>>>>>>>>>>>> Michele
>>>>>>>>>>>>>
>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>>    What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid, and geometric multigrid should be a bit faster.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>    For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>    Barry
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson equation with CG + GAMG, as was suggested in a previous thread.
>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly
>>>>>>>>>>>>>>> whether there are any parameters worth looking into to achieve an even faster solution, for example the number of levels and so on.
>>>>>>>>>>>>>>> So far I am using Dirichlet BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings?
>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot?
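[On the periodic case asked about here: with fully periodic BCs the discrete Poisson operator is singular, with the constant vector in its null space, so the solver needs to be told about it and the right-hand side must be consistent. The sketch below is not part of the original exchange; it shows what this could look like in the PETSc 3.4-era Fortran API, reusing the ksp, b, and ierr names from the code later in the thread:

      MatNullSpace nullsp
      ! sketch (not from the thread): the constant vector spans the
      ! null space of the periodic Poisson matrix
      call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,&
     &     PETSC_NULL_OBJECT,nullsp,ierr)
      ! let the Krylov solver project the null space out of the residual
      call KSPSetNullSpace(ksp,nullsp,ierr)
      ! optionally make the rhs consistent as well (in-place removal)
      call MatNullSpaceRemove(nullsp,b,PETSC_NULL_OBJECT,ierr)
      call MatNullSpaceDestroy(nullsp,ierr)

Beyond the null space information, no GAMG-specific settings should be needed for the periodic case; see Barry's reply later in the thread.]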
>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>> package used to perform 
factorization: petsc >>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>> total number of mallocs 
used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>> There are no unused 
options.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Thank you,
>>>>>>>>>>>>>>> Michele
>>>>>>>>>>>>>>>
>>> <2decomp_fft-1.5.847-modified.tar.gz>
>>>
>>
>
From mrosso at uci.edu  Fri Aug  2 17:14:18 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Fri, 02 Aug 2013 15:14:18 -0700
Subject: [petsc-users] GAMG speed
In-Reply-To: <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu>
 <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu>
 <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu>
 <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu>
 <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu>
 <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov>
Message-ID: <51FC2F3A.8010303@uci.edu>

Thank you!
I will try with a larger mesh as soon as my supercomputer gets back online!
I will let you know how it goes!

Michele

On 08/02/2013 03:11 PM, Barry Smith wrote:
> On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote:
>> Barry,
>>
>> thank you very much for your help. I was trying to debug the error with no success!
>> Now it works like a charm for me too!
>> I still have two questions for you:
>>
>> 1) How did you choose the number of levels to use: trial and error?
>     I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels.
>
>> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter?
>     I don't know of anything.
>
>     But there is a possible problem with -pc_mg_galerkin: PETSc does not transfer the null space information from the fine mesh to the other meshes, and technically we really want the multigrid to remove the null space on all the levels, but usually it will work without worrying about that.
>
>    Barry
>
>> Again, thank you very much!
>>
>> Michele
>>
>> On 08/02/2013 02:38 PM, Barry Smith wrote:
>>>     Finally got it. My failing memory. I had to add the line
>>>
>>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr)
>>>
>>> immediately after KSPSetDM() and
>>>
>>> change
>>>
>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr)
>>>
>>> to
>>>
>>> call DMCreateMatrix(da,MATAIJ,A,ierr)
>>>
>>> so it will work in both parallel and sequential. Then
>>>
>>> -ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2
>>>
>>> works great with 2 levels.
>>>
>>>    Barry
>>>
>>> On Aug 1, 2013, at 6:29 PM, Michele Rosso wrote:
>>>
>>>> Barry,
>>>>
>>>> no problem. I attached the full code in test_poisson_solver.tar.gz.
>>>> My test code is a very reduced version of my production code (an incompressible DNS code), thus fftw3 and the library 2decomp&fft are needed to run it.
>>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem.
>>>> Please contact me for any question/suggestion.
>>>> In the meantime I will try to debug it.
>>>>
>>>> Michele
>>>>
>>>> On 08/01/2013 04:19 PM, Barry Smith wrote:
>>>>>     Run on one process until this is debugged. You can try the option
>>>>>
>>>>> -start_in_debugger noxterm
>>>>>
>>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on?
>>>>>
>>>>>     Barry
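[Collecting Barry's fix above into one place, the corrected setup sequence reads as follows -- a sketch using the same variable names as Michele's code earlier in the thread:

      call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
      call KSPSetDM(ksp,da,ierr)
      ! use the DM only for the grid hierarchy, not to build the operators
      call KSPSetDMActive(ksp,PETSC_FALSE,ierr)
      ! MATAIJ selects the correct AIJ format in both serial and parallel
      call DMCreateMatrix(da,MATAIJ,A,ierr)
      call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr)
      call KSPSetFromOptions(ksp,ierr)

run with -pc_type mg -pc_mg_galerkin -pc_mg_levels 2 -ksp_monitor -ksp_converged_reason -ksp_view.]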
[... earlier quoted messages trimmed: they repeat, verbatim, the error logs, code fragments, and -ksp_view output already shown in full above ...]
>>>>>>>>>>>>>> There is one unused database option. It is:
It is: >>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>> >>>>>>>>>>>>>> Michele >>>>>>>>>>>>>> >>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
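For the periodic case the Poisson operator becomes singular with a constant null space, and whichever multigrid is used it helps to tell the solver about it. A minimal Fortran sketch, using the petsc-3.4 calling sequence (nullsp is an illustrative variable name; later releases attach the null space to the matrix with MatSetNullSpace instead):

    MatNullSpace nullsp
    ! constant null space: has_cnst = PETSC_TRUE, no extra basis vectors
    call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL_OBJECT,nullsp,ierr)
    call KSPSetNullSpace(ksp,nullsp,ierr)
    call MatNullSpaceDestroy(nullsp,ierr)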
>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> linear system matrix = precond matrix: 
>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> 
type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> #PETSc Option Table entries: 
>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>> <2decomp_fft-1.5.847-modified.tar.gz> >>>> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Shuangshuang.Jin at pnnl.gov Fri Aug 2 18:04:37 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Fri, 2 Aug 2013 16:04:37 -0700 Subject: [petsc-users] DIVERGED_NONLINEAR_SOLVE error In-Reply-To: References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F1B@EMAIL04.pnl.gov> Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F8B@EMAIL04.pnl.gov> Thank you, Matt, problem resolved. It was a floating point error; I located it after turning on the debugging option. Thanks, Shuangshuang From: Matthew Knepley [mailto:knepley at gmail.com] Sent: Friday, August 02, 2013 1:43 PM To: Jin, Shuangshuang Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] DIVERGED_NONLINEAR_SOLVE error On Sat, Aug 3, 2013 at 4:41 AM, Jin, Shuangshuang > wrote: Is there a quick way to turn on debugging in my build, or do I have to do the following again? Configure options --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic --download-superlu_dist --download-mumps --download-scalapack --download-parmetis --download-metis --download-elemental It usually takes over an hour to reconfigure PETSc on my machine... That is the way. I think it's time to upgrade your Commodore 64 :) Matt Thanks, Shuangshuang From: Matthew Knepley [mailto:knepley at gmail.com] Sent: Friday, August 02, 2013 1:33 PM To: Jin, Shuangshuang Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] DIVERGED_NONLINEAR_SOLVE error On Sat, Aug 3, 2013 at 4:22 AM, Jin, Shuangshuang > wrote: Hello, My code solves a linear system AX=B using superlu_dist in PETSc and uses some of X's data to solve a DAE problem. I get a very strange error: When I use fewer than 8 processors to run the code, it runs just fine with correct results. When I use more than 8 processors, such as 16 or 32, I get an error and a lot of core.##### files are generated. [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: ! [0]PETSC ERROR: TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Development GIT revision: a0a914e661bf6402b8edabe0f5a2dad46323f69f GIT Date: 2013-06-05 14:18:39 -0500 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. 
[0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: dynSim on a arch-complex named node0055.local by d3m956 Fri Aug 2 11:56:10 2013 [0]PETSC ERROR: Libraries linked from /pic/projects/ds/petsc-dev.6.06.13/arch-complex/lib [0]PETSC ERROR: Configure run at Fri Jul 26 14:32:37 2013 [0]PETSC ERROR: Configure options --with-scalar-type=complex --with-clanguage=C++ PETSC_ARCH=arch-complex --with-fortran-kernels=generic --download-superlu_dist --download-mumps --download-scalapack --download-parmetis --download-metis --download-elemental --with-debugging=0 [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: TSStep() line 2515 in /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c [0]PETSC ERROR: TSSolve() line 2632 in /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c [0]PETSC ERROR: simu() line 566 in "unknowndirectory/"simulation.C [0]PETSC ERROR: runSimulation() line 99 in "unknowndirectory/"dynSim.h [node0055:32539] *** Process received signal *** [node0055:32535] *** Process received signal *** [node0055:32535] Signal: Aborted (6) [node0055:32535] Signal code: (24153104) [node0055:32534] *** Process received signal *** [node0055:32534] Signal: Aborted (6) [node0055:32534] Signal code: (24199552) [node0055:32539] Signal: Aborted (6) [node0055:32539] Signal code: (24157648) [node0055:32537] *** Process received signal *** [node0055:32537] Signal: Aborted (6) [node0055:32537] Signal code: (24546704) [node0055:32538] *** Process received signal *** The Error Message from PETSc pointed out that "TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery!", but I think it's because the superlu_dist computed an all "nan" X as I printed it out. However, I don't understand why using 8 or 16 processors should make such a difference. It sounds like you are computing a NaN somewhere, possibly your residual evaluation. However, we should catch this when we evaluate the norm. Please turn on debugging in your build. Matt Can anyone give me some help for the trouble shooting? Thanks, Shuangshuang -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From Shuangshuang.Jin at pnnl.gov Fri Aug 2 18:05:14 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Fri, 2 Aug 2013 16:05:14 -0700 Subject: [petsc-users] DIVERGED_NONLINEAR_SOLVE error In-Reply-To: References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F1B@EMAIL04.pnl.gov> Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F8C@EMAIL04.pnl.gov> Thank you. I'll definitely try this to make things easier. Shuangshuang -----Original Message----- From: Barry Smith [mailto:bsmith at mcs.anl.gov] Sent: Friday, August 02, 2013 2:03 PM To: Jin, Shuangshuang Cc: Matthew Knepley; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] DIVERGED_NONLINEAR_SOLVE error Use two PETSC_ARCH PETSC_ARCH=arch-complex-debug and PETSC_ARCH=arch-complex-opt then you can switch back and forth between them without rebuilding. 
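A sketch of that two-build workflow (the option list is abbreviated from the configure line quoted below; both build trees live side by side under the same PETSC_DIR):

    ./configure PETSC_ARCH=arch-complex-debug --with-scalar-type=complex --with-clanguage=C++ ...
    ./configure PETSC_ARCH=arch-complex-opt --with-scalar-type=complex --with-clanguage=C++ ... --with-debugging=0

Each tree is configured and compiled once; after that you pick a build simply by setting PETSC_ARCH when compiling and running the application.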
Barry On Aug 2, 2013, at 3:41 PM, "Jin, Shuangshuang" wrote: > Is there a quick way to turn on the debugging in my build, or I have to do the following again? > > Configure options --with-scalar-type=complex --with-clanguage=C++ > PETSC_ARCH=arch-complex --with-fortran-kernels=generic > --download-superlu_dist --download-mumps --download-scalapack > --download-parmetis --download-metis --download-elemental > > It usually takes over an hour to reconfigure PETSc on my machine... > > Thanks, > Shuangshuang > > From: Matthew Knepley [mailto:knepley at gmail.com] > Sent: Friday, August 02, 2013 1:33 PM > To: Jin, Shuangshuang > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] DIVERGED_NONLINEAR_SOLVE error > > On Sat, Aug 3, 2013 at 4:22 AM, Jin, Shuangshuang wrote: > Hello, > > My code solves a linear system AX=B using superlu_dist in PETSc, and use some of X's data to solve a DAE problem. I get a very wild error: > > When I use less than 8 processors to run the code, it runs just fine with correct results. When I use greater than 8 processors, such as 16 or 32 processors, I'll get an error and a lot of generated core.##### files. > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: ! > [0]PETSC ERROR: TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery! > [0]PETSC ERROR: > ---------------------------------------------------------------------- > -- [0]PETSC ERROR: Petsc Development GIT revision: > a0a914e661bf6402b8edabe0f5a2dad46323f69f GIT Date: 2013-06-05 > 14:18:39 -0500 [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: > ---------------------------------------------------------------------- > -- [0]PETSC ERROR: dynSim on a arch-complex named node0055.local by > d3m956 Fri Aug 2 11:56:10 2013 [0]PETSC ERROR: Libraries linked from > /pic/projects/ds/petsc-dev.6.06.13/arch-complex/lib > [0]PETSC ERROR: Configure run at Fri Jul 26 14:32:37 2013 [0]PETSC > ERROR: Configure options --with-scalar-type=complex > --with-clanguage=C++ PETSC_ARCH=arch-complex > --with-fortran-kernels=generic --download-superlu_dist > --download-mumps --download-scalapack --download-parmetis > --download-metis --download-elemental --with-debugging=0 [0]PETSC > ERROR: > ---------------------------------------------------------------------- > -- [0]PETSC ERROR: TSStep() line 2515 in > /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c > [0]PETSC ERROR: TSSolve() line 2632 in > /pic/projects/ds/petsc-dev.6.06.13/src/ts/interface/ts.c > [0]PETSC ERROR: simu() line 566 in "unknowndirectory/"simulation.C > [0]PETSC ERROR: runSimulation() line 99 in "unknowndirectory/"dynSim.h > [node0055:32539] *** Process received signal *** [node0055:32535] *** > Process received signal *** [node0055:32535] Signal: Aborted (6) > [node0055:32535] Signal code: (24153104) [node0055:32534] *** Process > received signal *** [node0055:32534] Signal: Aborted (6) > [node0055:32534] Signal code: (24199552) [node0055:32539] Signal: > Aborted (6) [node0055:32539] Signal code: (24157648) [node0055:32537] > *** Process received signal *** [node0055:32537] Signal: Aborted (6) > [node0055:32537] Signal code: (24546704) [node0055:32538] *** Process > received signal *** > > The Error Message from PETSc pointed out that "TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery!", but I think it's because the superlu_dist computed an all "nan" X as I printed it out. > > However, I don't understand why using 8 or 16 processors should make such a difference. > > It sounds like you are computing a NaN somewhere, possibly your > residual evaluation. However, we should catch this when we evaluate the norm. Please turn on debugging in your build. > > Matt > > Can anyone give me some help for the trouble shooting? > > Thanks, > Shuangshuang > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener From mfadams at lbl.gov Fri Aug 2 18:06:09 2013 From: mfadams at lbl.gov (Mark F. Adams) Date: Fri, 2 Aug 2013 19:06:09 -0400 Subject: [petsc-users] GAMG speed In-Reply-To: <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> Message-ID: On Aug 2, 2013, at 6:11 PM, Barry Smith wrote: > > On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote: > >> Barry, >> >> thank you very much for your help. I was trying to debug the error with no success! >> Now it works like a charm for me too! >> I have still two questions for you: >> >> 1) How did you choose the number of levels to use: trial and error? > > I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels. 
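(For reference, the level count follows the refinement count: -da_refine n builds n+1 DMDA levels, so a run along the lines of

    mpiexec -np 4 ./test -pc_type mg -pc_mg_galerkin -da_refine 4 -pc_mg_levels 5 -ksp_view

would use all five grids, while -pc_mg_levels 2 keeps only the two finest; the executable name and counts are illustrative.)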
> >> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter? > > I don't know of anything. > Don't you need an iterative coarse grid solver? > But there is a possible problem with -pc_mg_galerkin, PETSc does not transfer the null space information from the fine mesh to the other meshes and technically we really want the multigrid to remove the null space on all the levels but usually it will work without worrying about that. I've found that the one outer cleaning is plenty and in fact can work w/o it. > > Barry > >> >> Again, thank you very much! >> >> Michele >> >> On 08/02/2013 02:38 PM, Barry Smith wrote: >>> Finally got it. My failing memory. I had to add the line >>> >>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >>> >>> immediately after KSPSetDM() and >>> >>> change >>> >>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>> >>> to >>> >>> call DMCreateMatrix(da,MATAIJ,A,ierr) >>> >>> so it will work in both parallel and sequential then >>> >>> ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 >>> >>> works great with 2 levels. >>> >>> Barry >>> >>> >>> >>> >>> On Aug 1, 2013, at 6:29 PM, Michele Rosso >>> >>> wrote: >>> >>> >>>> Barry, >>>> >>>> no problem. I attached the full code in test_poisson_solver.tar.gz. >>>> My test code is a very reduced version of my productive code (incompressible DNS code) thus fftw3 and the library 2decomp&fft are needed to run it. >>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem. >>>> Please, contact me for any question/suggestion. >>>> I the mean time I will try to debug it. >>>> >>>> Michele >>>> >>>> >>>> >>>> >>>> On 08/01/2013 04:19 PM, Barry Smith wrote: >>>> >>>>> Run on one process until this is debugged. You can try the option >>>>> >>>>> -start_in_debugger noxterm >>>>> >>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? >>>>> >>>>> Barry >>>>> >>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>>>> >>>>> >>>>> >>>>> wrote: >>>>> >>>>> >>>>> >>>>>> Barry, >>>>>> >>>>>> I checked the matrix: the element (0,0) is not zero, nor any other diagonal element is. >>>>>> The matrix is symmetric positive define (i.e. the standard Poisson matrix). >>>>>> Also, -da_refine is never used (see previous output). >>>>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left >>>>>> >>>>>> and now the error is different: >>>>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>> [1]PETSC ERROR: Floating point exception! >>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>> [2]PETSC ERROR: Floating point exception! 
>>>>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>> [3]PETSC ERROR: Floating point exception! >>>>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>> [3]PETSC ERROR: Configure options >>>>>> Configure run at Thu Aug 1 12:01:44 2013 >>>>>> [1]PETSC ERROR: Configure options >>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>> Configure options >>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [1]PETSC ERROR: 
KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> --------------------- Error Message ------------------------------------ >>>>>> [0]PETSC ERROR: Floating point exception! >>>>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>> [0]PETSC ERROR: Configure options >>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> >>>>>> #PETSc Option Table entries: >>>>>> -ksp_view >>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>> -mg_levels_ksp_type chebyshev >>>>>> -mg_levels_pc_type jacobi >>>>>> -options_left >>>>>> -pc_mg_galerkin >>>>>> -pc_type mg >>>>>> #End of PETSc Option Table entries >>>>>> There are no unused options. >>>>>> >>>>>> Michele >>>>>> >>>>>> >>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>>> >>>>>> >>>>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot us Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebychev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>> >>>>>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> wrote: >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>>> Barry, >>>>>>>> >>>>>>>> here it is the fraction of code where I set the rhs term and the matrix. >>>>>>>> >>>>>>>> ! Create matrix >>>>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>> >>>>>>>> ! Create rhs term >>>>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>>>> >>>>>>>> ! Solve system >>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>> call KSPSetUp(ksp,ierr) >>>>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>>>> >>>>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>>>>> >>>>>>>> >>>>>>>> Michele >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? 
The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> >>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Barry, >>>>>>>>>> >>>>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>>>>> >>>>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>>>> >>>>>>>>>> The output is: >>>>>>>>>> >>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>> [2]PETSC ERROR: Configure options >>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> Arguments are incompatible! >>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> Zero diagonal on row 0! 
>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> See docs/index.html for manual pages. >>>>>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: ./test on a 
linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>> -da_refine 4 >>>>>>>>>> -ksp_view >>>>>>>>>> -options_left >>>>>>>>>> -pc_mg_galerkin >>>>>>>>>> -pc_type mg >>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>> There is one unused database option. It is: >>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>> >>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>>> ! Create Global Vectors >>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>> ! Create matrix >>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>> ! Create solver >>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! 
Real residual >>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>> >>>>>>>>>> ! To allow using option from command line >>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Michele >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>>>> >>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Barry, >>>>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>>>>>>> Thank you, >>>>>>>>>>>> Michele >>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>> >>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? 
>>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>>> >>>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>> type: mg >>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>> There is one unused database option. 
It is: >>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>> >>>>>>>>>>>>>> Michele >>>>>>>>>>>>>> >>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> linear system matrix = precond matrix: 
>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> 
type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> #PETSc Option Table entries: 
>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>> <2decomp_fft-1.5.847-modified.tar.gz> >>>> >>> >> > From bsmith at mcs.anl.gov Fri Aug 2 18:25:09 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 2 Aug 2013 18:25:09 -0500 Subject: [petsc-users] GAMG speed In-Reply-To: References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> Message-ID: On Aug 2, 2013, at 6:06 PM, "Mark F. Adams" wrote: > > On Aug 2, 2013, at 6:11 PM, Barry Smith wrote: > >> >> On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote: >> >>> Barry, >>> >>> thank you very much for your help. I was trying to debug the error with no success! >>> Now it works like a charm for me too! >>> I have still two questions for you: >>> >>> 1) How did you choose the number of levels to use: trial and error? >> >> I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels. >> >>> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter? >> >> I don't know of anything. >> > > Don't you need an iterative coarse grid solver? Yes, right or the hack of -mg_coarse_pc_factor_shift_type nonzero Barry > >> But there is a possible problem with -pc_mg_galerkin, PETSc does not transfer the null space information from the fine mesh to the other meshes and technically we really want the multigrid to remove the null space on all the levels but usually it will work without worrying about that. > > I've found that the one outer cleaning is plenty and in fact can work w/o it. > >> >> Barry >> >>> >>> Again, thank you very much! >>> >>> Michele >>> >>> On 08/02/2013 02:38 PM, Barry Smith wrote: >>>> Finally got it. My failing memory. I had to add the line >>>> >>>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >>>> >>>> immediately after KSPSetDM() and >>>> >>>> change >>>> >>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>> >>>> to >>>> >>>> call DMCreateMatrix(da,MATAIJ,A,ierr) >>>> >>>> so it will work in both parallel and sequential then >>>> >>>> ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 >>>> >>>> works great with 2 levels. >>>> >>>> Barry >>>> >>>> >>>> >>>> >>>> On Aug 1, 2013, at 6:29 PM, Michele Rosso >>>> >>>> wrote: >>>> >>>> >>>>> Barry, >>>>> >>>>> no problem. I attached the full code in test_poisson_solver.tar.gz. >>>>> My test code is a very reduced version of my productive code (incompressible DNS code) thus fftw3 and the library 2decomp&fft are needed to run it. >>>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem. >>>>> Please, contact me for any question/suggestion. >>>>> I the mean time I will try to debug it. >>>>> >>>>> Michele >>>>> >>>>> >>>>> >>>>> >>>>> On 08/01/2013 04:19 PM, Barry Smith wrote: >>>>> >>>>>> Run on one process until this is debugged. 
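For the singular periodic case raised in the exchange above, the constant null space can also be attached explicitly so the Krylov solve projects it out. A hedged Fortran sketch, assuming the 3.4-era interface (KSPSetNullSpace was later superseded by MatSetNullSpace, and MatNullSpaceRemove dropped its third argument in later releases); "nullsp" is a new variable introduced here:

    MatNullSpace nullsp
    ! Constants lie in the null space of the periodic Poisson operator
    call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0, &
   &                        PETSC_NULL_OBJECT,nullsp,ierr)
    call KSPSetNullSpace(ksp,nullsp,ierr)
    ! Make the right-hand side consistent as well
    call MatNullSpaceRemove(nullsp,b,PETSC_NULL_OBJECT,ierr)
    call MatNullSpaceDestroy(nullsp,ierr)

On the coarse level, the iterative alternative Mark mentions would be spelled along the lines of -mg_coarse_ksp_type cg -mg_coarse_pc_type jacobi, as opposed to keeping the default LU with the -mg_coarse_pc_factor_shift_type nonzero workaround.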
You can try the option >>>>>> >>>>>> -start_in_debugger noxterm >>>>>> >>>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? >>>>>> >>>>>> Barry >>>>>> >>>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>>>>> >>>>>> >>>>>> >>>>>> wrote: >>>>>> >>>>>> >>>>>> >>>>>>> Barry, >>>>>>> >>>>>>> I checked the matrix: the element (0,0) is not zero, nor any other diagonal element is. >>>>>>> The matrix is symmetric positive define (i.e. the standard Poisson matrix). >>>>>>> Also, -da_refine is never used (see previous output). >>>>>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left >>>>>>> >>>>>>> and now the error is different: >>>>>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> [1]PETSC ERROR: Floating point exception! >>>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> [2]PETSC ERROR: Floating point exception! >>>>>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> [3]PETSC ERROR: Floating point exception! >>>>>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [3]PETSC ERROR: Configure options >>>>>>> Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [1]PETSC ERROR: Configure options >>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>> Configure options >>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [3]PETSC ERROR: KSPSolve_CG() 
line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> --------------------- Error Message ------------------------------------ >>>>>>> [0]PETSC ERROR: Floating point exception! >>>>>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [0]PETSC ERROR: Configure options >>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> >>>>>>> #PETSc Option Table entries: >>>>>>> -ksp_view >>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>> -mg_levels_ksp_type chebyshev >>>>>>> -mg_levels_pc_type jacobi >>>>>>> -options_left >>>>>>> -pc_mg_galerkin >>>>>>> -pc_type mg >>>>>>> #End of PETSc Option Table entries >>>>>>> There are no unused options. >>>>>>> >>>>>>> Michele >>>>>>> >>>>>>> >>>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>>>> >>>>>>> >>>>>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot us Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebychev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>> >>>>>>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. 
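A quick way to test the zero-diagonal hypothesis on a small grid, short of reading the full MatView() dump; a sketch only, with "diag" a scratch vector introduced here and A, x as in the earlier snippets:

    Vec diag
    call VecDuplicate(x,diag,ierr)
    call MatGetDiagonal(A,diag,ierr)
    ! Any zero entry here rules out SOR/Gauss-Seidel smoothing and
    ! produces exactly the "Zero diagonal on row 0" failure quoted below
    call VecView(diag,PETSC_VIEWER_STDOUT_WORLD,ierr)
    call VecDestroy(diag,ierr)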
>>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Barry, >>>>>>>>> >>>>>>>>> here it is the fraction of code where I set the rhs term and the matrix. >>>>>>>>> >>>>>>>>> ! Create matrix >>>>>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>> >>>>>>>>> ! Create rhs term >>>>>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>>>>> >>>>>>>>> ! Solve system >>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>> call KSPSetUp(ksp,ierr) >>>>>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>>>>> >>>>>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>>>>>> >>>>>>>>> >>>>>>>>> Michele >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Barry, >>>>>>>>>>> >>>>>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>>>>>> >>>>>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>>>>> >>>>>>>>>>> The output is: >>>>>>>>>>> >>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [2]PETSC ERROR: Configure options >>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> Arguments are incompatible! >>>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> Zero diagonal on row 0! >>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> See docs/index.html for manual pages. 
>>>>>>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in 
src/ksp/pc/impls/sor/sor.c >>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>> -da_refine 4 >>>>>>>>>>> -ksp_view >>>>>>>>>>> -options_left >>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>> -pc_type mg >>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>> There is one 
unused database option. It is: >>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>>> >>>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>>>> ! Create Global Vectors >>>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>>> ! Create matrix >>>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>>> ! Create solver >>>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>>> >>>>>>>>>>> ! To allow using option from command line >>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Michele >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>>>>> >>>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Barry, >>>>>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>>>>>>>> Thank you, >>>>>>>>>>>>> Michele >>>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? 
See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> Barry >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >>>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>> type: mg >>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=262144, cols=262144 
>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>> type: 
seqaij >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using 
I-node routines >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=262144, cols=262144 
>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>> <2decomp_fft-1.5.847-modified.tar.gz> >>>>> >>>> >>> >> >
From danyang.su at gmail.com Fri Aug 2 19:43:45 2013 From: danyang.su at gmail.com (Danyang Su) Date: Fri, 02 Aug 2013 17:43:45 -0700 Subject: [petsc-users] How to configure metis/parmetis in cygwin Message-ID: <51FC5241.7070704@gmail.com>
Hi All, I can install petsc successfully in CYGWIN without metis/parmetis, but when I configure with metis or parmetis, I get errors. First I tried the following configuration:
./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-64-bit-indices --download-f-blas-lapack --download-superlu_dist --download-mumps --download-hypre --download-parmetis --download-metis
There was an error building the metis library, so I built metis and parmetis manually and then configured with the following options:
./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' --with-cxx='win32fe cl' --with-64-bit-indices --with-parmetis-include=/cygdrive/c/cygwin/packages/parmetis-4.0.3/include --with-parmetis-lib=/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmetis/Release/parmetis.lib --with-metis-include=/cygdrive/c/cygwin/packages/parmetis-4.0.3/metis/include --with-metis-lib=/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib --download-f-blas-lapack --download-superlu_dist --download-mumps --download-hypre
Then I get the following error:
******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- --with-metis-lib=['/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib'] and --with-metis-include=['/cygdrive/c/cygwin/packages/parmetis-4.0.3/metis/include'] did not work *******************************************************************************
The log file for the last configuration is attached. Thanks and regards, Danyang
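A minimal, untested sketch of one possible workaround, assuming the failure comes from handing Cygwin-style /cygdrive paths to the Microsoft toolchain: the stock Cygwin utility cygpath can rewrite such paths into mixed C:/... form before configure sees them. The METIS_ROOT variable is introduced here only for brevity; the paths mirror the report above, and configure.log remains the authoritative record of the compile/link test that actually failed.

# Hypothetical sketch: convert /cygdrive paths with cygpath -m (mixed C:/... form)
# before passing the manually built metis/parmetis libraries to PETSc configure.
METIS_ROOT=/cygdrive/c/cygwin/packages/parmetis-4.0.3
./configure --with-cc='win32fe cl' --with-fc='win32fe ifort' --with-cxx='win32fe cl' \
  --with-64-bit-indices \
  --with-metis-include=$(cygpath -m "$METIS_ROOT/metis/include") \
  --with-metis-lib=$(cygpath -m "$METIS_ROOT/build/libmetis/Release/metis.lib") \
  --with-parmetis-include=$(cygpath -m "$METIS_ROOT/include") \
  --with-parmetis-lib=$(cygpath -m "$METIS_ROOT/build/libparmetis/Release/parmetis.lib") \
  --download-f-blas-lapack --download-superlu_dist --download-mumps --download-hypre

If the same "did not work" message appears, the failing test link recorded in configure.log will show whether the path form or the library itself (for example, metis built without 64-bit IDXTYPEWIDTH while PETSc uses --with-64-bit-indices) is the problem.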
-------------- next part -------------- Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC =============================================================================== ** Cygwin-python detected. Threads do not work correctly. *** ** Disabling thread usage for this run of ./configure ******* =============================================================================== ================================================================================ ================================================================================ Starting Configure Run at Fri Aug 2 17:26:21 2013 Configure Options: --configModules=PETSc.Configure --optionsModule=PETSc.compilerOptions --with-cc="win32fe cl" --with-fc="win32fe ifort" --with-cxx="win32fe cl" --with-64-bit-indices --with-parmetis-include=/cygdrive/c/cygwin/packages/parmetis-4.0.3/include --with-parmetis-lib=/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmetis/Release/parmetis.lib --with-metis-include=/cygdrive/c/cygwin/packages/parmetis-4.0.3/metis/include --with-metis-lib=/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib --download-f-blas-lapack --download-superlu_dist --download-mumps --download-hypre --useThreads=0 Working directory: /cygdrive/c/cygwin/packages/petsc-3.4.2 Machine platform: ('CYGWIN_NT-6.1-WOW64', 'nwmop', '1.7.22(0.268/5/3)', '2013-07-22 17:06', 'i686', '') Python version: 2.7.3 (default, Dec 18 2012, 13:50:09) [GCC 4.5.3] ================================================================================ Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC ================================================================================ TEST configureExternalPackagesDir from config.framework(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/framework.py:821) TESTING: configureExternalPackagesDir from config.framework(config/BuildSystem/config/framework.py:821) ================================================================================ TEST configureDebuggers from PETSc.utilities.debuggers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/debuggers.py:22) TESTING: configureDebuggers from PETSc.utilities.debuggers(config/PETSc/utilities/debuggers.py:22) Find a default debugger and determine its arguments Checking for program /usr/local/bin/gdb...not found Checking for program /usr/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/gdb...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/gdb...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/gdb...not found Checking for program /cygdrive/c/Program Files
(x86)/Microsoft SDKs/Windows/v7.0A/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/gdb...not found Checking for program /cygdrive/c/Windows/system32/gdb...not found Checking for program /cygdrive/c/Windows/gdb...not found Checking for program /cygdrive/c/Windows/System32/Wbem/gdb...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/gdb...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/gdb...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/gdb...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/gdb...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/gdb...not found Checking for program /cygdrive/c/MinGW/bin/gdb...found Defined make macro "GDB" to "/cygdrive/c/MinGW/bin/gdb" Checking for program /usr/local/bin/dbx...not found Checking for program /usr/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/dbx...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/dbx...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft 
SDKs/Windows/v7.0A/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/dbx...not found Checking for program /cygdrive/c/Windows/system32/dbx...not found Checking for program /cygdrive/c/Windows/dbx...not found Checking for program /cygdrive/c/Windows/System32/Wbem/dbx...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dbx...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/dbx...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/dbx...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/dbx...not found Checking for program /cygdrive/c/MinGW/bin/dbx...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/dbx...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/dbx...not found Checking for program /home/dsu/dbx...not found Checking for program /usr/local/bin/xdb...not found Checking for program /usr/bin/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/xdb...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/xdb...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/xdb...not found Checking for 
program /cygdrive/c/Program Files (x86)/HTML Help Workshop/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/xdb...not found Checking for program /cygdrive/c/Windows/system32/xdb...not found Checking for program /cygdrive/c/Windows/xdb...not found Checking for program /cygdrive/c/Windows/System32/Wbem/xdb...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/xdb...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/xdb...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/xdb...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/xdb...not found Checking for program /cygdrive/c/MinGW/bin/xdb...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/xdb...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/xdb...not found Checking for program /home/dsu/xdb...not found Checking for program /usr/local/bin/dsymutil...not found Checking for program /usr/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/dsymutil...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/dsymutil...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/dsymutil...not found Checking for program 
/cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/dsymutil...not found Checking for program /cygdrive/c/Windows/system32/dsymutil...not found Checking for program /cygdrive/c/Windows/dsymutil...not found Checking for program /cygdrive/c/Windows/System32/Wbem/dsymutil...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/dsymutil...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/dsymutil...not found Checking for program /cygdrive/c/MinGW/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/dsymutil...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/dsymutil...not found Checking for program /home/dsu/dsymutil...not found Defined make macro "DSYMUTIL" to "true" Defined "USE_GDB_DEBUGGER" to "1" 
================================================================================ TEST configureCLanguage from PETSc.utilities.languages(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/languages.py:28) TESTING: configureCLanguage from PETSc.utilities.languages(config/PETSc/utilities/languages.py:28) Choose whether to compile the PETSc library using a C or C++ compiler C language is C Defined "CLANGUAGE_C" to "1" ================================================================================ TEST configureFortranLanguage from PETSc.utilities.languages(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/languages.py:37) TESTING: configureFortranLanguage from PETSc.utilities.languages(config/PETSc/utilities/languages.py:37) Turn on Fortran bindings Using Fortran ================================================================================ TEST configureMake from config.programs(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/programs.py:24) TESTING: configureMake from config.programs(config/BuildSystem/config/programs.py:24) Check various things about make Checking for program /usr/local/bin/make...not found Checking for program /usr/bin/make...found Defined make macro "MAKE" to "/usr/bin/make" Checking for program /usr/local/bin/strings...not found Checking for program /usr/bin/strings...found Defined make macro "STRINGS" to "/usr/bin/strings" sh: /usr/bin/strings /usr/bin/make Executing: /usr/bin/strings /usr/bin/make sh: [several hundred lines of raw strings(1) output from the /usr/bin/make binary omitted] Defined make macro "OMAKE " to "/usr/bin/make --no-print-directory" Defined make rule "libc" with dependencies "${LIBNAME}(${OBJSC})" and code [] Defined make rule "libcu" with dependencies "${LIBNAME}(${OBJSCU})" and code [] Defined make rule "libf" with dependencies "${OBJSF}" and code -${AR} ${AR_FLAGS} ${LIBNAME} ${OBJSF} module multiprocessing found 12 cores: using make_np = 5 Defined make macro "MAKE_NP" to "5" ================================================================================ TEST configureMkdir from config.programs(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/programs.py:111) TESTING: configureMkdir from config.programs(config/BuildSystem/config/programs.py:111) Make sure we can have mkdir automatically make intermediate directories Checking for program /usr/local/bin/mkdir...not found Checking for program /usr/bin/mkdir...found sh: /usr/bin/mkdir -p .conftest/tmp Executing: /usr/bin/mkdir -p .conftest/tmp sh: Adding -p flag to /usr/bin/mkdir -p to automatically create directories Defined make macro "MKDIR" to "/usr/bin/mkdir -p" ================================================================================ TEST configurePrograms from config.programs(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/programs.py:133) TESTING: configurePrograms from config.programs(config/BuildSystem/config/programs.py:133) Check for the programs needed to build and run PETSc Checking for program /usr/local/bin/sh...not found Checking for program /usr/bin/sh...found Defined make macro "SHELL" to "/usr/bin/sh" Checking for program /usr/local/bin/sed...not found Checking for program /usr/bin/sed...found Defined make macro "SED" to "/usr/bin/sed" All intermediate test results are stored in /tmp/petsc-1nzsmm All intermediate test results are stored in /tmp/petsc-1nzsmm/config.programs sh: /usr/bin/sed -i s/sed/sd/g "/tmp/petsc-1nzsmm/config.programs/sed1" Executing: /usr/bin/sed -i s/sed/sd/g "/tmp/petsc-1nzsmm/config.programs/sed1" sh: Adding SEDINPLACE cmd: /usr/bin/sed -i Defined make macro "SEDINPLACE" to "/usr/bin/sed -i" Checking for program /usr/local/bin/mv...not found Checking for program /usr/bin/mv...found Defined make macro "MV" to "/usr/bin/mv" Checking for program /usr/local/bin/cp...not found Checking for program
/usr/bin/cp...found Defined make macro "CP" to "/usr/bin/cp" Checking for program /usr/local/bin/grep...not found Checking for program /usr/bin/grep...found Defined make macro "GREP" to "/usr/bin/grep" Checking for program /usr/local/bin/rm...not found Checking for program /usr/bin/rm...found Defined make macro "RM" to "/usr/bin/rm -f" Checking for program /usr/local/bin/diff...not found Checking for program /usr/bin/diff...found sh: "/usr/bin/diff" -w "/tmp/petsc-1nzsmm/config.programs/diff1" "/tmp/petsc-1nzsmm/config.programs/diff2" Executing: "/usr/bin/diff" -w "/tmp/petsc-1nzsmm/config.programs/diff1" "/tmp/petsc-1nzsmm/config.programs/diff2" sh: Defined make macro "DIFF" to "/usr/bin/diff -w" Checking for program /usr/ucb/ps...not found Checking for program /usr/usb/ps...not found Checking for program /home/dsu/ps...not found Checking for program /usr/local/bin/gzip...not found Checking for program /usr/bin/gzip...found Defined make macro "GZIP" to "/usr/bin/gzip" Defined "HAVE_GZIP" to "1" Defined make macro "PYTHON" to "/usr/bin/python" ================================================================================ TEST configureGit from config.sourceControl(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/sourceControl.py:24) TESTING: configureGit from config.sourceControl(config/BuildSystem/config/sourceControl.py:24) Find the Git executable Checking for program /usr/local/bin/git...not found Checking for program /usr/bin/git...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/git...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/git...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/git...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/git...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/git...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/git...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/git...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/git...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/git...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/git...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/git...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/git...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/git...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/git...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/git...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/git...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/git...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/git...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/git...not found Checking for program /cygdrive/c/Program Files 
(x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/git...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/git...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/git...not found Checking for program /cygdrive/c/Windows/system32/git...not found Checking for program /cygdrive/c/Windows/git...not found Checking for program /cygdrive/c/Windows/System32/Wbem/git...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/git...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/git...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/git...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/git...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/git...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/git...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/git...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/git...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/git...not found Checking for program /cygdrive/c/MinGW/bin/git...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/git...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/git...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/git...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/git...not found Checking for program /home/dsu/git...not found ================================================================================ TEST configureMercurial from config.sourceControl(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/sourceControl.py:35) TESTING: configureMercurial from config.sourceControl(config/BuildSystem/config/sourceControl.py:35) Find the Mercurial executable Checking for program /usr/local/bin/hg...not found Checking for program /usr/bin/hg...found Defined make macro "HG" to "hg" sh: hg version -q Executing: hg version -q sh: Mercurial Distributed SCM (version 2.5.2) ================================================================================ TEST configureCVS from config.sourceControl(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/sourceControl.py:46) TESTING: configureCVS from config.sourceControl(config/BuildSystem/config/sourceControl.py:46) Find the CVS executable Checking for program /usr/local/bin/cvs...not found Checking for program /usr/bin/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/cvs...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/cvs...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/cvs...not found Checking for program /cygdrive/c/Program Files 
(x86)/Microsoft Visual Studio 10.0/Common7/Tools/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/cvs...not found Checking for program /cygdrive/c/Windows/system32/cvs...not found Checking for program /cygdrive/c/Windows/cvs...not found Checking for program /cygdrive/c/Windows/System32/Wbem/cvs...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/cvs...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/cvs...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/cvs...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/cvs...not found Checking for program /cygdrive/c/MinGW/bin/cvs...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/cvs...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/cvs...not found Checking for program /home/dsu/cvs...not found ================================================================================ TEST configureSubversion from config.sourceControl(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/sourceControl.py:55) TESTING: configureSubversion from config.sourceControl(config/BuildSystem/config/sourceControl.py:55) Find the Subversion executable Checking for program /usr/local/bin/svn...not found Checking for program /usr/bin/svn...not found Checking for program /cygdrive/c/Program Files 
(x86)/Intel/Composer XE 2013/bin/intel64/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/svn...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/svn...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/svn...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/svn...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/svn...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/svn...not found Checking for program /cygdrive/c/Windows/system32/svn...not found Checking for program /cygdrive/c/Windows/svn...not found Checking for program /cygdrive/c/Windows/System32/Wbem/svn...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/svn...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/svn...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/svn...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/svn...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/svn...found
Defined make macro "SVN" to "svn"
sh: svn --version -q
Executing: svn --version -q
sh: 1.7.7
================================================================================
TEST configureDirectories from PETSc.utilities.petscdir(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/petscdir.py:28)
TESTING: configureDirectories from PETSc.utilities.petscdir(config/PETSc/utilities/petscdir.py:28)
Checks PETSC_DIR and sets if not set
Version Information:
#define PETSC_VERSION_RELEASE 1
#define PETSC_VERSION_MAJOR 3
#define PETSC_VERSION_MINOR 4
#define PETSC_VERSION_SUBMINOR 2
#define PETSC_VERSION_PATCH 0
#define PETSC_VERSION_DATE "Jul, 02, 2013"
#define PETSC_VERSION_GIT "a071802d3efee8b987703a6ce2cf5d9a25fa8160"
#define PETSC_VERSION_DATE_GIT "2013-07-02 09:33:06 -0500"
#define PETSC_VERSION_(MAJOR,MINOR,SUBMINOR) \
#define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \
#define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \
#define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \
#define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \
Defined make macro "DIR" to "/cygdrive/c/cygwin/packages/petsc-3.4.2"
Defined "DIR" to ""/cygdrive/c/cygwin/packages/petsc-3.4.2""
================================================================================
TEST configureExternalPackagesDir from PETSc.utilities.petscdir(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/petscdir.py:77)
TESTING: configureExternalPackagesDir from PETSc.utilities.petscdir(config/PETSc/utilities/petscdir.py:77)
================================================================================
TEST configureInstallationMethod from PETSc.utilities.petscdir(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/petscdir.py:84)
TESTING: configureInstallationMethod from PETSc.utilities.petscdir(config/PETSc/utilities/petscdir.py:84)
This is a tarball installation
================================================================================
TEST configureETags from PETSc.utilities.Etags(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/Etags.py:27)
TESTING: configureETags from PETSc.utilities.Etags(config/PETSc/utilities/Etags.py:27)
Determine if etags files exist and try to create otherwise
Found etags file
================================================================================
TEST getDatafilespath from PETSc.utilities.dataFilesPath(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/dataFilesPath.py:29)
TESTING: getDatafilespath from PETSc.utilities.dataFilesPath(config/PETSc/utilities/dataFilesPath.py:29)
Checks what DATAFILESPATH should be
Defined make macro "DATAFILESPATH" to "None"
================================================================================
TEST printEnvVariables from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1504)
TESTING: printEnvVariables from config.setCompilers(config/BuildSystem/config/setCompilers.py:1504)
**** printenv ****
LIB=C:\Program Files (x86)\Intel\Composer XE 2013\compiler\lib;C:\Program Files (x86)\Intel\Composer XE 2013\compiler\lib\intel64;C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\LIB\amd64;C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\ATLMFC\LIB\amd64;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\lib\x64;C:\Program Files (x86)\Intel\Composer XE 2013\mkl\lib\intel64;C:\Program Files (x86)\Intel\Composer XE 2013\compiler\lib\intel64; VS100COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\ COMPUTERNAME=NWMOP SCRIPT_NAME=compilervars_arch.bat ADVISOR_XE_2013_DIR=C:\Program Files (x86)\Intel\Advisor XE 2013\.\ !C:=C:\Program Files (x86)\Intel\Composer XE 2013 WIN_TITLE_ARCH=Intel(R) 64 PRODUCT_NAME=Intel Composer XE 2013 INFOPATH=/usr/local/info:/usr/share/info:/usr/info: SHELL=/bin/bash CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files INTEL_LICENSE_FILE=C:\Program Files (x86)\Common Files\Intel\Licenses FrameworkVersion64=v4.0.30319 MANPATH=/usr/local/man:/usr/share/man:/usr/man: FrameworkDir=C:\Windows\Microsoft.NET\Framework64 ARCH_PATH_MPI=em64t COMSPEC=C:\Windows\system32\cmd.exe WIN_TITLE_VS=Visual Studio 2010 ARCH_PATH=intel64
TARGET_ARCH=intel64 HOMEDRIVE=C: MKLROOT=C:\Program Files (x86)\Intel\Composer XE 2013\mkl FrameworkVersion=v4.0.30319 SYSTEMDRIVE=C: HOSTNAME=nwmop PETSC_ARCH=arch-mswin-c-debug PROCESSOR_LEVEL=6 OS=Windows_NT C_INCLUDE_PATH=C:\MinGW\include PRODUCT_NAME_FULL=Intel(R) Composer XE 2013 Update 5 (package 198) TARGET_VS=vs2010 INTEL_DEV_REDIST=C:\Program Files (x86)\Common Files\Intel\Shared Libraries\ Platform=X64 CommandPromptType=Native USER=dsu IFORT_COMPILER13=C:\Program Files (x86)\Intel\Composer XE 2013\ SYSTEMROOT=C:\Windows PS1=\[\e]0;\w\a\]\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$ tmp=C:\Users\dsu\AppData\Local\Temp TEMP=/tmp SHLVL=1 VISUALSVN_SERVER=C:\Program Files (x86)\VisualSVN Server\ PETSC_DIR=/cygdrive/c/cygwin/packages/petsc-3.4.2 HOMEPATH=\Users\dsu WindowsSdkDir=C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\ ROOT=C:\Program Files (x86)\Intel\Composer XE 2013 LOGONSERVER=\\NWMOP C_TARGET_ARCH=intel64 MSVS_VAR_SCRIPT="C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\..\..\VC\vcvarsall.bat" PRINTER=HP LaserJet P1505n INSPECTOR_2013_DIR=C:\Program Files (x86)\Intel\Inspector XE 2013\ SESSIONNAME=Console INCLUDE=C:\Program Files (x86)\Intel\Composer XE 2013\compiler\include;C:\Program Files (x86)\Intel\Composer XE 2013\compiler\include\intel64;C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\INCLUDE;C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\ATLMFC\INCLUDE;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\include;C:\Program Files (x86)\Intel\Composer XE 2013\mkl\include; PATH_PHAST=C:\Program Files (x86)\USGS\phast-2.4.1-7430\\bin APPDATA=C:\Users\dsu\AppData\Roaming PUBLIC=C:\Users\Public VBOX_INSTALL_PATH=C:\Program Files\Oracle\VirtualBox\ TMP=/tmp PSModulePath=C:\Windows\system32\WindowsPowerShell\v1.0\Modules\ USERDOMAIN=nwmop COMMONPROGRAMFILES=C:\Program Files (x86)\Common Files HOME=/home/dsu LANG=en_US.UTF-8 LIBRARY_PATH=C:\MinGW\lib ProgramData=C:\ProgramData PROCESSOR_ARCHITECTURE=x86 ALLUSERSPROFILE=C:\ProgramData _=./configure BUNDLE_NAME=Intel(R) Parallel Studio XE 2013 ProgramFiles(x86)=C:\Program Files (x86) ProgramW6432=C:\Program Files USERNAME=dsu FrameworkDIR64=C:\Windows\Microsoft.NET\Framework64 PROMPT=$P$G INSPECTOR_XE_2013_DIR=C:\Program Files (x86)\Intel\Inspector XE 2013\ PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC CommonProgramW6432=C:\Program Files\Common Files WINDIR=C:\Windows Framework35Version=v3.5 temp=C:\Users\dsu\AppData\Local\Temp NUMBER_OF_PROCESSORS=12 PROCESSOR_ARCHITEW6432=AMD64 WIN_TITLE=Intel Composer XE 2013 Intel(R) 64 Visual Studio 2010 TARGET_VS_ARCH=amd64 OLDPWD=/home/dsu USERPROFILE=C:\Users\dsu LIBPATH=C:\Windows\Microsoft.NET\Framework64\v4.0.30319;C:\Windows\Microsoft.NET\Framework64\v3.5;C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\LIB\amd64;C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\ATLMFC\LIB\amd64; PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 45 Stepping 7, GenuineIntel PROGRAMFILES=C:\Program Files (x86) PROCESSOR_REVISION=2d07 PATH=/usr/local/bin:/usr/bin:/cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64:/cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64:/cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319:/cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio 
10.0/Common7/IDE:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools:/cygdrive/c/Program Files (x86)/HTML Help Workshop:/cygdrive/c/Program Files (x86)/HTML Help Workshop:/cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64:/cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64:/cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin:/cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl:/cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler:/cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32:/cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32:/cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32:/cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt:/cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt:/cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler:/cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler:/cygdrive/c/Windows/system32:/cygdrive/c/Windows:/cygdrive/c/Windows/System32/Wbem:/cygdrive/c/Windows/System32/WindowsPowerShell/v1.0:/cygdrive/c/Program Files/TEC100/BIN:/cygdrive/c/Program Files (x86)/MPICH2/bin:/cygdrive/c/Program Files/MPICH2/bin:/cygdrive/c/Program Files (x86)/VisualSVN/bin:/cygdrive/c/Program Files (x86)/CMake 2.8/bin:/cygdrive/c/Program Files/doxygen/bin:/cygdrive/c/Program Files (x86)/Graphviz 2.28/bin:/cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin:/cygdrive/c/MinGW/bin:/cygdrive/c/Program Files/TortoiseSVN/bin:/cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin:/cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE:/cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt TERM=cygwin TZ=America/Vancouver VSINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 10.0\ VTUNE_AMPLIFIER_XE_2013_DIR=C:\Program Files (x86)\Intel\VTune Amplifier XE 2013\ !::=::\ LOCALAPPDATA=C:\Users\dsu\AppData\Local TEC100HOME=C:\Program Files\TEC100 FP_NO_HOST_CHECK=NO BIN_ROOT=C:\Program Files (x86)\Intel\Composer XE 2013\bin\ PWD=/cygdrive/c/cygwin/packages/petsc-3.4.2 VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\ ================================================================================ TEST resetEnvCompilers from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1511) TESTING: resetEnvCompilers from config.setCompilers(config/BuildSystem/config/setCompilers.py:1511) ================================================================================ TEST checkMPICompilerOverride from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1476) TESTING: checkMPICompilerOverride from config.setCompilers(config/BuildSystem/config/setCompilers.py:1476) Check if --with-mpi-dir is used along with CC CXX or FC compiler options. 
This usually prevents mpi compilers from being used - so issue a warning ================================================================================ TEST requireMpiLdPath from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1495) TESTING: requireMpiLdPath from config.setCompilers(config/BuildSystem/config/setCompilers.py:1495) OpenMPI wrappers require LD_LIBRARY_PATH set ================================================================================ TEST checkVendor from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:399) TESTING: checkVendor from config.setCompilers(config/BuildSystem/config/setCompilers.py:399) Determine the compiler vendor Compiler vendor is "" ================================================================================ TEST checkInitialFlags from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:409) TESTING: checkInitialFlags from config.setCompilers(config/BuildSystem/config/setCompilers.py:409) Initialize the compiler and linker flags Pushing language C Initialized CFLAGS to Initialized CFLAGS to Initialized LDFLAGS to Popping language C Pushing language CUDA Initialized CUDAFLAGS to Initialized CUDAFLAGS to Initialized LDFLAGS to Popping language CUDA Pushing language Cxx Initialized CXXFLAGS to Initialized CXX_CXXFLAGS to Initialized LDFLAGS to Popping language Cxx Pushing language FC Initialized FFLAGS to Initialized FFLAGS to Initialized LDFLAGS to Popping language FC Initialized CPPFLAGS to Initialized CUDAPPFLAGS to Initialized CXXCPPFLAGS to Initialized CC_LINKER_FLAGS to [] Initialized CXX_LINKER_FLAGS to [] Initialized FC_LINKER_FLAGS to [] Initialized CUDAC_LINKER_FLAGS to [] Initialized sharedLibraryFlags to [] Initialized dynamicLibraryFlags to [] ================================================================================ TEST checkCCompiler from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:525) TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:525) Locate a functional C compiler Checking for program /usr/local/bin/win32fe...not found Checking for program /usr/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/win32fe...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/win32fe...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/win32fe...not found Checking for program 
/cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/win32fe...not found Checking for program /cygdrive/c/Windows/system32/win32fe...not found Checking for program /cygdrive/c/Windows/win32fe...not found Checking for program /cygdrive/c/Windows/System32/Wbem/win32fe...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/win32fe...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/win32fe...not found Checking for program /cygdrive/c/MinGW/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/win32fe...not found Checking for program /home/dsu/win32fe...not found Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "CC" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" Pushing language C All intermediate test results are stored in /tmp/petsc-1nzsmm/config.setCompilers sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" 
#include "conffix.h" int main() { ; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: Popping language C ================================================================================ TEST checkCPreprocessor from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:560) TESTING: checkCPreprocessor from config.setCompilers(config/BuildSystem/config/setCompilers.py:560) Locate a functional C preprocessor Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "CPP" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E" Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.SET\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 19 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 21 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.SET\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.SET\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\stdlib.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe 
Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Popping language C ================================================================================ TEST checkCUDACompiler from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:594) TESTING: checkCUDACompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:594) Locate a functional CUDA compiler Checking for program /usr/local/bin/nvcc...not found Checking for program /usr/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/nvcc...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/nvcc...not found 
Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/nvcc...not found Checking for program /cygdrive/c/Windows/system32/nvcc...not found Checking for program /cygdrive/c/Windows/nvcc...not found Checking for program /cygdrive/c/Windows/System32/Wbem/nvcc...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/nvcc...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/nvcc...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/nvcc...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/nvcc...not found Checking for program /cygdrive/c/MinGW/bin/nvcc...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/nvcc...not found Checking for program /home/dsu/nvcc...not found Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/nvcc...not found Checking for program 
/usr/local/bin/nvcc...not found Checking for program /usr/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/nvcc...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/nvcc...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/nvcc...not found Checking for program /cygdrive/c/Windows/system32/nvcc...not found Checking for program /cygdrive/c/Windows/nvcc...not found Checking for program /cygdrive/c/Windows/System32/Wbem/nvcc...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/nvcc...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/nvcc...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/nvcc...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/nvcc...not found Checking for program /cygdrive/c/MinGW/bin/nvcc...not found Checking for program /cygdrive/c/Program 
Files/TortoiseSVN/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/nvcc...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/nvcc...not found Checking for program /home/dsu/nvcc...not found Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/nvcc...not found ================================================================================ TEST checkCUDAPreprocessor from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:632) TESTING: checkCUDAPreprocessor from config.setCompilers(config/BuildSystem/config/setCompilers.py:632) Locate a functional CUDA preprocessor ================================================================================ TEST checkCxxCompiler from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:735) TESTING: checkCxxCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:735) Locate a functional Cxx compiler Checking for program /usr/local/bin/win32fe...not found Checking for program /usr/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/win32fe...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/win32fe...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/win32fe...not found Checking for program 
/cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/win32fe...not found Checking for program /cygdrive/c/Windows/system32/win32fe...not found Checking for program /cygdrive/c/Windows/win32fe...not found Checking for program /cygdrive/c/Windows/System32/Wbem/win32fe...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/win32fe...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/win32fe...not found Checking for program /cygdrive/c/MinGW/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/win32fe...not found Checking for program /home/dsu/win32fe...not found Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "CXX" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe 
/tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: Popping language Cxx ================================================================================ TEST checkCxxPreprocessor from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:773) TESTING: checkCxxPreprocessor from config.setCompilers(config/BuildSystem/config/setCompilers.py:773) Locate a functional Cxx preprocessor Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "CXXCPP" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E" Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.SET\\conftest.cc" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 21 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.SET\\conftest.cc" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\conffix.h" extern "C" { } #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.setcompilers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.SET\\conftest.cc" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\cstdlib" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\yvals.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files 
(x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" namespace vc_attributes { #line 55 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum YesNoMaybe { No = 0x0fff0001, Maybe = 0x0fff0010, Yes = 0x0fff0100 }; typedef enum YesNoMaybe YesNoMaybe; enum AccessType { NoAccess = 0, Read = 1, Write = 2, ReadWrite = 3 }; typedef enum AccessType AccessType; [repeatable] [source_annotation_attribute( Parameter )] struct PreAttribute { PreAttribute(); #line 85 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" unsigned int Deref; YesNoMaybe Valid; YesNoMaybe Null; YesNoMaybe Tainted; AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; YesNoMaybe NullTerminated; const wchar_t* Condition; }; [repeatable] [source_annotation_attribute( Parameter|ReturnValue )] struct PostAttribute { PostAttribute(); #line 116 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" unsigned int Deref; YesNoMaybe Valid; YesNoMaybe Null; YesNoMaybe Tainted; AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; YesNoMaybe NullTerminated; YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( Parameter )] struct FormatStringAttribute { FormatStringAttribute(); #line 147 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" const wchar_t* Style; const wchar_t* 
UnformattedAlternative; }; [repeatable] [source_annotation_attribute( ReturnValue )] struct InvalidCheckAttribute { InvalidCheckAttribute(); #line 159 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" long Value; }; [source_annotation_attribute( Method )] struct SuccessAttribute { SuccessAttribute(); #line 169 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" const wchar_t* Condition; }; [repeatable] [source_annotation_attribute( Parameter )] struct PreBoundAttribute { PreBoundAttribute(); #line 180 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" unsigned int Deref; }; [repeatable] [source_annotation_attribute( Parameter|ReturnValue )] struct PostBoundAttribute { PostBoundAttribute(); #line 190 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" unsigned int Deref; }; [repeatable] [source_annotation_attribute( Parameter )] struct PreRangeAttribute { PreRangeAttribute(); #line 200 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" unsigned int Deref; const char* MinVal; const char* MaxVal; }; [repeatable] [source_annotation_attribute( Parameter|ReturnValue )] struct PostRangeAttribute { PostRangeAttribute(); #line 212 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" }; #line 222 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef ::vc_attributes::YesNoMaybe SA_YesNoMaybe; const ::vc_attributes::YesNoMaybe SA_Yes = ::vc_attributes::Yes; const ::vc_attributes::YesNoMaybe SA_No = ::vc_attributes::No; const ::vc_attributes::YesNoMaybe SA_Maybe = ::vc_attributes::Maybe; typedef ::vc_attributes::AccessType SA_AccessType; const ::vc_attributes::AccessType SA_NoAccess = ::vc_attributes::NoAccess; const ::vc_attributes::AccessType SA_Read = ::vc_attributes::Read; const ::vc_attributes::AccessType SA_Write = ::vc_attributes::Write; const ::vc_attributes::AccessType SA_ReadWrite = ::vc_attributes::ReadWrite; typedef ::vc_attributes::PreAttribute SA_Pre; typedef ::vc_attributes::PostAttribute SA_Post; typedef ::vc_attributes::FormatStringAttribute SA_FormatString; typedef ::vc_attributes::InvalidCheckAttribute SA_InvalidCheck; typedef ::vc_attributes::SuccessAttribute SA_Success; typedef ::vc_attributes::PreBoundAttribute SA_PreBound; typedef ::vc_attributes::PostBoundAttribute SA_PostBound; typedef ::vc_attributes::PreRangeAttribute SA_PreRange; typedef ::vc_attributes::PostRangeAttribute SA_PostRange; #line 266 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 
162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Popping language Cxx ================================================================================ TEST checkFortranCompiler from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:882) TESTING: checkFortranCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:882) Locate a functional Fortran compiler Checking for program /usr/local/bin/win32fe...not found Checking for program /usr/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/win32fe...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/win32fe...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/win32fe...not found Checking for program /cygdrive/c/Windows/system32/win32fe...not found Checking for program /cygdrive/c/Windows/win32fe...not found Checking for program /cygdrive/c/Windows/System32/Wbem/win32fe...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/win32fe...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/win32fe...not found Checking for program /cygdrive/c/Program 
Files/MPICH2/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/win32fe...not found Checking for program /cygdrive/c/MinGW/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/win32fe...not found Checking for program /home/dsu/win32fe...not found Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "FC" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort" Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o 
================================================================================
TEST checkFortranComments from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:901)
TESTING: checkFortranComments from config.setCompilers(config/BuildSystem/config/setCompilers.py:901)
  Make sure fortran comment "!" works
Pushing language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
sh:
Successful compile:
Source:
      ! comment
      program main
      end
Fortran comments can use ! in column 1
Popping language FC
================================================================================
TEST checkPIC from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:964)
TESTING: checkPIC from config.setCompilers(config/BuildSystem/config/setCompilers.py:964)
  Determine the PIC option for each compiler
  - There needs to be a test that checks that the functionality is actually working
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help
sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011
Usage: win32fe <tool> --<win32fe options> -<tool options> <files>
  <tool> must be the first argument to win32fe
  <tool>: {cl,icl,df,f90,ifl,bcc32,lib,tlib}
    cl: Microsoft 32-bit C/C++ Optimizing Compiler
    icl: Intel C/C++ Optimizing Compiler
    df: Compaq Visual Fortran Optimizing Compiler
    f90: Compaq Visual Fortran90 Optimizing Compiler
    ifl: Intel Fortran Optimizing Compiler
    ifort: Intel Fortran Optimizing Compiler
    nvcc: NVIDIA CUDA Compiler Driver
    bcc32: Borland C++ for Win32
    lib: Microsoft Library Manager
    tlib: Borland Library Manager
  <win32fe options>:
    --help: Output this help message and help for <tool>
    --autodetect: Attempt automatic detection of <tool> installation
    --path <path>: specifies an addition to the PATH that is required (ex. the location of a required .dll)
    --use <tool>: specifies the variant of <tool> to use
    --verbose: Echo to stdout the translated commandline and other diagnostic information
    --version: Output version info for win32fe and <tool>
    --wait_for_debugger: Inserts an infinite wait after creation of <tool> and outputs PID so one can manually attach a debugger to the current process. In the debugger, one must set: tool::waitfordebugger = 0 to continue the execution normally.
    --win-l: For compilers, define -lfoo to link foo.lib instead of libfoo.lib
    --woff: Suppress win32fe specific warning messages
=================================================================================
For compilers: win32fe will map the following <tool options> to their native options:
  -c: Compile Only, generates an object file with .o extension
      This will invoke the compiler once for each file listed.
  -l<library>: Link the file lib<library>.lib or if using --win-l also, <library>.lib
  -o <file>: Output=<file> context dependent
  -D<macro>: Define <macro>
  -I<path>: Add <path> to the include path
  -L<path>: Add <path> to the link path
  -g: Generate debug symbols in objects when specified for compilation, and in executables when specified for linking (some compilers require specification at both times for full debugging support).
  -O: Enable compiletime and/or linktime optimizations.
Ex: win32fe cl -g -c foo.c --verbose -Iinclude
Note: win32fe will automatically find the system library paths and system include paths, relieving the user of the need to invoke a particular shell.
=========================================================================
cl specific help:
  win32fe uses -nologo by default for nonverbose output. Use the flag: -logo to disable this feature.
  -g is identical to -Z7.
  -O is identical to -O2.
=========================================================================
Microsoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64
Copyright (C) Microsoft Corporation. All rights reserved.

C/C++ COMPILER OPTIONS

-OPTIMIZATION-
/O1 minimize space
/O2 maximize speed
/Ob<n> inline expansion (default n=0)
/Od disable optimizations (default)
/Og enable global optimization
/Oi[-] enable intrinsic functions
/Os favor code space
/Ot favor code speed
/Ox maximum optimizations
/favor:<blend|AMD64|INTEL64> select processor to optimize for, one of:
    blend - a combination of optimizations for several different x64 processors
    AMD64 - 64-bit AMD processors
    INTEL64 - Intel(R)64 architecture processors

-CODE GENERATION-
/GF enable read-only string pooling
/Gm[-] enable minimal rebuild
/Gy[-] separate functions for linker
/GS[-] enable security checks
/GR[-] enable C++ RTTI
/GX[-] enable C++ EH (same as /EHsc)
/EHs enable C++ EH (no SEH exceptions)
/EHa enable C++ EH (w/ SEH exceptions)
/EHc extern "C" defaults to nothrow
/fp: choose floating-point model:
    except[-] - consider floating-point exceptions when generating code
    fast - "fast" floating-point model; results are less predictable
    precise - "precise" floating-point model; results are predictable
    strict - "strict" floating-point model (implies /fp:except)
/Qfast_transcendentals generate inline FP intrinsics even with /fp:except
/GL[-] enable link-time code generation
/GA optimize for Windows Application
/Ge force stack checking for all funcs
/Gs[num] control stack checking calls
/Gh enable _penter function call
/GH enable _pexit function call
/GT generate fiber-safe TLS accesses
/RTC1 Enable fast checks (/RTCsu)
/RTCc Convert to smaller type checks
/RTCs Stack Frame runtime checking
/RTCu Uninitialized local usage checks
/clr[:option] compile for common language runtime, where option is:
    pure - produce IL-only output file (no native executable code)
    safe - produce IL-only verifiable output file
    oldSyntax - accept the Managed Extensions syntax from Visual C++ 2002/2003
    initialAppDomain - enable initial AppDomain behavior of Visual C++ 2002
    noAssembly - do not produce an assembly
/homeparams Force parameters passed in registers to be written to the stack
/GZ Enable stack checks (/RTCs)
/arch:AVX enable use of Intel(R) Advanced Vector Extensions instructions

-OUTPUT FILES-
/Fa[file] name assembly listing file
/FA[scu] configure assembly listing
/Fd[file] name .PDB file
/Fe<file> name executable file
/Fm[file] name map file
/Fo<file> name object file
/Fp<file> name precompiled header file
/Fr[file] name source browser file
/FR[file] name extended .SBR file
/Fi[file] name preprocessed file
/doc[file] process XML documentation comments and optionally name the .xdc file
-PREPROCESSOR-
/AI<dir> add to assembly search path
/FU<file> forced using assembly/module
/C don't strip comments
/D<name>{=|#}<text> define macro
/E preprocess to stdout
/EP preprocess to stdout, no #line
/P preprocess to file
/Fx merge injected code to file
/FI<file> name forced include file
/U<name> remove predefined macro
/u remove all predefined macros
/I<dir> add to include search path
/X ignore "standard places"

-LANGUAGE-
/Zi enable debugging information
/Z7 enable old-style debug info
/Zp[n] pack structs on n-byte boundary
/Za disable extensions
/Ze enable extensions (default)
/Zl omit default library name in .OBJ
/Zg generate function prototypes
/Zs syntax check only
/vd{0|1|2} disable/enable vtordisp
/vm<x> type of pointers to members
/Zc:arg1[,arg2] C++ language conformance, where arguments can be:
    forScope[-] - enforce Standard C++ for scoping rules
    wchar_t[-] - wchar_t is the native type, not a typedef
    auto[-] - enforce the new Standard C++ meaning for auto
    trigraphs[-] - enable trigraphs (off by default)
/openmp enable OpenMP 2.0 language extensions

-MISCELLANEOUS-
@<file> options response file
/?, /help print this help message
/bigobj generate extended object format
/c compile only, no link
/errorReport:option Report internal compiler errors to Microsoft
    none - do not send report
    prompt - prompt to immediately send report
    queue - at next admin logon, prompt to send report (default)
    send - send report automatically
/FC use full pathnames in diagnostics
/H<num> max external name length
/J default char type is unsigned
/MP[n] use up to 'n' processes for compilation
/nologo suppress copyright message
/showIncludes show include file names
/Tc<source file> compile file as .c
/Tp<source file> compile file as .cpp
/TC compile all files as .c
/TP compile all files as .cpp
/V<string> set version string
/w disable all warnings
/wd<n> disable warning n
/we<n> treat warning n as an error
/wo<n> issue warning n once
/w<l><n> set warning level 1-4 for n
/W<n> set warning level (default n=1)
/Wall enable all warnings
/WL enable one line diagnostics
/WX treat warnings as errors
/Yc[file] create .PCH file
/Yd put debug info in every .OBJ
/Yl[sym] inject .PCH ref for debug lib
/Yu[file] use .PCH file
/Y- disable all PCH options
/Zm<n> max memory alloc (% of default)
/Wp64 enable 64 bit porting warnings

-LINKING-
/LD Create .DLL
/LDd Create .DLL debug library
/LN Create a .netmodule
/F<num> set stack size
/link [linker options and libraries]
/MD link with MSVCRT.LIB
/MT link with LIBCMT.LIB
/MDd link with MSVCRTD.LIB debug lib
/MTd link with LIBCMTD.LIB debug lib
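The mapping section of this help is the key to reading the rest of the log: win32fe accepts Unix-style flags and rewrites them into cl's native ones (-g becomes -Z7, -O becomes -O2, and -nologo is added by default). A sketch of the equivalence, reusing the help text's own foo.c example:

  # Unix-style invocation through the wrapper...
  win32fe cl -g -c foo.c -Iinclude
  # ...is translated to roughly this native cl invocation
  cl -nologo -Z7 -c foo.c -Iinclude
  # add --verbose to see the exact translated command line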
Trying C compiler flag -PIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -PIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -PIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: cl : Command line warning D9002 : ignoring unknown option '-PIC'
Rejecting C linker flag -PIC due to cl : Command line warning D9002 : ignoring unknown option '-PIC'
Rejected C compiler flag -PIC because linker cannot handle it
Trying C compiler flag -fPIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -fPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -fPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: cl : Command line warning D9002 : ignoring unknown option '-fPIC'
Rejecting C linker flag -fPIC due to cl : Command line warning D9002 : ignoring unknown option '-fPIC'
Rejected C compiler flag -fPIC because linker cannot handle it
Trying C compiler flag -KPIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -KPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -KPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: cl : Command line warning D9002 : ignoring unknown option '-KPIC'
Rejecting C linker flag -KPIC due to cl : Command line warning D9002 : ignoring unknown option '-KPIC'
Rejected C compiler flag -KPIC because linker cannot handle it
Trying C compiler flag -qpic
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -qpic /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -qpic /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: cl : Command line warning D9002 : ignoring unknown option '-qpic'
Rejecting C linker flag -qpic due to cl : Command line warning D9002 : ignoring unknown option '-qpic'
Rejected C compiler flag -qpic because linker cannot handle it
Popping language C
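These rejections are expected on Windows rather than a sign of a broken toolchain: MSVC has no position-independent-code flag (Windows DLLs are relocated by the loader instead), so cl merely warns about the unknown option, and configure, seeing warning D9002 on the link step, discards the flag and continues. One probe can be reproduced by hand; a sketch, assuming cl is reachable through win32fe as above:

  # the same empty program configure compiles
  echo 'int main() { return 0; }' > conftest.c
  win32fe cl -c -o conftest.o conftest.c
  # linking with -PIC triggers "warning D9002 : ignoring unknown option",
  # which is exactly the text configure keys its rejection on
  win32fe cl -o conftest.exe -PIC conftest.o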
Pushing language Cxx
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help
sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011
    [win32fe and cl help output identical to the cl --help output shown above]
Trying Cxx compiler flag -PIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
sh: conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
;
  return 0;
}
Pushing language CXX
Popping language CXX
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -PIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -PIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: cl : Command line warning D9002 : ignoring unknown option '-PIC'
Rejecting Cxx linker flag -PIC due to cl : Command line warning D9002 : ignoring unknown option '-PIC'
Rejected Cxx compiler flag -PIC because linker cannot handle it
Trying Cxx compiler flag -fPIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
sh: conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
;
  return 0;
}
Pushing language CXX
Popping language CXX
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -fPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -fPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: cl : Command line warning D9002 : ignoring unknown option '-fPIC'
Rejecting Cxx linker flag -fPIC due to cl : Command line warning D9002 : ignoring unknown option '-fPIC'
Rejected Cxx compiler flag -fPIC because linker cannot handle it
Trying Cxx compiler flag -KPIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
sh: conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
;
  return 0;
}
Pushing language CXX
Popping language CXX
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -KPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -KPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: cl : Command line warning D9002 : ignoring unknown option '-KPIC'
Rejecting Cxx linker flag -KPIC due to cl : Command line warning D9002 : ignoring unknown option '-KPIC'
Rejected Cxx compiler flag -KPIC because linker cannot handle it
Trying Cxx compiler flag -qpic
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
sh: conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
;
  return 0;
}
Pushing language CXX
Popping language CXX
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -qpic /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -qpic /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: cl : Command line warning D9002 : ignoring unknown option '-qpic'
Rejecting Cxx linker flag -qpic due to cl : Command line warning D9002 : ignoring unknown option '-qpic'
Rejected Cxx compiler flag -qpic because linker cannot handle it
Popping language Cxx
Pushing language FC
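Having finished with the C and C++ compilers, configure now starts the same interrogation of the Fortran compiler. The --version option listed in the win32fe help above gives a quick manual check that the wrapped ifort resolves correctly; a sketch under the same path assumptions as before:

  # report version info for both win32fe and the wrapped tool
  /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --version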
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --help
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --help
sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011
    [general win32fe help identical to the output shown above]
=========================================================================
icl specific help:
  win32fe uses -nologo by default for nonverbose output. Use the flag: -logo to disable this feature.
  -g is identical to -Z7.
  -O is identical to -O2.
=========================================================================
Intel(R) Visual Fortran Intel(R) 64 Compiler XE for applications running on Intel(R) 64, Version 13.1.3.198 Build 20130607
Copyright (C) 1985-2013 Intel Corporation. All rights reserved.

Intel(R) Fortran Compiler Help
==============================
Intel(R) Compiler includes compiler options that optimize for instruction sets that are available in both Intel(R) and non-Intel microprocessors, but may perform additional optimizations for Intel microprocessors than for non-Intel microprocessors. In addition, certain compiler options for Intel(R) Compiler are reserved for Intel microprocessors. For a detailed description of these compiler options, including the instructions they implicate, please refer to "Intel(R) Compiler User and Reference Guides > Compiler Options."

usage: ifort [options] file1 [file2 ...] [/link linker_options]
  where options represents zero or more compiler options
  fileN is a Fortran source (.f .for .ftn .f90 .fpp .i .i90), assembly (.asm), object (.obj), static library (.lib), or other linkable file
  linker_options represents zero or more linker options

Notes
-----
1. Many FL32 options are supported; a warning is printed for unsupported options.
2. Intel Fortran compiler options may be placed in your ifort.cfg file.
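The usage line above means compile options come first and everything after /link is passed through to the Microsoft linker. A sketch of a typical invocation built only from options that appear in the listing that follows (the file names are illustrative):

  # optimize, emit debug info, and hand a library to the linker
  ifort /O2 /Zi mysolver.f90 /link mylib.lib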
Some options listed are only available on a specific system i32 indicates the feature is available on systems based on IA-32 architecture i64em indicates the feature is available on systems using Intel(R) 64 architecture Compiler Option List -------------------- Optimization ------------ /O1 optimize for maximum speed, but disable some optimizations which increase code size for a small speed benefit /O2 optimize for maximum speed (DEFAULT) /O3 optimize for maximum speed and enable more aggressive optimizations that may not improve performance on some programs /Ox enable maximum optimizations (same as /O2) /Os enable speed optimizations, but disable some optimizations which increase code size for small speed benefit (overrides /Ot) /Ot enable speed optimizations (overrides /Os) /Od disable optimizations /Oy[-] enable/disable using EBP as a general purpose register (no frame pointer) (i32 only) /fast enable /QxHOST /O3 /Qipo /Qprec-div- options set by /fast cannot be overridden with the exception of /QxHOST, list options separately to change behavior /Oa[-] assume no aliasing in program /Ow[-] assume no aliasing within functions, but assume aliasing across calls Code Generation --------------- /Qx generate specialized code to run exclusively on processors indicated by as described below SSE2 May generate Intel(R) SSE2 and SSE instructions for Intel processors. Optimizes for the Intel NetBurst(R) microarchitecture. SSE3 May generate Intel(R) SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the enhanced Pentium(R) M processor microarchitecture and Intel NetBurst(R) microarchitecture. SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the Intel(R) Core(TM) microarchitecture. SSE4.1 May generate Intel(R) SSE4 Vectorizing Compiler and Media Accelerator instructions for Intel processors. May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions and it may optimize for Intel(R) 45nm Hi-k next generation Intel Core(TM) microarchitecture. SSE4.2 May generate Intel(R) SSE4 Efficient Accelerated String and Text Processing instructions supported by Intel(R) Core(TM) i7 processors. May generate Intel(R) SSE4 Vectorizing Compiler and Media Accelerator, Intel(R) SSSE3, SSE3, SSE2, and SSE instructions and it may optimize for the Intel(R) Core(TM) processor family. AVX May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX2 May generate Intel(R) Advanced Vector Extensions 2 (Intel(R) AVX2), Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. Optimizes for a future Intel processor. CORE-AVX-I May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), including instructions in Intel(R) Core 2(TM) processors in process technology smaller than 32nm, Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. Optimizes for a future Intel processor. SSSE3_ATOM May generate MOVBE instructions for Intel processors, depending on the setting of option /Qinstruction. May also generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the Intel(R) Atom(TM) processor and Intel(R) Centrino(R) Atom(TM) Processor Technology. /QxHost generate instructions for the highest instruction set and processor available on the compilation host machine /Qax[,,...] 
generate code specialized for processors specified by while also generating generic IA-32 instructions. includes one or more of the following: SSE2 May generate Intel(R) SSE2 and SSE instructions for Intel processors. SSE3 May generate Intel(R) SSE3, SSE2, and SSE instructions for Intel processors. SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. SSE4.1 May generate Intel(R) SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. SSE4.2 May generate Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. AVX May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX2 May generate Intel(R) Advanced Vector Extensions 2 (Intel(R) AVX2), Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX-I May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), including instructions in Intel(R) Core 2(TM) processors in process technology smaller than 32nm, Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. /arch: generate specialized code to optimize for processors indicated by as described below SSE2 May generate Intel(R) SSE2 and SSE instructions SSE3 May generate Intel(R) SSE3, SSE2 and SSE instructions SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2 and SSE instructions SSE4.1 May generate Intel(R) SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions SSE4.2 May generate Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions AVX May generate Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions /Qinstruction: Refine instruction set output for the selected target processor [no]movbe - Do/do not generate MOVBE instructions with SSSE3_ATOM (requires /QxSSSE3_ATOM) /Qextend-arguments:[32|64] By default, unprototyped scalar integer arguments are passed in 32-bits (sign-extended if necessary). On Intel(R) 64, unprototyped scalar integer arguments may be extended to 64-bits. Interprocedural Optimization (IPO) ---------------------------------- /Qip[-] enable(DEFAULT)/disable single-file IP optimization within files /Qipo[n] enable multi-file IP optimization between files /Qipo-c generate a multi-file object file (ipo_out.obj) /Qipo-S generate a multi-file assembly file (ipo_out.asm) /Qip-no-inlining disable full and partial inlining /Qip-no-pinlining disable partial inlining /Qipo-separate create one object file for every source file (overrides /Qipo[n]) /Qipo-jobs specify the number of jobs to be executed simultaneously during the IPO link phase Advanced Optimizations ---------------------- /Qunroll[n] set maximum number of times to unroll loops. Omit n to use default heuristics. Use n=0 to disable the loop unroller /Qunroll-aggressive[-] enables more aggressive unrolling heuristics /Qscalar-rep[-] enable(DEFAULT)/disable scalar replacement (requires /O3) /Qpad[-] enable/disable(DEFAULT) changing variable and array memory layout /Qsafe-cray-ptr Cray pointers do not alias with other variables /Qansi-alias[-] enable/disable(DEFAULT) use of ANSI aliasing rules optimizations; user asserts that the program adheres to these rules /Qcomplex-limited-range[-] enable/disable(DEFAULT) the use of the basic algebraic expansions of some complex arithmetic operations. This can allow for some performance improvement in programs which use a lot of complex arithmetic at the loss of some exponent range. 
/reentrancy: specify whether the threaded, reentrant run-time support should be used Keywords: none (same as /noreentrancy), threaded, async /noreentrancy do not use threaded, reentrant run-time support /heap-arrays[:n] temporary arrays of minimum size n (in kilobytes) are allocated in heap memory rather than on the stack. If n is not specified, all temporary arrays are allocated in heap memory. /heap-arrays- temporary arrays are allocated on the stack (DEFAULT) /Qopt-multi-version-aggressive[-] enables more aggressive multi-versioning to check for pointer aliasing and scalar replacement /Qopt-ra-region-strategy[:] select the method that the register allocator uses to partition each routine into regions routine - one region per routine block - one region per block trace - one region per trace loop - one region per loop default - compiler selects best option /Qvec[-] enables(DEFAULT)/disables vectorization /Qvec-guard-write[-] enables cache/bandwidth optimization for stores under conditionals within vector loops /Qvec-threshold[n] sets a threshold for the vectorization of loops based on the probability of profitable execution of the vectorized loop in parallel /Qopt-malloc-options:{0|1|2|3|4} specify malloc configuration parameters. Specifying a non-zero value will cause alternate configuration parameters to be set for how malloc allocates and frees memory /Qopt-jump-tables: control the generation of jump tables default - let the compiler decide when a jump table, a series of if-then-else constructs or a combination is generated large - generate jump tables up to a certain pre-defined size (64K entries) - generate jump tables up to in size use /Qopt-jump-tables- to lower switch statements as chains of if-then-else constructs /Qopt-block-factor: specify blocking factor for loop blocking /Qopt-streaming-stores: specifies whether streaming stores are generated always - enables generation of streaming stores under the assumption that the application is memory bound auto - compiler decides when streaming stores are used (DEFAULT) never - disables generation of streaming stores /Qmkl[:] link to the Intel(R) Math Kernel Library (Intel(R) MKL) and bring in the associated headers parallel - link using the threaded Intel(R) MKL libraries. This is the default when /Qmkl is specified sequential - link using the non-threaded Intel(R) MKL libraries cluster - link using the Intel(R) MKL Cluster libraries plus the sequential Intel(R) MKL libraries /Qimsl link to the International Mathematics and Statistics Library* (IMSL* library) /Qopt-subscript-in-range[-] assumes no overflows in the intermediate computation of the subscripts /Qcoarray[:shared|distributed] enable/disable(DEFAULT) coarray syntax for data parallel programming. 
The default is shared-memory; distributed memory is only valid with the Intel(R) Cluster Toolkit /Qcoarray-num-images:n set default number of coarray images /Qopt-matmul[-] replace matrix multiplication with calls to intrinsics and threading libraries for improved performance (DEFAULT at /O3 /Qparallel) /Qsimd[-] enables(DEFAULT)/disables vectorization using SIMD directive /Qguide-opts: tells the compiler to analyze certain code and generate recommendations that may improve optimizations /Qguide-file[:] causes the results of guide to be output to a file /Qguide-file-append[:] causes the results of guide to be appended to a file /Qguide[:] lets you set a level (1 - 4) of guidance for auto-vectorization, auto-parallelization, and data transformation (DEFAULT is 4 when the option is specified) /Qguide-data-trans[:] lets you set a level (1 - 4) of guidance for data transformation (DEFAULT is 4 when the option is specified) /Qguide-par[:] lets you set a level (1 - 4) of guidance for auto-parallelization (DEFAULT is 4 when the option is specified) /Qguide-vec[:] lets you set a level (1 - 4) of guidance for auto-vectorization (DEFAULT is 4 when the option is specified) /Qguide-profile:<[file|dir]>[,[file|dir],...] specify a loop profiler data file (or set of files in a directory) when using the /Qguide option /Qopt-mem-layout-trans[:] controls the level of memory layout transformations performed by the compiler 0 - disable memory layout transformations (same as /Qopt-mem-layout-trans-) 1 - enable basic memory layout transformations 2 - enable more memory layout transformations (DEFAULT when the option is specified) 3 - enable aggressive memory layout transformations /Qopt-prefetch[:n] enable levels of prefetch insertion, where 0 disables. n may be 0 through 4 inclusive. Default is 2. /Qopt-prefetch- disable(DEFAULT) prefetch insertion. Equivalent to /Qopt-prefetch:0 Profile Guided Optimization (PGO) --------------------------------- /Qprof-dir specify directory for profiling output files (*.dyn and *.dpi) /Qprof-src-root specify project root directory for application source files to enable relative path resolution during profile feedback on sources below that directory /Qprof-src-root-cwd specify the current directory as the project root directory for application source files to enable relative path resolution during profile feedback on sources below that directory /Qprof-src-dir[-] specify whether directory names of sources should be considered when looking up profile records within the .dpi file /Qprof-file specify file name for profiling summary file /Qprof-data-order[-] enable/disable(DEFAULT) static data ordering with profiling /Qprof-func-order[-] enable/disable(DEFAULT) function ordering with profiling /Qprof-gen[:keyword] instrument program for profiling. 
Optional keyword may be srcpos or globdata /Qprof-gen- disable profiling instrumentation /Qprof-use[:] enable use of profiling information during optimization weighted - invokes profmerge with -weighted option to scale data based on run durations [no]merge - enable(default)/disable the invocation of the profmerge tool /Qprof-use- disable use of profiling information during optimization /Qcov-gen instrument program for profiling /Qcov-dir specify directory for profiling output files (*.dyn and *.dpi) /Qcov-file specify file name for profiling summary file /Qinstrument-functions[-] determine whether function entry and exit points are instrumented /Qprof-hotness-threshold: set the hotness threshold for function grouping and function ordering val indicates percentage of functions to be placed in hot region. This option requires /Qprof-use and /Qprof-func-order /Qprof-value-profiling:[,,...] limit value profiling none - inhibit all types of value profiling nodivide - inhibit value profiling of non-compile time constants used in division or remainder operations noindcall - inhibit value profiling of function addresses at indirect call sites /Qprofile-functions enable instrumentation in generated code for collecting function execution time profiles /Qprofile-loops: enable instrumentation in generated code for collecting loop execution time profiles inner - instrument inner loops outer - instrument outer loops all - instrument all loops /Qprofile-loops-report: Control the level of instrumentation inserted for reporting loop execution profiles 1 - report loop times 2 - report loop times and iteration counts Optimization Reports -------------------- /Qvec-report[n] control amount of vectorizer diagnostic information n=0 no diagnostic information n=1 indicate vectorized loops (DEFAULT when enabled) n=2 indicate vectorized/non-vectorized loops n=3 indicate vectorized/non-vectorized loops and prohibiting data dependence information n=4 indicate non-vectorized loops n=5 indicate non-vectorized loops and prohibiting data dependence information n=6 indicate vectorized/non-vectorized loops with greater details and prohibiting data dependence information n=7 indicate vector code quality message ids and data values for vectorized loops /Qopt-report[:n] generate an optimization report to stderr 0 disable optimization report output 1 minimum report output 2 medium output (DEFAULT when enabled) 3 maximum report output /Qopt-report-file: specify the filename for the generated report /Qopt-report-phase: specify the phase that reports are generated against /Qopt-report-routine: reports on routines containing the given name /Qopt-report-help display the optimization phases available for reporting /Qtcheck[:mode] enable analysis of threaded applications (requires Intel(R) Thread Checker; cannot be used with compiler alone) tci - instruments a program to perform a thread-count-independent analysis tcd - instruments a program to perform a thread-count-dependent analysis (DEFAULT when mode is not used) api - instruments a program at the api-imports level /Qtcollect[:] inserts instrumentation probes calling the Intel(R) Trace Collector API. The library .lib is linked in the default being VT.lib (requires Intel(R) Trace Collector) /Qtcollect-filter:file Enable or disable the instrumentation of specified functions. 
(requires Intel(R) Trace Collector) OpenMP* and Parallel Processing ------------------------------ /Qopenmp enable the compiler to generate multi-threaded code based on the OpenMP* directives (same as /openmp) /Qopenmp-stubs enables the user to compile OpenMP programs in sequential mode. The OpenMP directives are ignored and a stub OpenMP library is linked (sequential) /Qopenmp-report{0|1|2} control the OpenMP parallelizer diagnostic level /Qopenmp-lib: choose which OpenMP library version to link with compat - use the Microsoft compatible OpenMP run-time libraries (DEFAULT) /Qopenmp-threadprivate: choose which threadprivate implementation to use compat - use the Microsoft compatible thread local storage legacy - use the Intel compatible implementation (DEFAULT) /Qparallel enable the auto-parallelizer to generate multi-threaded code for loops that can be safely executed in parallel /Qpar-report{0|1|2|3} control the auto-parallelizer diagnostic level /Qpar-threshold[n] set threshold for the auto-parallelization of loops where n is an integer from 0 to 100 /Qpar-runtime-control[n] Control parallelizer to generate runtime check code for effective automatic parallelization. n=0 no runtime check based auto-parallelization n=1 generate runtime check code under conservative mode (DEFAULT when enabled) n=2 generate runtime check code under heuristic mode n=3 generate runtime check code under aggressive mode /Qpar-schedule-static[:n] Specifies a scheduling algorithm for DO loop iteration. Divides iterations into contiguous pieces. Size n if specified, equal sized pieces if not. /Qpar-schedule-static_balanced[:n] Divides iterations into even-sized chunks. Size n if specified, equal sized pieces if not. /Qpar-schedule-static-steal[:n] Divides iterations into even-sized chunks, but allows threads to steal parts of chunks from neighboring threads /Qpar-schedule-dynamic[:n] Specifies a scheduling algorithm for DO loop iteration. Assigns iterations to threads in chunks dynamically. Chunk size is n iterations if specified, otherwise 1. /Qpar-schedule-guided[:n] Specifies a scheduling algorithm for DO loop iteration. Indicates a minimum number of iterations. If specified, n is the minimum number, otherwise 1. /Qpar-schedule-guided-analytical[:n] Divides iterations by using exponential distribution or dynamic distributions. /Qpar-schedule-runtime Specifies a scheduling algorithm for DO loop iteration. Defers the scheduling decision until runtime. /Qpar-schedule-auto Lets the compiler or run-time system determine the scheduling algorithm. 
/Qpar-adjust-stack perform fiber-based main thread stack adjustment /Qpar-affinity=[,...][,][,] tune application performance by setting different thread affinity /Qpar-num-threads= tune application performance by setting different number of threads /Qparallel-source-info[:n] enable(DEFAULT)/disable the emission of source location information for parallel code generation with OpenMP and auto-parallelization 0 - disable (same as /Qparallel-source-info-) 1 - emit routine name and line information (DEFAULT) 2 - emit path, file, routine name and line information /Qpar same as /Qparallel Floating Point -------------- /fp: enable floating point model variation except[-] - enable/disable floating point semantics fast[=1|2] - enables more aggressive floating point optimizations precise - allows value-safe optimizations source - enables intermediates in source precision strict - enables /fp:precise /fp:except, disables contractions and enables pragma stdc fenv_access /Qfp-speculation: enable floating point speculations with the following conditions: fast - speculate floating point operations (DEFAULT) safe - speculate only when safe strict - same as off off - disables speculation of floating-point operations /Qpc32 set internal FPU precision to 24 bit significand /Qprec improve floating-point precision (speed impact less than /Op) /Qprec-sqrt[-] determine if certain square root optimizations are enabled /Qprec-div[-] improve precision of FP divides (some speed impact) /Qfast-transcendentals[-] generate a faster version of the transcendental functions /Qfp-port[-] round fp results at assignments and casts (some speed impact) /Qfp-stack-check enable fp stack checking after every function/procedure call /Qrcd rounding mode to enable fast float-to-int conversions /rounding-mode:chopped set internal FPU rounding control to truncate /Qftz[-] enable/disable flush denormal results to zero /fpe:{0|1|3} specifies program-wide behavior on floating point exceptions /fpe-all:{0|1|3} specifies floating point exception behavior on all functions and subroutines. 
Also sets /assume:ieee_fpe_flags /[no]fltconsistency specify that improved floating-point consistency should be used /Qfma[-] enable/disable the combining of floating point multiplies and add/subtract operations /[no]recursive compile all procedures for possible recursive execution Inlining -------- /Ob control inline expansion: n=0 disable inlining (same as /inline:none) n=1 inline functions declared with ATTRIBUTES INLINE or FORCEINLINE n=2 inline any function, at the compiler's discretion /Qinline-min-size: set size limit for inlining small routines /Qinline-min-size- no size limit for inlining small routines /Qinline-max-size: set size limit for inlining large routines /Qinline-max-size- no size limit for inlining large routines /Qinline-max-total-size: maximum increase in size for inline function expansion /Qinline-max-total-size- no size limit for inline function expansion /Qinline-max-per-routine: maximum number of inline instances in any function /Qinline-max-per-routine- no maximum number of inline instances in any function /Qinline-max-per-compile: maximum number of inline instances in the current compilation /Qinline-max-per-compile- no maximum number of inline instances in the current compilation /Qinline-factor: set inlining upper limits by n percentage /Qinline-factor- do not set set inlining upper limits /Qinline-forceinline treat inline routines as forceinline /Qinline-dllimport allow(DEFAULT)/disallow functions declared DEC$ ATTRIBUTES DLLIMPORT to be inlined /Qinline-calloc directs the compiler to inline calloc() calls as malloc()/memset() /inline[:keyword] Specifies the level of inline function expansion keywords: all (same as /Ob2 /Ot), size (same as /Ob2 /Os) speed (same as /Ob2 /Ot), none or manual (same as /Ob0) Output, Debug, PCH ------------------ /c compile to object (.obj) only, do not link /nolink, /compile-only same as /c /S compile to assembly (.asm) only, do not link /FAs produce assembly file with optional source annotations /FAc produce assembly file with optional code annotations /FA produce assembly file /Fa[file] name assembly file (or directory for multiple files; i.e. /FaMYDIR\) /Fo[file] name object file (or directory for multiple files; i.e. /FoMYDIR\) /Fe[file] name executable file or directory /object: specify the name of the object file, or the directory to which object file(s) should be written. (e.g. 
                  /object:MYOBJ or /object:MYDIR\)
/exe:<file>       specifies the name to be used for the built program (.exe) or dynamic-link (.dll) library
/map:<file>       specify that a link map file should be generated
/list:<file>      specify that a listing file should be generated
/list-line-len:#  overrides the default line length (80) in a listing file
/list-page-len:#  overrides the default page length (66) in a listing file
/show:<keyword>   controls the contents of the listing file
                  keywords: all, none, [no]include, [no]map, [no]options
/Zi, /ZI, /Z7     produce symbolic debug information in object file (implies /Od when another optimization option is not explicitly set)
/debug[:keyword]  enable debug information and control output of enhanced debug information
                  keywords: all, full, minimal, none, [no]inline-debug-info
/nodebug          do not enable debug information
/debug-parameters[:keyword]  control output of debug information for PARAMETERS
                  keywords: all, used, none (same as /nodebug-parameters)
/nodebug-parameters  do not output debug information for PARAMETERS
/Qd-lines, /[no]d-lines  compile debug statements (indicated by D in column 1)
/pdbfile[:filename]  specify that debug related information should be generated to a program database file
/nopdbfile        do not generate debug related information to a program database file
/Qtrapuv          trap uninitialized variables
/RTCu             report use of variable that was not initialized
/Qmap-opts        enable option mapping tool

Preprocessor
------------
/D<name>[{=|#}<text>]  define macro
/define:symbol[=<text>]  same as /D
/nodefines        specifies that any /D macros go to the preprocessor only, and not to the compiler
/U<name>          remove predefined macro
/undefine:<name>  remove predefined macro (same as /U)
/allow:nofpp-comments  if a Fortran end-of-line comment is seen within a #define, treat it as part of the definition.  Default is allow:fpp-comments
/E                preprocess to stdout
/EP               preprocess to stdout, omitting #line directives
/EP /P            preprocess to file, omitting #line directives
/P                preprocess to file
/preprocess-only  same as /P
/[no]keep         keep/remove preprocessed file generated by preprocessor as input to compiler stage.  Not affected by /Qsave-temps.  Default is /nokeep
/fpp[n], /[no]fpp run Fortran preprocessor on source files prior to compilation
                  n=0 disable running the preprocessor, equivalent to nofpp; n=1,2,3 run preprocessor
/module:path      specify path where mod files should be placed and first location to look for mod files
/u                remove all predefined macros
/I<dir>           add directory to include file search path
/[no]include:<dir>  same as /I
/X                remove standard directories from include file search path
/[no]gen-dep[:filename]  generate dependency information.  If no filename is specified, output to stdout
/gen-depformat:keyword  generate dependency information in the specified format.  One of: make, nmake

Component Control
-----------------
/Qoption,<tool>,<opts>  pass options <opts> to tool specified by <tool>
/Qlocation,<tool>,<dir>  set <dir> as the location of tool specified by <tool>

Language
--------
/[no]altparam     specify if alternate form of parameter constant declarations (without parenthesis) is recognized.  Default is to recognize
/assume:<keyword> specify assumptions made by the optimizer and code generator
                  keywords: none, [no]byterecl, [no]buffered_io, [no]bscc (nobscc same as /nbs), [no]cc_omp, [no]minus0, [no]dummy_aliases (same as /Qcommon-args), [no]ieee_fpe_flags, [no]fpe_summary, [no]old_boz, [no]old_complex_align, [no]old_logical_ldio, [no]old_ldout_format, [no]old_maxminloc, [no]old_unit_star, [no]old_xor, [no]protect_constants, [no]protect_parens, [no]realloc_lhs, [no]2underscore, [no]underscore (same as /us), [no]std_intent_in, [no]std_mod_proc_name, [no]source_include, [no]split_common, [no]writeable_strings
/ccdefault:<keyword>  specify default carriage control for units 6 and *
                  keywords: default, fortran, list or none
/[no]check:<keyword>  check run-time conditions.  Default is /nocheck
                  keywords: all (same as /4Yb, /C), none (same as /nocheck, /4Nb), [no]arg_temp_created, [no]bounds (same as /CB), [no]format, [no]output_conversion, [no]pointer (same as /CA), [no]uninit (same as /CU), [no]stack
/Qcommon-args     assume "by reference" subprogram arguments may alias one another.  Same as /assume:dummy_aliases
/[no]extend-source[:<size>]  specify rightmost column for fixed form sources
                  keywords: 72 (same as /noextend-source and /4L72), 80 (same as /4L80), 132 (same as /4L132; default if you specify /extend-source without a keyword)
/fixed            specify source files are in fixed format.  Same as /FI and /4Nf
/nofixed          indicates free format
/free             specify source files are in free format.  Same as /FR and /4Yf
/nofree           indicates fixed format
/names:<keyword>  specify how source code identifiers and external names are interpreted.  keywords: as_is, lowercase, uppercase
/[no]pad-source, /Qpad-source[-]  make compiler acknowledge blanks at the end of a line
/stand[:<keyword>]  specifies level of conformance with ANSI standard to check for.  If keyword is not specified, level of conformance is f03
                  keywords: f90 (same as /4Ys), f95, f03, none (same as /nostand)
/standard-semantics  sets assume keywords to conform to the semantics of the f03 standard.  May result in performance loss.
                  assume keywords set by /standard-semantics: byterecl, fpe_summary, minus0, noold_maxminloc, noold_unit_star, noold_xor, protect_parens, realloc_lhs, std_intent_in, std_mod_proc_name, noold_ldout_format
/syntax-only, /Zs perform syntax and semantic checking only (no object file produced)

Compiler Diagnostics
--------------------
/w                disable all warnings
/W<n>             disable warnings (n = 0) or show warnings (n = 1 DEFAULT, same as /warn:general)
/warn:<keyword>   specifies the level of warning messages issued
                  keywords: all, none (same as /nowarn), [no]alignments, [no]declarations, [no]errors, [no]general, [no]ignore_loc, [no]interfaces, [no]stderrors, [no]truncated_source, [no]uncalled, [no]unused, [no]usage
/nowarn           suppress all warning messages
/WB               turn a compile-time bounds check into a warning
/[no]traceback    specify whether the compiler generates PC correlation data used to display a symbolic traceback rather than a hexadecimal traceback at runtime failure
/[no]gen-interfaces [[no]source]  generate interface blocks for all routines in the file.  Can be checked using -warn interfaces; nosource indicates temporary source files should not be saved
/error-limit:<n>  specify the maximum number of error-level or fatal-level compiler errors allowed
/noerror-limit    set no maximum number on error-level or fatal-level error messages
/Qdiag-enable:<v1>[,<v2>,...]  enable the specified diagnostics or diagnostic groups
/Qdiag-disable:<v1>[,<v2>,...]  disable the specified diagnostics or diagnostic groups
                  where <vN> may be individual diagnostic numbers or group names, where group names include:
                  sc[n] - perform source code analysis: n=1 for critical errors, n=2 for all errors and n=3 for all errors and warnings
                  sc-{full|concise|precise} - perform static analysis and determine the analysis mode.  Full mode attempts to find all program weaknesses, even at the expense of more false positives.  Concise mode attempts to reduce false positives somewhat more than reducing false negatives.  Precise mode attempts to avoid all false positives.  Default: full if /Qdiag-enable:sc{[1|2|3]} is present; otherwise none (static analysis diagnostics are disabled)
                  sc-include - perform source code analysis on include files
                  sc-single-file - tells static analysis to process each file individually.  Default: OFF
                  sc-enums - tells static analysis to treat enumeration variables as known values equal to any one of the associated enumeration literals.  Default: OFF
                  sc-parallel[n] - perform analysis of parallelization in source code: n=1 for critical errors, n=2 for errors, n=3 for all errors and warnings
                  warn - diagnostic messages that have "warning" severity level
                  error - diagnostic messages that have "error" severity level
                  remark - diagnostic messages that are remarks or comments
                  vec - diagnostic messages issued by the vectorizer
                  par - diagnostic messages issued by the auto-parallelizer
                  openmp - diagnostic messages issued by the OpenMP* parallelizer
                  cpu-dispatch - specifies the CPU dispatch remarks
/Qdiag-error:<v1>[,<v2>,...]  output the specified diagnostics or diagnostic groups as errors
/Qdiag-warning:<v1>[,<v2>,...]  output the specified diagnostics or diagnostic groups as warnings
/Qdiag-remark:<v1>[,<v2>,...]  output the specified diagnostics or diagnostic groups as remarks
/Qdiag-dump       display the currently enabled diagnostic messages to stdout or to a specified diagnostic output file
/Qdiag-sc-dir:<dir>  directory where diagnostics from static analysis are created, rather than current working directory
/Qdiag-file[:<file>]  where diagnostics are emitted to.  Not specifying this causes messages to be output to stderr
/Qdiag-file-append[:<file>]  where diagnostics are emitted to.  When <file> already exists, output is appended to the file
/Qdiag-id-numbers[-]  enable(DEFAULT)/disable the diagnostic specifiers to be output in numeric form
/Qdiag-error-limit:<n>  specify the maximum number of errors emitted

Miscellaneous
-------------
/[no]logo         display compiler version information.  /nologo disables the output
/Qsox[:<keyword>[,keyword]]  enable saving of compiler options, version and additional information in the executable.  Use /Qsox- to disable (DEFAULT)
                  profile - include profiling data; inline - include inlining information
/bintext:<string> place the string specified into the object file and executable
/Qsave-temps      store the intermediate files in current directory and name them based on the source file.  Only saves files that are generated by default
/what             display detailed compiler version information
/watch:<keyword>  tells the driver to output processing information
                  keywords: all, none (same as /nowatch), [no]source, [no]cmd, [no]mic-cmd
/nowatch          suppress processing information output (DEFAULT)
/Tf<file>         compile file as Fortran source
/extfor:<ext>     specify extension of file to be recognized as a Fortran file
/extfpp:<ext>     specify extension of file to be recognized as a preprocessor file
/libdir[:keyword] control the library names that should be emitted into the object file
                  keywords: all, none (same as /nolibdir), [no]automatic, [no]user
/nolibdir         no library names should be emitted into the object file
/MP[<n>]          create multiple processes that can be used to compile large numbers of source files at the same time
/bigobj           generate objects with increased address capacity

Data
----
/4I{2|4|8}        set default KIND of integer and logical variables to 2, 4, or 8
/integer-size:<size>  specifies the default size of integer and logical variables; size: 16, 32, 64
/4R{8|16}         set default size of real to 8 or 16 bytes
/real-size:<size> specify the size of REAL and COMPLEX declarations, constants, functions, and intrinsics; size: 32, 64, 128
/Qautodouble      same as /real-size:64 or /4R8
/double-size:<size>  defines the size of DOUBLE PRECISION and DOUBLE COMPLEX declarations, constants, functions, and intrinsics; size: 64, 128
/[no]fpconstant   extends the precision of single precision constants assigned to double precision variables to double precision
/[no]intconstant  use Fortran 77 semantics, rather than Fortran 90/95, to determine kind of integer constants
/auto             make all local variables AUTOMATIC
/Qauto-scalar     make scalar local variables AUTOMATIC (DEFAULT)
/Qsave            save all variables (static allocation) (same as /noauto, opposite of /auto)
/Qzero[-]         enable/disable(DEFAULT) implicit initialization to zero of local scalar variables of intrinsic type INTEGER, REAL, COMPLEX, or LOGICAL that are saved and not initialized
/Qdyncom          make given common blocks dynamically-allocated
/Zp[n]            specify alignment constraint for structures (n=1,2,4,8,16; /Zp16 DEFAULT)
/[no]align        analyze and reorder memory layout for variables and arrays
/align:<keyword>  specify how data items are aligned
                  keywords: all (same as /align), none (same as /noalign), [no]commons, [no]dcommons, [no]qcommons, [no]zcommons, rec1byte, rec2byte, rec4byte, rec8byte, rec16byte, rec32byte, array8byte, array16byte, array32byte, array64byte, array128byte, array256byte, [no]records, [no]sequence
/GS               enable overflow security checks.  /GS- disables (DEFAULT)
/Qpatchable-addresses  generate code such that references to statically assigned addresses can be patched with arbitrary 64-bit addresses
/Qfnalign[-]      align the start of functions to an optimal machine-dependent value.  When disabled (DEFAULT) align on a 2-byte boundary
/Qfnalign:[2|16]  align the start of functions on a 2 (DEFAULT) or 16 byte boundary
/Qglobal-hoist[-] enable(DEFAULT)/disable external globals are load safe
/Qkeep-static-consts[-]  enable/disable(DEFAULT) emission of static const variables even when not referenced
/Qnobss-init      disable placement of zero-initialized variables in BSS (use DATA)
/Qzero-initialized-in-bss[-]  put explicitly zero initialized variables into the DATA section instead of the BSS section
/convert:<keyword>  specify the format of unformatted files containing numeric data
                  keywords: big_endian, cray, ibm, little_endian, native, vaxd, vaxg
/Qimf-absolute-error:value[:funclist]  define the maximum allowable absolute error for math library function results
                  value - a positive, floating-point number conforming to the format [digits][.digits][{e|E}[sign]digits]
                  funclist - optional comma separated list of one or more math library functions to which the attribute should be applied
/Qimf-accuracy-bits:bits[:funclist]  define the relative error, measured by the number of correct bits, for math library function results
                  bits - a positive, floating-point number
                  funclist - optional comma separated list of one or more math library functions to which the attribute should be applied
/Qimf-arch-consistency:value[:funclist]  ensures that the math library functions produce consistent results across different implementations of the same architecture
                  value - true or false
                  funclist - optional comma separated list of one or more math library functions to which the attribute should be applied
/Qimf-max-error:ulps[:funclist]  defines the maximum allowable relative error, measured in ulps, for math library function results
                  ulps - a positive, floating-point number conforming to the format [digits][.digits][{e|E}[sign]digits]
                  funclist - optional comma separated list of one or more math library functions to which the attribute should be applied
/Qimf-precision:value[:funclist]  defines the accuracy (precision) for math library functions
                  value - one of: high (equivalent to max-error = 0.6), medium (equivalent to max-error = 4, DEFAULT), low (equivalent to accuracy-bits = 11 for single precision, accuracy-bits = 26 for double precision)
                  funclist - optional comma separated list of one or more math library functions to which the attribute should be applied

Compatibility
-------------
/fpscomp[:<keyword>]  specify the level of compatibility to adhere to with Fortran PowerStation
                  keywords: all, none (same as /nofpscomp), [no]filesfromcmd, [no]general, [no]ioformat, [no]ldio_spacing, [no]libs, [no]logicals
/nofpscomp        no specific level of compatibility with Fortran PowerStation
/f66              allow extensions that enhance FORTRAN-66 compatibility
/f77rtl           specify that the Fortran 77 specific run-time support should be used
/nof77rtl         disables /f77rtl
/vms              enable VMS I/O statement extensions
/Qvc<n>           enable compatibility with a specific Microsoft* Visual Studio version
                  9 - Microsoft* Visual Studio 2008 compatibility
                  10 - Microsoft* Visual Studio 2010 compatibility
                  11 - Microsoft* Visual Studio 2012 compatibility

Linking/Linker
--------------
/link             specify that all options following '/link' are for the linker
/extlnk:<ext>     specify extension of file to be passed directly to linker
/F<n>             set the stack reserve amount specified to the linker
/dbglibs          use the debug version of runtime libraries, when appropriate
/libs:<keyword>   specifies which type of run-time library to link to
                  keywords: static, dll, qwin, qwins
/LD[d]            produce a DLL instead of an EXE ('d' = debug version)
/dll              same as /LD
/MD[d]            use dynamically-loaded, multithread C runtime
/MDs[d]           use dynamically-loaded, singlethread Fortran runtime, and multithread C runtime
/MT[d]            use statically-linked, multithread C runtime (DEFAULT with Microsoft Visual Studio 2005 and later)
/ML[d]            use statically-linked, single thread C runtime (only valid in Microsoft Visual Studio 2003 environment)
/MG, /winapp      use Windows API runtime libraries
/Zl               omit library names from object file
/threads          specify that multi-threaded libraries should be linked against
/nothreads        disables multi-threaded libraries

Deprecated Options
------------------
/Qinline-debug-info        use /debug:inline-debug-info
/Gf                        use /GF
/ML[d]                     upgrade to /MT[d]
/Quse-asm                  No replacement
/Qprof-genx                use /Qprof-gen:srcpos
/Qdiag-enable:sv[<n>]      use /Qdiag-enable:sc[<n>]
/Qdiag-enable:sv-include   use /Qdiag-enable:sc-include
/Qdiag-sv                  use /Qdiag-enable:sc[<n>]
/Qdiag-sv-error            use /Qdiag-disable:warning
/Qdiag-sv-include          use /Qdiag-enable:sc-include
/Qdiag-sv-level            No replacement
/Qdiag-sv-sup              use /Qdiag-disable:<v1>[,<v2>,...]
/Qtprofile                 No replacement
/arch:SSE                  use /arch:IA32
/QxK                       upgrade to /arch:SSE2
/QaxK                      upgrade to /arch:SSE2
/QxW                       use /arch:SSE2
/QaxW                      use /arch:SSE2
/QxN                       use /QxSSE2
/QaxN                      use /QaxSSE2
/QxP                       use /QxSSE3
/QaxP                      use /QaxSSE3
/QxT                       use /QxSSSE3
/QaxT                      use /QaxSSSE3
/QxS                       use /QxSSE4.1
/QaxS                      use /QaxSSE4.1
/QxH                       use /QxSSE4.2
/QaxH                      use /QaxSSE4.2
/QxO                       use /arch:SSE3
/Qvc7.1                    No replacement
/QIfist                    use /Qrcd
/QxSSE3_ATOM               use /QxSSSE3_ATOM
/Qrct                      No replacement
/Op                        use /fltconsistency
/debug:partial             No replacement
/tune:<keyword>            use /Qx<keyword>
/architecture:<keyword>    use /arch:<keyword>
/1, /Qonetrip              use /f66
/Fm                        use /map
/Qcpp, /Qfpp               use /fpp
/Qdps                      use /altparam
/Qextend-source            use /extend-source
/Qlowercase                use /names:lowercase
/Quppercase                use /names:uppercase
/Qvms                      use /vms
/asmattr:keyword           use /FA[c|s|cs]
/noasmattr, /asmattr:none  use /FA
/asmfile                   use /Fa
/automatic                 use /auto
/cm                        use /warn:nousage
/optimize:0                use /Od
/optimize:1,2              use /O1
/optimize:3,4              use /O2
/optimize:5                use /O3
/source                    use /Tf
/unix                      No replacement
/us                        use /assume:underscore
/unroll                    use /Qunroll
/w90, /w95                 No replacement
/Zd                        use /debug:minimal

/help, /? [category]       print full or category help message
                  Valid categories include:
                  advanced      - Advanced Optimizations
                  codegen       - Code Generation
                  compatibility - Compatibility
                  component     - Component Control
                  data          - Data
                  deprecated    - Deprecated Options
                  diagnostics   - Compiler Diagnostics
                  float         - Floating Point
                  help          - Help
                  inline        - Inlining
                  ipo           - Interprocedural Optimization (IPO)
                  language      - Language
                  link          - Linking/Linker
                  misc          - Miscellaneous
                  opt           - Optimization
                  output        - Output
                  pgo           - Profile Guided Optimization (PGO)
                  preproc       - Preprocessor
                  reports       - Optimization Reports
                  openmp        - OpenMP and Parallel Processing

Copyright (C) 1985-2013, Intel Corporation.  All rights reserved.
* Other names and brands may be claimed as the property of others.
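The listing above is ifort's own option summary (the kind of text printed by, e.g., ifort /help), captured in the configure log while the Fortran compiler was being probed through the win32fe wrapper. If you want to check by hand whether an option is documented by the installed compiler, a minimal sketch is below; the flags checked for are only examples, and "ifort" is assumed to be on PATH inside an Intel/Visual Studio command prompt.

    import subprocess

    # Illustrative sketch (not part of this log): capture ifort's help text
    # and grep it for a few options of interest.
    out = subprocess.run(["ifort", "/help"], capture_output=True, text=True)
    help_text = out.stdout + out.stderr
    for flag in ("/LD", "/MT", "/Qsave"):
        print(flag, "documented" if flag in help_text else "not documented")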
Trying FC compiler flag -PIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
sh:
Successful compile:
Source:
program main
end
Pushing language FC
Popping language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -PIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -PIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: ifort: command line warning #10006: ignoring unknown option '/PIC'
Rejecting FC linker flag -PIC due to ifort: command line warning #10006: ignoring unknown option '/PIC'
Rejected FC compiler flag -PIC because linker cannot handle it
Trying FC compiler flag -fPIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
sh:
Successful compile:
Source:
program main
end
Pushing language FC
Popping language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -fPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -fPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: ifort: command line warning #10006: ignoring unknown option '/fPIC'
Rejecting FC linker flag -fPIC due to ifort: command line warning #10006: ignoring unknown option '/fPIC'
Rejected FC compiler flag -fPIC because linker cannot handle it
Trying FC compiler flag -KPIC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
sh:
Successful compile:
Source:
program main
end
Pushing language FC
Popping language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -KPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -KPIC /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: ifort: command line warning #10006: ignoring unknown option '/KPIC'
Rejecting FC linker flag -KPIC due to ifort: command line warning #10006: ignoring unknown option '/KPIC'
Rejected FC compiler flag -KPIC because linker cannot handle it
Trying FC compiler flag -qpic
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
sh:
Successful compile:
Source:
program main
end
Pushing language FC
Popping language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -qpic /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -qpic /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: ifort: command line warning #10006: ignoring unknown option '/qpic'
Rejecting FC linker flag -qpic due to ifort: command line warning #10006: ignoring unknown option '/qpic'
Rejected FC compiler flag -qpic because linker cannot handle it
Popping language FC
================================================================================
TEST checkLargeFileIO from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:998)
TESTING: checkLargeFileIO from config.setCompilers(config/BuildSystem/config/setCompilers.py:998)
================================================================================
TEST checkArchiver from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1097)
TESTING: checkArchiver from config.setCompilers(config/BuildSystem/config/setCompilers.py:1097)
  Check that the archiver exists and can make a library usable by the compiler
Pushing language C
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
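None of the Unix-style position-independent-code flags probed above (-PIC, -fPIC, -KPIC, -qpic) exist for ifort on Windows: win32fe forwards each flag unchanged, ifort answers with warning #10006, and configure treats that warning as a rejection, which is the correct outcome here since Windows DLLs are relocated by the loader rather than built as PIC. The probe logic itself is simple; a sketch under stated assumptions (the compiler command and warning text are taken from this log, and conftest.F is assumed to hold the two-line "program main / end" source shown above) is:

    import subprocess

    COMPILER = ["/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe", "ifort"]

    def accepts(flag):
        # Compile a trivial program, then link it with the candidate flag;
        # treat an "ignoring unknown option" warning as a rejection.
        subprocess.run(COMPILER + ["-c", "-o", "conftest.o", "conftest.F"], check=True)
        r = subprocess.run(COMPILER + ["-o", "conftest.exe", flag, "conftest.o"],
                           capture_output=True, text=True)
        return r.returncode == 0 and "ignoring unknown option" not in (r.stdout + r.stderr)

    for flag in ("-PIC", "-fPIC", "-KPIC", "-qpic"):
        print(flag, "accepted" if accepts(flag) else "rejected")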
Defined make macro "FAST_AR_FLAGS" to "Scq"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int foo(int a) { return a+1; }
Checking for program /usr/local/bin/ar...not found
Checking for program /usr/bin/ar...found
Defined make macro "AR" to "/usr/bin/ar"
Checking for program /usr/local/bin/ranlib...not found
Checking for program /usr/bin/ranlib...found
Defined make macro "RANLIB" to "/usr/bin/ranlib -c"
sh: /usr/bin/ar cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
Executing: /usr/bin/ar cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
sh:
sh: /usr/bin/ranlib -c /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a
Executing: /usr/bin/ranlib -c /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a
sh:
Possible ERROR while running ranlib: error message = {/usr/bin/ranlib: invalid option -- c
}
Ranlib is not functional with your archiver.  Try --with-ranlib=true if ranlib is unnecessary.
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
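The "-c" rejected above is not a GNU ranlib option; it appears to come from the Darwin/cctools ranlib (where -c keeps common symbols in the archive's table of contents), and configure simply tries that variant first. When GNU ranlib refuses it, configure retries plain ranlib, and the suggested --with-ranlib=true also works because GNU ar's "s" modifier already writes the symbol index, making a separate ranlib pass unnecessary. A sketch of that fallback cascade (the candidate list is illustrative, not lifted from BuildSystem's sources):

    import subprocess

    def pick_ranlib(archive):
        # Return the first ranlib invocation that exits cleanly on the archive:
        # Darwin-style "ranlib -c", then plain ranlib, then a no-op.
        for cmd in (["/usr/bin/ranlib", "-c"], ["/usr/bin/ranlib"], ["/usr/bin/true"]):
            if subprocess.run(cmd + [archive], capture_output=True).returncode == 0:
                return " ".join(cmd)
        return None

    print(pick_ranlib("libconf1.a"))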
Defined make macro "FAST_AR_FLAGS" to "Scq"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int foo(int a) { return a+1; }
Checking for program /usr/local/bin/ar...not found
Checking for program /usr/bin/ar...found
Defined make macro "AR" to "/usr/bin/ar"
Checking for program /usr/local/bin/ranlib...not found
Checking for program /usr/bin/ranlib...found
Defined make macro "RANLIB" to "/usr/bin/ranlib"
sh: /usr/bin/ar cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
Executing: /usr/bin/ar cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
sh:
sh: /usr/bin/ranlib /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a
Executing: /usr/bin/ranlib /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a
sh:
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
extern int foo(int);
int main() {
int b = foo(1);
if (b);
;
return 0; }
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
sh: LINK : fatal error LNK1181: cannot open input file 'libconf1.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1181: cannot open input file 'libconf1.lib'
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
Source:
#include "confdefs.h"
#include "conffix.h"
extern int foo(int);
int main() {
int b = foo(1);
if (b);
;
return 0; }
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
extern int foo(int);
int main() {
int b = foo(1);
if (b);
;
return 0; }
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
sh: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconf1.lib : warning LNK4003: invalid library format; library ignored
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconf1.lib : warning LNK4003: invalid library format; library ignored
conftest.obj : error LNK2019: unresolved external symbol foo referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconf1.lib : warning LNK4003: invalid library format; library ignored
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconf1.lib : warning LNK4003: invalid library format; library ignored
conftest.obj : error LNK2019: unresolved external symbol foo referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
Source:
#include "confdefs.h"
#include "conffix.h"
extern int foo(int);
int main() {
int b = foo(1);
if (b);
;
return 0; }
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
Defined make macro "FAST_AR_FLAGS" to "Scq"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int foo(int a) { return a+1; }
Checking for program /usr/local/bin/ar...not found
Checking for program /usr/bin/ar...found
Defined make macro "AR" to "/usr/bin/ar"
Checking for program /usr/local/bin/true...not found
Checking for program /usr/bin/true...found
Defined make macro "RANLIB" to "/usr/bin/true"
sh: /usr/bin/ar cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
Executing: /usr/bin/ar cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
sh:
sh: /usr/bin/true /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a
Executing: /usr/bin/true /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a
sh:
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
extern int foo(int);
int main() {
int b = foo(1);
if (b);
;
return 0; }
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
sh: LINK : fatal error LNK1181: cannot open input file 'libconf1.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1181: cannot open input file 'libconf1.lib'
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
Source:
#include "confdefs.h"
#include "conffix.h"
extern int foo(int);
int main() {
int b = foo(1);
if (b);
;
return 0; }
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
extern int foo(int);
int main() {
int b = foo(1);
if (b);
;
return 0; }
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
sh: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconf1.lib : warning LNK4003: invalid library format; library ignored
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconf1.lib : warning LNK4003: invalid library format; library ignored
conftest.obj : error LNK2019: unresolved external symbol foo referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconf1.lib : warning LNK4003: invalid library format; library ignored
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconf1.lib : warning LNK4003: invalid library format; library ignored
conftest.obj : error LNK2019: unresolved external symbol foo referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1
Source:
#include "confdefs.h"
#include "conffix.h"
extern int foo(int);
int main() {
int b = foo(1);
if (b);
;
return 0; }
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
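Whichever ranlib variant is used, the archive itself is the real problem: GNU ar happily wraps the cl-produced COFF objects, but Microsoft's link does not accept the result. On the first attempt no libconf1.lib can even be found (win32fe maps -lconf1 to that name, while GNU ar produced libconf1.a); on the retry the file is found but rejected as "invalid library format" (LNK4003). The library link will accept has to come from lib.exe, which is exactly the combination configure converges on further down. A sketch demonstrating the difference (assumes cl and lib on PATH in a Visual Studio command prompt, and a conf1.c/conftest.c pair like the conftest sources in this log; all file names are illustrative):

    import subprocess

    # Build the same object into a static library two ways and see which
    # one the MSVC linker accepts.
    subprocess.run(["cl", "/c", "conf1.c"], check=True)
    subprocess.run(["cl", "/c", "conftest.c"], check=True)

    subprocess.run(["ar", "cr", "libconf1.lib", "conf1.obj"], check=True)  # GNU ar archive
    gnu = subprocess.run(["cl", "conftest.obj", "libconf1.lib"])           # typically fails (LNK4003)

    subprocess.run(["lib", "/OUT:libconf1.lib", "conf1.obj"], check=True)  # lib.exe archive
    msvc = subprocess.run(["cl", "conftest.obj", "libconf1.lib"])          # links cleanly

    print("GNU ar archive linked:", gnu.returncode == 0)
    print("lib.exe archive linked:", msvc.returncode == 0)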
Defined make macro "FAST_AR_FLAGS" to "Scq"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int foo(int a) { return a+1; }
Checking for program /usr/local/bin/ar...not found
Checking for program /usr/bin/ar...found
Defined make macro "AR" to "/usr/bin/ar"
Checking for program /usr/local/bin/ranlib...not found
Checking for program /usr/bin/ranlib...found
Defined make macro "RANLIB" to "/usr/bin/ranlib -c"
sh: /usr/bin/ar -X64 cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
Executing: /usr/bin/ar -X64 cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
sh:
Possible ERROR while running archiver: ret = 256
error message = {/usr/bin/ar: invalid option -- X
Usage: /usr/bin/ar [emulation options] [-]{dmpqrstx}[abcDfilMNoPsSTuvV] [member-name] [count] archive-file file...
       /usr/bin/ar -M [<mri-script]
 @<file>       - read options from <file>
 --target=BFDNAME - specify the target object format as BFDNAME
 emulation options:
  No emulation specific options
/usr/bin/ar: supported targets: pe-i386 pei-i386 elf32-i386 elf32-little elf32-big srec symbolsrec verilog tekhex binary ihex
}
Archiver is not functional
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
Defined make macro "FAST_AR_FLAGS" to "Scq"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int foo(int a) { return a+1; }
Checking for program /usr/local/bin/ar...not found
Checking for program /usr/bin/ar...found
Defined make macro "AR" to "/usr/bin/ar"
Checking for program /usr/local/bin/ranlib...not found
Checking for program /usr/bin/ranlib...found
Defined make macro "RANLIB" to "/usr/bin/ranlib"
sh: /usr/bin/ar -X64 cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
Executing: /usr/bin/ar -X64 cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
sh:
Possible ERROR while running archiver: ret = 256
error message = {/usr/bin/ar: invalid option -- X
Usage: /usr/bin/ar [emulation options] [-]{dmpqrstx}[abcDfilMNoPsSTuvV] [member-name] [count] archive-file file...
       /usr/bin/ar -M [<mri-script]
 @<file>       - read options from <file>
 --target=BFDNAME - specify the target object format as BFDNAME
 emulation options:
  No emulation specific options
/usr/bin/ar: supported targets: pe-i386 pei-i386 elf32-i386 elf32-little elf32-big srec symbolsrec verilog tekhex binary ihex
}
Archiver is not functional
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
sh: ar -V
Executing: ar -V
sh: GNU ar (GNU Binutils) 2.23.52.20130309
Copyright 2013 Free Software Foundation, Inc.
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version.
This program has absolutely no warranty.
Defined make macro "FAST_AR_FLAGS" to "Scq"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int foo(int a) { return a+1; }
Checking for program /usr/local/bin/ar...not found
Checking for program /usr/bin/ar...found
Defined make macro "AR" to "/usr/bin/ar"
Checking for program /usr/local/bin/true...not found
Checking for program /usr/bin/true...found
Defined make macro "RANLIB" to "/usr/bin/true"
sh: /usr/bin/ar -X64 cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
Executing: /usr/bin/ar -X64 cr /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
sh:
Possible ERROR while running archiver: ret = 256
error message = {/usr/bin/ar: invalid option -- X
Usage: /usr/bin/ar [emulation options] [-]{dmpqrstx}[abcDfilMNoPsSTuvV] [member-name] [count] archive-file file...
       /usr/bin/ar -M [<mri-script]
 @<file>       - read options from <file>
 --target=BFDNAME - specify the target object format as BFDNAME
 emulation options:
  No emulation specific options
/usr/bin/ar: supported targets: pe-i386 pei-i386 elf32-i386 elf32-little elf32-big srec symbolsrec verilog tekhex binary ihex
}
Archiver is not functional
Defined make macro "FAST_AR_FLAGS" to "-a -P512"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int foo(int a) { return a+1; }
Checking for program /usr/local/bin/win32fe...not found
Checking for program /usr/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/win32fe...not found
Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/win32fe...not found
Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/win32fe...not found
Checking for program /cygdrive/c/Windows/system32/win32fe...not found
Checking for program /cygdrive/c/Windows/win32fe...not found
Checking for program /cygdrive/c/Windows/System32/Wbem/win32fe...not found
Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/win32fe...not found
Checking for program /cygdrive/c/Program Files/TEC100/BIN/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/MPICH2/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files/MPICH2/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files/doxygen/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/win32fe...not found
Checking for program /cygdrive/c/MinGW/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found
Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/win32fe...not found
Checking for program /home/dsu/win32fe...not found
Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found
Defined make macro "AR" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe tlib"
Checking for program /usr/local/bin/true...not found
Checking for program /usr/bin/true...found
Defined make macro "RANLIB" to "/usr/bin/true"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe tlib -a -P512 /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe tlib -a -P512 /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o
sh:
Possible ERROR while running archiver: ret = 25600
Archiver is not functional
Defined make macro "FAST_AR_FLAGS" to "-a"
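Two asides on the probes above.  The "-X64" flag is an AIX ar option (it selects 64-bit XCOFF objects there); configure tries it only to cover that platform, and GNU ar's rejection is expected and harmless.  The "tlib" attempted next appears to be Borland's librarian, one of several archiver back ends the win32fe wrapper can drive; since no tlib is installed the attempt fails (ret = 25600) and configure falls through to the next candidate, "win32fe lib", i.e. Microsoft's lib.exe, which succeeds below.  If configure keeps settling on the wrong archiver, it can be told which one to use explicitly (e.g. --with-ar='win32fe lib' on the configure command line).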
"FAST_AR_FLAGS" to "-a" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int foo(int a) { return a+1; } Checking for program /usr/local/bin/win32fe...not found Checking for program /usr/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/bin/intel64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/BIN/amd64/win32fe...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v4.0.30319/win32fe...not found Checking for program /cygdrive/c/Windows/Microsoft.NET/Framework64/v3.5/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/VC/VCPackages/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/Tools/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HTML Help Workshop/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/NETFX 4.0 Tools/x64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/x64/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft SDKs/Windows/v7.0A/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mkl/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Advisor XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/VTune Amplifier XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Inspector XE 2013/bin32/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/mpirt/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/mpirt/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/intel64/compiler/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Common Files/Intel/Shared Libraries/redist/ia32/compiler/win32fe...not found Checking for program /cygdrive/c/Windows/system32/win32fe...not found Checking for program /cygdrive/c/Windows/win32fe...not found Checking for program /cygdrive/c/Windows/System32/Wbem/win32fe...not found Checking for program /cygdrive/c/Windows/System32/WindowsPowerShell/v1.0/win32fe...not found Checking for program /cygdrive/c/Program Files/TEC100/BIN/win32fe...not found Checking for program 
/cygdrive/c/Program Files (x86)/MPICH2/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/MPICH2/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/VisualSVN/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/CMake 2.8/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/doxygen/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Graphviz 2.28/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/USGS/phast-2.4.1-7430/bin/win32fe...not found Checking for program /cygdrive/c/MinGW/bin/win32fe...not found Checking for program /cygdrive/c/Program Files/TortoiseSVN/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/HDF_Group/HDF5/1.8.11/bin/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Microsoft Visual Studio 10.0/Common7/IDE/win32fe...not found Checking for program /cygdrive/c/Program Files (x86)/Intel/Composer XE 2013/redist/intel64/mpirt/win32fe...not found Checking for program /home/dsu/win32fe...not found Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "AR" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib" Checking for program /usr/local/bin/true...not found Checking for program /usr/bin/true...found Defined make macro "RANLIB" to "/usr/bin/true" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib -a /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib -a /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a /tmp/petsc-1nzsmm/config.setCompilers/conf1.o sh: sh: /usr/bin/true /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a Executing: /usr/bin/true /tmp/petsc-1nzsmm/config.setCompilers/libconf1.a sh: sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" extern int foo(int); int main() { int b = foo(1); if (b); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1 sh: LINK : fatal error LNK1181: cannot open input file 'libconf1.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1181: cannot open input file 'libconf1.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1 Source: #include "confdefs.h" #include "conffix.h" extern int foo(int); int main() { int b = foo(1); if (b); ; 
return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" extern int foo(int); int main() { int b = foo(1); if (b); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconf1 sh: Defined make macro "AR_FLAGS" to "-a" Defined make macro "AR_LIB_SUFFIX" to "lib" Popping language C ================================================================================ TEST checkSharedLinker from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1212) TESTING: checkSharedLinker from config.setCompilers(config/BuildSystem/config/setCompilers.py:1212) Check that the linker can produce shared libraries sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64 Checking shared linker /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl using flags ['-shared'] Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "LD_SHARED" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -shared /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -shared /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-shared' Rejecting C linker flag -shared due to cl : Command line warning D9002 : ignoring unknown option '-shared' sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include 
int foo(void) {printf("hello"); return 0;} Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/libconftest.so /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/libconftest.so /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: LIBCMT.lib(crt0.obj) : error LNK2019: unresolved external symbol main referenced in function __tmainCRTStartup C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconftest.so : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LIBCMT.lib(crt0.obj) : error LNK2019: unresolved external symbol main referenced in function __tmainCRTStartup C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconftest.so : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" #include int foo(void) {printf("hello"); return 0;} Deleting "LD_SHARED" Checking shared linker /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl using flags ['-dynamic'] Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "LD_SHARED" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -dynamic /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -dynamic /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-dynamic' Rejecting C linker flag -dynamic due to cl : Command line warning D9002 : ignoring unknown option '-dynamic' sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int foo(void) {printf("hello"); return 0;} Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o 
/tmp/petsc-1nzsmm/config.setCompilers/libconftest.so /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/libconftest.so /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: LIBCMT.lib(crt0.obj) : error LNK2019: unresolved external symbol main referenced in function __tmainCRTStartup C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconftest.so : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LIBCMT.lib(crt0.obj) : error LNK2019: unresolved external symbol main referenced in function __tmainCRTStartup C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconftest.so : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> int foo(void) {printf("hello"); return 0;} Deleting "LD_SHARED" Checking shared linker /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl using flags ['-qmkshrobj'] Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "LD_SHARED" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -qmkshrobj /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -qmkshrobj /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-qmkshrobj' Rejecting C linker flag -qmkshrobj due to cl : Command line warning D9002 : ignoring unknown option '-qmkshrobj' sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> int foo(void) {printf("hello"); return 0;} Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/libconftest.so /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/libconftest.so /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: LIBCMT.lib(crt0.obj) : error LNK2019: unresolved external symbol main
referenced in function __tmainCRTStartup C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconftest.so : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LIBCMT.lib(crt0.obj) : error LNK2019: unresolved external symbol main referenced in function __tmainCRTStartup C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\libconftest.so : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> int foo(void) {printf("hello"); return 0;} Deleting "LD_SHARED" sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64 Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Checking shared linker /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib using flags [] Checking for program /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe...found Defined make macro "LD_SHARED" to "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> int foo(void) {printf("hello"); return 0;} Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib -a /tmp/petsc-1nzsmm/config.setCompilers/libconftest.lib /tmp/petsc-1nzsmm/config.setCompilers/conftest.o ; /usr/bin/true /tmp/petsc-1nzsmm/config.setCompilers/libconftest.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib -a /tmp/petsc-1nzsmm/config.setCompilers/libconftest.lib /tmp/petsc-1nzsmm/config.setCompilers/conftest.o ; /usr/bin/true /tmp/petsc-1nzsmm/config.setCompilers/libconftest.lib sh: sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int foo(void); int main() { int ret = foo(); if(ret);; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconftest Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/tmp/petsc-1nzsmm/config.setCompilers -lconftest sh: Using shared linker /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib with flags [] and library extension lib sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64
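For readers skimming the log: the probe above compiles a one-function source and then tries each candidate flag to turn it into a shared library. A minimal sketch of what is being attempted, assuming MSVC's cl (the /LD build line below is illustrative; configure itself only tried -shared, -dynamic and -qmkshrobj before settling on the librarian):

    /* libconftest.c -- sketch of the probe's shared-library body.
     * __declspec(dllexport) is an assumption added here; the conftest
     * source in the log omits it, since configure only checks that a
     * library can be produced and linked against. */
    #include <stdio.h>

    __declspec(dllexport) int foo(void) { printf("hello"); return 0; }

    /* Plain "cl -o libconftest.so conftest.o" asks cl to link an
     * executable, so LINK looks for main() and fails with LNK2019,
     * exactly as logged above.  A real DLL build would look roughly
     * like
     *     cl /LD /Felibconftest.dll libconftest.c
     * which is why configure ends up accepting "win32fe lib" (static
     * archives, extension .lib) as its "shared" linker here. */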
================================================================================ TEST checkSharedLinkerPaths from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1290) TESTING: checkSharedLinkerPaths from config.setCompilers(config/BuildSystem/config/setCompilers.py:1290) Determine the shared linker path options - IRIX: -rpath - Linux, OSF: -Wl,-rpath, - Solaris: -R - FreeBSD: -Wl,-R, Pushing language C sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -V Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -V sh: cl : Command line error D8004 : '/V' requires an argument Trying C linker flag -Wl,-rpath, sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line error D8021 : invalid numeric argument '/Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2' Possible ERROR while running linker: output: cl : Command line error D8021 : invalid numeric argument '/Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Rejecting linker flag -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 due to nonzero status from link Rejected C linker flag -Wl,-rpath, Trying C linker flag -R sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe
-R/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-R/cygdrive/c/cygwin/packages/petsc-3.4.2' Rejecting C linker flag -R/cygdrive/c/cygwin/packages/petsc-3.4.2 due to cl : Command line warning D9002 : ignoring unknown option '-R/cygdrive/c/cygwin/packages/petsc-3.4.2' Rejected C linker flag -R Trying C linker flag -rpath sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-rpath' cl : Command line warning D9024 : unrecognized source file type 'C:\cygwin\packages\petsc-3.4.2', object file assumed LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' Possible ERROR while running linker: output: cl : Command line warning D9002 : ignoring unknown option '-rpath' cl : Command line warning D9024 : unrecognized source file type 'C:\cygwin\packages\petsc-3.4.2', object file assumed LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Rejecting linker flag -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 due to nonzero status from link Rejecting C linker flag -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 due to cl : Command line warning D9002 : ignoring unknown option '-rpath' cl : Command line warning D9024 : unrecognized source file type 'C:\cygwin\packages\petsc-3.4.2', object file assumed LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' Rejected C linker flag -rpath Trying C linker flag -Wl,-R, sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o 
/tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line error D8021 : invalid numeric argument '/Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2' Possible ERROR while running linker: output: cl : Command line error D8021 : invalid numeric argument '/Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Rejecting linker flag -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 due to nonzero status from link Rejected C linker flag -Wl,-R, Popping language C Pushing language Cxx sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -V Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -V sh: cl : Command line error D8004 : '/V' requires an argument Trying Cxx linker flag -Wl,-rpath, sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line error D8021 : invalid numeric argument '/Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2' Possible ERROR while running linker: output: cl : Command line error D8021 : invalid numeric argument '/Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2' ret = 512 Pushing language CXX Popping language CXX in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Rejecting linker flag -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 due to nonzero status from link Rejected Cxx linker flag -Wl,-rpath, Trying Cxx linker flag -R sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe 
cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -R/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -R/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-R/cygdrive/c/cygwin/packages/petsc-3.4.2' Rejecting Cxx linker flag -R/cygdrive/c/cygwin/packages/petsc-3.4.2 due to cl : Command line warning D9002 : ignoring unknown option '-R/cygdrive/c/cygwin/packages/petsc-3.4.2' Rejected Cxx linker flag -R Trying Cxx linker flag -rpath sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-rpath' cl : Command line warning D9024 : unrecognized source file type 'C:\cygwin\packages\petsc-3.4.2', object file assumed LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' Possible ERROR while running linker: output: cl : Command line warning D9002 : ignoring unknown option '-rpath' cl : Command line warning D9024 : unrecognized source file type 'C:\cygwin\packages\petsc-3.4.2', object file assumed LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' ret = 512 Pushing language CXX Popping language CXX in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Rejecting linker flag -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 due to nonzero status from link Rejecting Cxx linker flag -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 due to cl : Command line warning D9002 : ignoring unknown option '-rpath' cl : Command line warning D9024 : unrecognized source file type 'C:\cygwin\packages\petsc-3.4.2', object file assumed LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' Rejected Cxx linker flag -rpath Trying Cxx linker flag -Wl,-R, sh: 
/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line error D8021 : invalid numeric argument '/Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2' Possible ERROR while running linker: output: cl : Command line error D8021 : invalid numeric argument '/Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2' ret = 512 Pushing language CXX Popping language CXX in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Rejecting linker flag -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 due to nonzero status from link Rejected Cxx linker flag -Wl,-R, Popping language Cxx Pushing language FC sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -V Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -V sh: cl : Command line error D8004 : '/V' requires an argument Trying FC linker flag -Wl,-rpath, sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: ifort: command line warning #10157: ignoring option '/W'; argument is of wrong type Rejecting FC linker flag -Wl,-rpath,/cygdrive/c/cygwin/packages/petsc-3.4.2 due to ifort: command line warning #10157: ignoring option '/W'; argument is of wrong type Rejected FC linker flag -Wl,-rpath, Trying FC linker flag -R sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o 
-I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -R/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -R/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: ifort: command line warning #10006: ignoring unknown option '/R/cygdrive/c/cygwin/packages/petsc-3.4.2' Rejecting FC linker flag -R/cygdrive/c/cygwin/packages/petsc-3.4.2 due to ifort: command line warning #10006: ignoring unknown option '/R/cygdrive/c/cygwin/packages/petsc-3.4.2' Rejected FC linker flag -R Trying FC linker flag -rpath sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: ifort: command line warning #10006: ignoring unknown option '/rpath' ifort: command line warning #10161: unrecognized source type 'C:\cygwin\packages\petsc-3.4.2'; object file assumed ipo: error #11018: Cannot open C:\cygwin\packages\petsc-3.4.2 LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' Possible ERROR while running linker: output: ifort: command line warning #10006: ignoring unknown option '/rpath' ifort: command line warning #10161: unrecognized source type 'C:\cygwin\packages\petsc-3.4.2'; object file assumed ipo: error #11018: Cannot open C:\cygwin\packages\petsc-3.4.2 LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' ret = 20480 Pushing language FC Popping language FC in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Source: program main end Rejecting linker flag -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 due to nonzero status from link Rejecting FC linker flag -rpath /cygdrive/c/cygwin/packages/petsc-3.4.2 due to ifort: command line warning #10006: ignoring unknown option '/rpath' ifort: command line warning #10161: unrecognized source type 'C:\cygwin\packages\petsc-3.4.2'; object file assumed ipo: error #11018: Cannot open 
C:\cygwin\packages\petsc-3.4.2 LINK : fatal error LNK1104: cannot open file 'C:\cygwin\packages\petsc-3.4.2' Rejected FC linker flag -rpath Trying FC linker flag -Wl,-R, sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: ifort: command line warning #10157: ignoring option '/W'; argument is of wrong type Rejecting FC linker flag -Wl,-R,/cygdrive/c/cygwin/packages/petsc-3.4.2 due to ifort: command line warning #10157: ignoring option '/W'; argument is of wrong type Rejected FC linker flag -Wl,-R, Popping language FC
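None of the rpath-style flags survive, which is expected: embedding a run-time library search path in the binary is a Unix/ELF linker feature. A sketch of what the probe is looking for on a platform where it works ("foo" and the paths below are hypothetical):

    /* app.c -- consumes a shared library that lives outside the
     * default search path. */
    extern int foo(void);
    int main(void) { return foo(); }

    /* On e.g. Linux the flag bakes the path into the executable:
     *     cc -shared -fPIC -o /opt/foo/lib/libfoo.so foo.c
     *     cc -o app app.c -L/opt/foo/lib -lfoo -Wl,-rpath,/opt/foo/lib
     * cl and ifort have no counterpart, so every candidate is
     * rejected above; on Windows DLLs are located via the
     * executable's own directory and PATH instead. */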
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'dlfcn.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include Dynamic loading disabled since dlfcn.h was missing ================================================================================ TEST output from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1420) TESTING: output from config.setCompilers(config/BuildSystem/config/setCompilers.py:1420) Output module data as defines and substitutions Substituting "CC" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" Substituting "CFLAGS" with "" Defined make macro "CC_LINKER_SLFLAG" to "-L" Substituting "CPP" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E" Substituting "CPPFLAGS" with "" Substituting "CXX" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" Substituting "CXX_CXXFLAGS" with "" Substituting "CXXFLAGS" with "" Substituting "CXX_LINKER_SLFLAG" with "-L" Substituting "CXXCPP" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E" Substituting "CXXCPPFLAGS" with "" Substituting "FC" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort" Substituting "FFLAGS" with "" Defined make macro "FC_LINKER_SLFLAG" to "-L" Substituting "LDFLAGS" with "" Substituting "LIBS" with "" Substituting "SHARED_LIBRARY_FLAG" with "" ================================================================================ TEST configureCompilerFlags from config.compilerFlags(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilerFlags.py:65) TESTING: configureCompilerFlags from config.compilerFlags(config/BuildSystem/config/compilerFlags.py:65) Get the default compiler flags Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --version Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --version sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 Microsoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64 getCompilerVersion: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 \nMicrosoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 Usage: win32fe -- - must be the first argument to win32fe : {cl,icl,df,f90,ifl,bcc32,lib,tlib} cl: Microsoft 32-bit C/C++ Optimizing Compiler icl: Intel C/C++ Optimizing Compiler df: Compaq Visual Fortran Optimizing Compiler f90: Compaq Visual Fortran90 Optimizing Compiler ifl: Intel Fortran Optimizing Compiler ifort: Intel Fortran Optimizing Compiler nvcc: NVIDIA CUDA Compiler Driver bcc32: Borland C++ for Win32 lib: Microsoft Library Manager tlib: Borland Library Manager : --help: Output this help message and help for --autodetect: Attempt automatic detection of installation --path : specifies an addition to the PATH that is required (ex. 
================================================================================ TEST output from config.setCompilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/setCompilers.py:1420) TESTING: output from config.setCompilers(config/BuildSystem/config/setCompilers.py:1420) Output module data as defines and substitutions Substituting "CC" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" Substituting "CFLAGS" with "" Defined make macro "CC_LINKER_SLFLAG" to "-L" Substituting "CPP" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E" Substituting "CPPFLAGS" with "" Substituting "CXX" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl" Substituting "CXX_CXXFLAGS" with "" Substituting "CXXFLAGS" with "" Substituting "CXX_LINKER_SLFLAG" with "-L" Substituting "CXXCPP" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E" Substituting "CXXCPPFLAGS" with "" Substituting "FC" with "/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort" Substituting "FFLAGS" with "" Defined make macro "FC_LINKER_SLFLAG" to "-L" Substituting "LDFLAGS" with "" Substituting "LIBS" with "" Substituting "SHARED_LIBRARY_FLAG" with ""
================================================================================ TEST configureCompilerFlags from config.compilerFlags(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilerFlags.py:65) TESTING: configureCompilerFlags from config.compilerFlags(config/BuildSystem/config/compilerFlags.py:65) Get the default compiler flags Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --version Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --version sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 Microsoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64 getCompilerVersion: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 \nMicrosoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 Usage: win32fe -- - must be the first argument to win32fe : {cl,icl,df,f90,ifl,bcc32,lib,tlib} cl: Microsoft 32-bit C/C++ Optimizing Compiler icl: Intel C/C++ Optimizing Compiler df: Compaq Visual Fortran Optimizing Compiler f90: Compaq Visual Fortran90 Optimizing Compiler ifl: Intel Fortran Optimizing Compiler ifort: Intel Fortran Optimizing Compiler nvcc: NVIDIA CUDA Compiler Driver bcc32: Borland C++ for Win32 lib: Microsoft Library Manager tlib: Borland Library Manager : --help: Output this help message and help for --autodetect: Attempt automatic detection of installation --path : specifies an addition to the PATH that is required (ex. the location of a required .dll) --use : specifies the variant of to use --verbose: Echo to stdout the translated commandline and other diagnostic information --version: Output version info for win32fe and --wait_for_debugger: Inserts an infinite wait after creation of and outputs PID so one can manually attach a debugger to the current process. In the debugger, one must set: tool::waitfordebugger = 0 to continue the execution normally. --win-l: For compilers, define -lfoo to link foo.lib instead of libfoo.lib --woff: Suppress win32fe specific warning messages ================================================================================= For compilers: win32fe will map the following to their native options: -c: Compile Only, generates an object file with .o extension This will invoke the compiler once for each file listed. -l: Link the file lib.lib or if using --win-l also, .lib -o : Output= context dependent -D: Define -I: Add to the include path -L: Add to the link path -g: Generate debug symbols in objects when specified for compilation, and in executables when specified for linking (some compilers specification at both times for full debugging support). -O: Enable compiletime and/or linktime optimizations. Ex: win32fe cl -g -c foo.c --verbose -Iinclude Note: win32fe will automatically find the system library paths and system include paths, relieving the user of the need to invoke a particular shell. ========================================================================= cl specific help: win32fe uses -nologo by default for nonverbose output. Use the flag: -logo to disable this feature. -g is identical to -Z7. -O is identical to -O2. ========================================================================= Microsoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64 Copyright (C) Microsoft Corporation. All rights reserved.
C/C++ COMPILER OPTIONS -OPTIMIZATION- /O1 minimize space /O2 maximize speed /Ob inline expansion (default n=0) /Od disable optimizations (default) /Og enable global optimization /Oi[-] enable intrinsic functions /Os favor code space /Ot favor code speed /Ox maximum optimizations /favor: select processor to optimize for, one of: blend - a combination of optimizations for several different x64 processors AMD64 - 64-bit AMD processors INTEL64 - Intel(R)64 architecture processors -CODE GENERATION- /GF enable read-only string pooling /Gm[-] enable minimal rebuild /Gy[-] separate functions for linker /GS[-] enable security checks /GR[-] enable C++ RTTI /GX[-] enable C++ EH (same as /EHsc) /EHs enable C++ EH (no SEH exceptions) /EHa enable C++ EH (w/ SEH exceptions) /EHc extern "C" defaults to nothrow /fp: choose floating-point model: except[-] - consider floating-point exceptions when generating code fast - "fast" floating-point model; results are less predictable precise - "precise" floating-point model; results are predictable strict - "strict" floating-point model (implies /fp:except) /Qfast_transcendentals generate inline FP intrinsics even with /fp:except /GL[-] enable link-time code generation /GA optimize for Windows Application /Ge force stack checking for all funcs /Gs[num] control stack checking calls /Gh enable _penter function call /GH enable _pexit function call /GT generate fiber-safe TLS accesses /RTC1 Enable fast checks (/RTCsu) /RTCc Convert to smaller type checks /RTCs Stack Frame runtime checking /RTCu Uninitialized local usage checks /clr[:option] compile for common language runtime, where option is: pure - produce IL-only output file (no native executable code) safe - produce IL-only verifiable output file oldSyntax - accept the Managed Extensions syntax from Visual C++ 2002/2003 initialAppDomain - enable initial AppDomain behavior of Visual C++ 2002 noAssembly - do not produce an assembly /homeparams Force parameters passed in registers to be written to the stack /GZ Enable stack checks (/RTCs) /arch:AVX enable use of Intel(R) Advanced Vector Extensions instructions -OUTPUT FILES- /Fa[file] name assembly listing file /FA[scu] configure assembly listing /Fd[file] name .PDB file /Fe name executable file /Fm[file] name map file /Fo name object file /Fp name precompiled header file /Fr[file] name source browser file /FR[file] name extended .SBR file /Fi[file] name preprocessed file /doc[file] process XML documentation comments and optionally name the .xdc file -PREPROCESSOR- /AI add to assembly search path /FU forced using assembly/module /C don't strip comments /D{=|#} define macro /E preprocess to stdout /EP preprocess to stdout, no #line /P preprocess to file /Fx merge injected code to file /FI name forced include file /U remove predefined macro /u remove all predefined macros /I add to include search path /X ignore "standard places" -LANGUAGE- /Zi enable debugging information /Z7 enable old-style debug info /Zp[n] pack structs on n-byte boundary /Za disable extensions /Ze enable extensions (default) /Zl omit default library name in .OBJ /Zg generate function prototypes /Zs syntax check only /vd{0|1|2} disable/enable vtordisp /vm type of pointers to members /Zc:arg1[,arg2] C++ language conformance, where arguments can be: forScope[-] - enforce Standard C++ for scoping rules wchar_t[-] - wchar_t is the native type, not a typedef auto[-] - enforce the new Standard C++ meaning for auto trigraphs[-] - enable trigraphs (off by default) /openmp enable OpenMP 2.0 language 
extensions -MISCELLANEOUS- @ options response file /?, /help print this help message /bigobj generate extended object format /c compile only, no link /errorReport:option Report internal compiler errors to Microsoft none - do not send report prompt - prompt to immediately send report queue - at next admin logon, prompt to send report (default) send - send report automatically /FC use full pathnames in diagnostics /H max external name length /J default char type is unsigned /MP[n] use up to 'n' processes for compilation /nologo suppress copyright message /showIncludes show include file names /Tc compile file as .c /Tp compile file as .cpp /TC compile all files as .c /TP compile all files as .cpp /V set version string /w disable all warnings /wd disable warning n /we treat warning n as an error /wo issue warning n once /w set warning level 1-4 for n /W set warning level (default n=1) /Wall enable all warnings /WL enable one line diagnostics /WX treat warnings as errors /Yc[file] create .PCH file /Yd put debug info in every .OBJ /Yl[sym] inject .PCH ref for debug lib /Yu[file] use .PCH file /Y- disable all PCH options /Zm max memory alloc (% of default) /Wp64 enable 64 bit porting warnings -LINKING- /LD Create .DLL /LDd Create .DLL debug library /LN Create a .netmodule /F set stack size /link [linker options and libraries] /MD link with MSVCRT.LIB /MT link with LIBCMT.LIB /MDd link with MSVCRTD.LIB debug lib /MTd link with LIBCMTD.LIB debug lib sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help sh: [win32fe cl --help output identical to the listing above] sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help sh: [win32fe cl --help output identical to the listing above] Trying C compiler flag -MT sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag -MT Trying C compiler flag -wd4996 sh:
Trying C compiler flag -wd4996
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"

int main() {
;
  return 0;
}
Added C compiler flag -wd4996
Trying C compiler flag -Z7
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"

int main() {
;
  return 0;
}
Added C compiler flag -Z7
Popping language C
Pushing language Cxx
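Before the Cxx tests start, note what win32fe is doing with flags like -Z7: per its --help text earlier in this log, it translates a small set of Unix-style options into cl's native spellings (-g becomes -Z7, -O becomes -O2, -l<library> becomes lib<library>.lib, or <library>.lib under --win-l). A rough sketch of that translation, for illustration only (the real front end handles many more cases than these):

def translate(arg, win_l=False):
    """Map a Unix-style option to cl's native spelling, per the win32fe help."""
    if arg == '-g':
        return '-Z7'                               # "-g is identical to -Z7"
    if arg == '-O':
        return '-O2'                               # "-O is identical to -O2"
    if arg.startswith('-l') and len(arg) > 2:
        name = arg[2:]
        # default: link lib<library>.lib; with --win-l, link <library>.lib
        return (name if win_l else 'lib' + name) + '.lib'
    return arg                                     # -c, -D, -I, -L pass through

# The example from the help text: win32fe cl -g -c foo.c --verbose -Iinclude
print([translate(a) for a in ['-g', '-c', 'foo.c', '-Iinclude']])
# -> ['-Z7', '-c', 'foo.c', '-Iinclude']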
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --version
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --version
sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011
Microsoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64
getCompilerVersion: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 \nMicrosoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64
Trying Cxx compiler flag -MT
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
sh: conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"

int main() {
;
  return 0;
}
Added Cxx compiler flag -MT
Trying Cxx compiler flag -GR
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
sh: conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"

int main() {
;
  return 0;
}
Added Cxx compiler flag -GR
Trying Cxx compiler flag -EHsc
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc
sh: conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"

int main() {
;
  return 0;
}
Added Cxx compiler flag -EHsc
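At this point the excerpt has settled on -MT -wd4996 -Z7 for C and -MT -GR -EHsc for C++. A short summary of what each accepted flag means, with the descriptions taken from the cl option listing earlier in this log (the one exception: that 4996 is MSVC's deprecated-declaration warning is background knowledge, not something this log states):

ACCEPTED = {
    'C':   ['-MT', '-wd4996', '-Z7'],
    'Cxx': ['-MT', '-GR', '-EHsc'],
}
MEANING = {                 # descriptions quoted from the cl help above
    '-MT':     'link with LIBCMT.LIB (the static multithreaded C runtime)',
    '-wd4996': 'disable warning n=4996 (deprecated-declaration warnings)',
    '-Z7':     'enable old-style debug info (what win32fe maps -g to)',
    '-GR':     'enable C++ RTTI',
    '-EHsc':   'enable C++ EH (same as /GX)',
}
for lang, flags in ACCEPTED.items():
    for flag in flags:
        print('%-3s %-8s %s' % (lang, flag, MEANING[flag]))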
========================================================================= cl specific help: win32fe uses -nologo by default for nonverbose output. Use the flag: -logo to disable this feature. -g is identical to -Z7. -O is identical to -O2. ========================================================================= Microsoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64 Copyright (C) Microsoft Corporation. All rights reserved. C/C++ COMPILER OPTIONS -OPTIMIZATION- /O1 minimize space /O2 maximize speed /Ob inline expansion (default n=0) /Od disable optimizations (default) /Og enable global optimization /Oi[-] enable intrinsic functions /Os favor code space /Ot favor code speed /Ox maximum optimizations /favor: select processor to optimize for, one of: blend - a combination of optimizations for several different x64 processors AMD64 - 64-bit AMD processors INTEL64 - Intel(R)64 architecture processors -CODE GENERATION- /GF enable read-only string pooling /Gm[-] enable minimal rebuild /Gy[-] separate functions for linker /GS[-] enable security checks /GR[-] enable C++ RTTI /GX[-] enable C++ EH (same as /EHsc) /EHs enable C++ EH (no SEH exceptions) /EHa enable C++ EH (w/ SEH exceptions) /EHc extern "C" defaults to nothrow /fp: choose floating-point model: except[-] - consider floating-point exceptions when generating code fast - "fast" floating-point model; results are less predictable precise - "precise" floating-point model; results are predictable strict - "strict" floating-point model (implies /fp:except) /Qfast_transcendentals generate inline FP intrinsics even with /fp:except /GL[-] enable link-time code generation /GA optimize for Windows Application /Ge force stack checking for all funcs /Gs[num] control stack checking calls /Gh enable _penter function call /GH enable _pexit function call /GT generate fiber-safe TLS accesses /RTC1 Enable fast checks (/RTCsu) /RTCc Convert to smaller type checks /RTCs Stack Frame runtime checking /RTCu Uninitialized local usage checks /clr[:option] compile for common language runtime, where option is: pure - produce IL-only output file (no native executable code) safe - produce IL-only verifiable output file oldSyntax - accept the Managed Extensions syntax from Visual C++ 2002/2003 initialAppDomain - enable initial AppDomain behavior of Visual C++ 2002 noAssembly - do not produce an assembly /homeparams Force parameters passed in registers to be written to the stack /GZ Enable stack checks (/RTCs) /arch:AVX enable use of Intel(R) Advanced Vector Extensions instructions -OUTPUT FILES- /Fa[file] name assembly listing file /FA[scu] configure assembly listing /Fd[file] name .PDB file /Fe name executable file /Fm[file] name map file /Fo name object file /Fp name precompiled header file /Fr[file] name source browser file /FR[file] name extended .SBR file /Fi[file] name preprocessed file /doc[file] process XML documentation comments and optionally name the .xdc file -PREPROCESSOR- /AI add to assembly search path /FU forced using assembly/module /C don't strip comments /D{=|#} define macro /E preprocess to stdout /EP preprocess to stdout, no #line /P preprocess to file /Fx merge injected code to file /FI name forced include file /U remove predefined macro /u remove all predefined macros /I add to include search path /X ignore "standard places" -LANGUAGE- /Zi enable debugging information /Z7 enable old-style debug info /Zp[n] pack structs on n-byte boundary /Za disable extensions /Ze enable extensions (default) /Zl omit default library name in .OBJ 
/Zg generate function prototypes /Zs syntax check only /vd{0|1|2} disable/enable vtordisp /vm type of pointers to members /Zc:arg1[,arg2] C++ language conformance, where arguments can be: forScope[-] - enforce Standard C++ for scoping rules wchar_t[-] - wchar_t is the native type, not a typedef auto[-] - enforce the new Standard C++ meaning for auto trigraphs[-] - enable trigraphs (off by default) /openmp enable OpenMP 2.0 language extensions -MISCELLANEOUS- @ options response file /?, /help print this help message /bigobj generate extended object format /c compile only, no link /errorReport:option Report internal compiler errors to Microsoft none - do not send report prompt - prompt to immediately send report queue - at next admin logon, prompt to send report (default) send - send report automatically /FC use full pathnames in diagnostics /H max external name length /J default char type is unsigned /MP[n] use up to 'n' processes for compilation /nologo suppress copyright message /showIncludes show include file names /Tc compile file as .c /Tp compile file as .cpp /TC compile all files as .c /TP compile all files as .cpp /V set version string /w disable all warnings /wd disable warning n /we treat warning n as an error /wo issue warning n once /w set warning level 1-4 for n /W set warning level (default n=1) /Wall enable all warnings /WL enable one line diagnostics /WX treat warnings as errors /Yc[file] create .PCH file /Yd put debug info in every .OBJ /Yl[sym] inject .PCH ref for debug lib /Yu[file] use .PCH file /Y- disable all PCH options /Zm max memory alloc (% of default) /Wp64 enable 64 bit porting warnings -LINKING- /LD Create .DLL /LDd Create .DLL debug library /LN Create a .netmodule /F set stack size /link [linker options and libraries] /MD link with MSVCRT.LIB /MT link with LIBCMT.LIB /MDd link with MSVCRTD.LIB debug lib /MTd link with LIBCMTD.LIB debug lib sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 Usage: win32fe -- - must be the first argument to win32fe : {cl,icl,df,f90,ifl,bcc32,lib,tlib} cl: Microsoft 32-bit C/C++ Optimizing Compiler icl: Intel C/C++ Optimizing Compiler df: Compaq Visual Fortran Optimizing Compiler f90: Compaq Visual Fortran90 Optimizing Compiler ifl: Intel Fortran Optimizing Compiler ifort: Intel Fortran Optimizing Compiler nvcc: NVIDIA CUDA Compiler Driver bcc32: Borland C++ for Win32 lib: Microsoft Library Manager tlib: Borland Library Manager : --help: Output this help message and help for --autodetect: Attempt automatic detection of installation --path : specifies an addition to the PATH that is required (ex. the location of a required .dll) --use : specifies the variant of to use --verbose: Echo to stdout the translated commandline and other diagnostic information --version: Output version info for win32fe and --wait_for_debugger: Inserts an infinite wait after creation of and outputs PID so one can manually attach a debugger to the current process. In the debugger, one must set: tool::waitfordebugger = 0 to continue the execution normally. 
Trying Cxx compiler flag -Z7 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -Z7 Trying Cxx compiler flag -Zm200 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -Zm200 Popping language Cxx Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --version Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --version sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 Intel(R) Visual Fortran Intel(R) 64 Compiler XE for applications running on Intel(R) 64, Version 13.1.3.198 Build 20130607 getCompilerVersion: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 \nIntel(R) Visual Fortran Intel(R) 64 Compiler XE for applications running on Intel(R) 64, Version 13.1.3.198 Build 20130607 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --help Executing:
/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --help sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 Usage: win32fe -- - must be the first argument to win32fe : {cl,icl,df,f90,ifl,bcc32,lib,tlib} cl: Microsoft 32-bit C/C++ Optimizing Compiler icl: Intel C/C++ Optimizing Compiler df: Compaq Visual Fortran Optimizing Compiler f90: Compaq Visual Fortran90 Optimizing Compiler ifl: Intel Fortran Optimizing Compiler ifort: Intel Fortran Optimizing Compiler nvcc: NVIDIA CUDA Compiler Driver bcc32: Borland C++ for Win32 lib: Microsoft Library Manager tlib: Borland Library Manager : --help: Output this help message and help for --autodetect: Attempt automatic detection of installation --path : specifies an addition to the PATH that is required (ex. the location of a required .dll) --use : specifies the variant of to use --verbose: Echo to stdout the translated commandline and other diagnostic information --version: Output version info for win32fe and --wait_for_debugger: Inserts an infinite wait after creation of and outputs PID so one can manually attach a debugger to the current process. In the debugger, one must set: tool::waitfordebugger = 0 to continue the execution normally. --win-l: For compilers, define -lfoo to link foo.lib instead of libfoo.lib --woff: Suppress win32fe specific warning messages ================================================================================= For compilers: win32fe will map the following to their native options: -c: Compile Only, generates an object file with .o extension This will invoke the compiler once for each file listed. -l: Link the file lib.lib or if using --win-l also, .lib -o : Output= context dependent -D: Define -I: Add to the include path -L: Add to the link path -g: Generate debug symbols in objects when specified for compilation, and in executables when specified for linking (some compilers specification at both times for full debugging support). -O: Enable compiletime and/or linktime optimizations. Ex: win32fe cl -g -c foo.c --verbose -Iinclude Note: win32fe will automatically find the system library paths and system include paths, relieving the user of the need to invoke a particular shell. ========================================================================= icl specific help: win32fe uses -nologo by default for nonverbose output. Use the flag: -logo to disable this feature. -g is identical to -Z7. -O is identical to -O2. ========================================================================= Intel(R) Visual Fortran Intel(R) 64 Compiler XE for applications running on Intel(R) 64, Version 13.1.3.198 Build 20130607 Copyright (C) 1985-2013 Intel Corporation. All rights reserved. Intel(R) Fortran Compiler Help ============================== Intel(R) Compiler includes compiler options that optimize for instruction sets that are available in both Intel(R) and non-Intel microprocessors, but may perform additional optimizations for Intel microprocessors than for non-Intel microprocessors. In addition, certain compiler options for Intel(R) Compiler are reserved for Intel microprocessors. For a detailed description of these compiler options, including the instructions they implicate, please refer to "Intel(R) Compiler User and Reference Guides > Compiler Options." usage: ifort [options] file1 [file2 ...] 
[/link linker_options] where options represents zero or more compiler options fileN is a Fortran source (.f .for .ftn .f90 .fpp .i .i90), assembly (.asm), object (.obj), static library (.lib), or other linkable file linker_options represents zero or more linker options Notes ----- 1. Many FL32 options are supported; a warning is printed for unsupported options. 2. Intel Fortran compiler options may be placed in your ifort.cfg file. Some options listed are only available on a specific system i32 indicates the feature is available on systems based on IA-32 architecture i64em indicates the feature is available on systems using Intel(R) 64 architecture Compiler Option List -------------------- Optimization ------------ /O1 optimize for maximum speed, but disable some optimizations which increase code size for a small speed benefit /O2 optimize for maximum speed (DEFAULT) /O3 optimize for maximum speed and enable more aggressive optimizations that may not improve performance on some programs /Ox enable maximum optimizations (same as /O2) /Os enable speed optimizations, but disable some optimizations which increase code size for small speed benefit (overrides /Ot) /Ot enable speed optimizations (overrides /Os) /Od disable optimizations /Oy[-] enable/disable using EBP as a general purpose register (no frame pointer) (i32 only) /fast enable /QxHOST /O3 /Qipo /Qprec-div- options set by /fast cannot be overridden with the exception of /QxHOST, list options separately to change behavior /Oa[-] assume no aliasing in program /Ow[-] assume no aliasing within functions, but assume aliasing across calls Code Generation --------------- /Qx generate specialized code to run exclusively on processors indicated by as described below SSE2 May generate Intel(R) SSE2 and SSE instructions for Intel processors. Optimizes for the Intel NetBurst(R) microarchitecture. SSE3 May generate Intel(R) SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the enhanced Pentium(R) M processor microarchitecture and Intel NetBurst(R) microarchitecture. SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the Intel(R) Core(TM) microarchitecture. SSE4.1 May generate Intel(R) SSE4 Vectorizing Compiler and Media Accelerator instructions for Intel processors. May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions and it may optimize for Intel(R) 45nm Hi-k next generation Intel Core(TM) microarchitecture. SSE4.2 May generate Intel(R) SSE4 Efficient Accelerated String and Text Processing instructions supported by Intel(R) Core(TM) i7 processors. May generate Intel(R) SSE4 Vectorizing Compiler and Media Accelerator, Intel(R) SSSE3, SSE3, SSE2, and SSE instructions and it may optimize for the Intel(R) Core(TM) processor family. AVX May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX2 May generate Intel(R) Advanced Vector Extensions 2 (Intel(R) AVX2), Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. Optimizes for a future Intel processor. CORE-AVX-I May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), including instructions in Intel(R) Core 2(TM) processors in process technology smaller than 32nm, Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. Optimizes for a future Intel processor. 
SSSE3_ATOM May generate MOVBE instructions for Intel processors, depending on the setting of option /Qinstruction. May also generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the Intel(R) Atom(TM) processor and Intel(R) Centrino(R) Atom(TM) Processor Technology. /QxHost generate instructions for the highest instruction set and processor available on the compilation host machine /Qax[,,...] generate code specialized for processors specified by while also generating generic IA-32 instructions. includes one or more of the following: SSE2 May generate Intel(R) SSE2 and SSE instructions for Intel processors. SSE3 May generate Intel(R) SSE3, SSE2, and SSE instructions for Intel processors. SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. SSE4.1 May generate Intel(R) SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. SSE4.2 May generate Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. AVX May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX2 May generate Intel(R) Advanced Vector Extensions 2 (Intel(R) AVX2), Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX-I May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), including instructions in Intel(R) Core 2(TM) processors in process technology smaller than 32nm, Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. /arch: generate specialized code to optimize for processors indicated by as described below SSE2 May generate Intel(R) SSE2 and SSE instructions SSE3 May generate Intel(R) SSE3, SSE2 and SSE instructions SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2 and SSE instructions SSE4.1 May generate Intel(R) SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions SSE4.2 May generate Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions AVX May generate Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions /Qinstruction: Refine instruction set output for the selected target processor [no]movbe - Do/do not generate MOVBE instructions with SSSE3_ATOM (requires /QxSSSE3_ATOM) /Qextend-arguments:[32|64] By default, unprototyped scalar integer arguments are passed in 32-bits (sign-extended if necessary). On Intel(R) 64, unprototyped scalar integer arguments may be extended to 64-bits. Interprocedural Optimization (IPO) ---------------------------------- /Qip[-] enable(DEFAULT)/disable single-file IP optimization within files /Qipo[n] enable multi-file IP optimization between files /Qipo-c generate a multi-file object file (ipo_out.obj) /Qipo-S generate a multi-file assembly file (ipo_out.asm) /Qip-no-inlining disable full and partial inlining /Qip-no-pinlining disable partial inlining /Qipo-separate create one object file for every source file (overrides /Qipo[n]) /Qipo-jobs specify the number of jobs to be executed simultaneously during the IPO link phase Advanced Optimizations ---------------------- /Qunroll[n] set maximum number of times to unroll loops. Omit n to use default heuristics. 
Use n=0 to disable the loop unroller /Qunroll-aggressive[-] enables more aggressive unrolling heuristics /Qscalar-rep[-] enable(DEFAULT)/disable scalar replacement (requires /O3) /Qpad[-] enable/disable(DEFAULT) changing variable and array memory layout /Qsafe-cray-ptr Cray pointers do not alias with other variables /Qansi-alias[-] enable/disable(DEFAULT) use of ANSI aliasing rules optimizations; user asserts that the program adheres to these rules /Qcomplex-limited-range[-] enable/disable(DEFAULT) the use of the basic algebraic expansions of some complex arithmetic operations. This can allow for some performance improvement in programs which use a lot of complex arithmetic at the loss of some exponent range. /reentrancy: specify whether the threaded, reentrant run-time support should be used Keywords: none (same as /noreentrancy), threaded, async /noreentrancy do not use threaded, reentrant run-time support /heap-arrays[:n] temporary arrays of minimum size n (in kilobytes) are allocated in heap memory rather than on the stack. If n is not specified, all temporary arrays are allocated in heap memory. /heap-arrays- temporary arrays are allocated on the stack (DEFAULT) /Qopt-multi-version-aggressive[-] enables more aggressive multi-versioning to check for pointer aliasing and scalar replacement /Qopt-ra-region-strategy[:] select the method that the register allocator uses to partition each routine into regions routine - one region per routine block - one region per block trace - one region per trace loop - one region per loop default - compiler selects best option /Qvec[-] enables(DEFAULT)/disables vectorization /Qvec-guard-write[-] enables cache/bandwidth optimization for stores under conditionals within vector loops /Qvec-threshold[n] sets a threshold for the vectorization of loops based on the probability of profitable execution of the vectorized loop in parallel /Qopt-malloc-options:{0|1|2|3|4} specify malloc configuration parameters. Specifying a non-zero value will cause alternate configuration parameters to be set for how malloc allocates and frees memory /Qopt-jump-tables: control the generation of jump tables default - let the compiler decide when a jump table, a series of if-then-else constructs or a combination is generated large - generate jump tables up to a certain pre-defined size (64K entries) - generate jump tables up to in size use /Qopt-jump-tables- to lower switch statements as chains of if-then-else constructs /Qopt-block-factor: specify blocking factor for loop blocking /Qopt-streaming-stores: specifies whether streaming stores are generated always - enables generation of streaming stores under the assumption that the application is memory bound auto - compiler decides when streaming stores are used (DEFAULT) never - disables generation of streaming stores /Qmkl[:] link to the Intel(R) Math Kernel Library (Intel(R) MKL) and bring in the associated headers parallel - link using the threaded Intel(R) MKL libraries. This is the default when /Qmkl is specified sequential - link using the non-threaded Intel(R) MKL libraries cluster - link using the Intel(R) MKL Cluster libraries plus the sequential Intel(R) MKL libraries /Qimsl link to the International Mathematics and Statistics Library* (IMSL* library) /Qopt-subscript-in-range[-] assumes no overflows in the intermediate computation of the subscripts /Qcoarray[:shared|distributed] enable/disable(DEFAULT) coarray syntax for data parallel programming. 
The default is shared-memory; distributed memory is only valid with the Intel(R) Cluster Toolkit /Qcoarray-num-images:n set default number of coarray images /Qopt-matmul[-] replace matrix multiplication with calls to intrinsics and threading libraries for improved performance (DEFAULT at /O3 /Qparallel) /Qsimd[-] enables(DEFAULT)/disables vectorization using SIMD directive /Qguide-opts: tells the compiler to analyze certain code and generate recommendations that may improve optimizations /Qguide-file[:] causes the results of guide to be output to a file /Qguide-file-append[:] causes the results of guide to be appended to a file /Qguide[:] lets you set a level (1 - 4) of guidance for auto-vectorization, auto-parallelization, and data transformation (DEFAULT is 4 when the option is specified) /Qguide-data-trans[:] lets you set a level (1 - 4) of guidance for data transformation (DEFAULT is 4 when the option is specified) /Qguide-par[:] lets you set a level (1 - 4) of guidance for auto-parallelization (DEFAULT is 4 when the option is specified) /Qguide-vec[:] lets you set a level (1 - 4) of guidance for auto-vectorization (DEFAULT is 4 when the option is specified) /Qguide-profile:<[file|dir]>[,[file|dir],...] specify a loop profiler data file (or set of files in a directory) when using the /Qguide option /Qopt-mem-layout-trans[:] controls the level of memory layout transformations performed by the compiler 0 - disable memory layout transformations (same as /Qopt-mem-layout-trans-) 1 - enable basic memory layout transformations 2 - enable more memory layout transformations (DEFAULT when the option is specified) 3 - enable aggressive memory layout transformations /Qopt-prefetch[:n] enable levels of prefetch insertion, where 0 disables. n may be 0 through 4 inclusive. Default is 2. /Qopt-prefetch- disable(DEFAULT) prefetch insertion. Equivalent to /Qopt-prefetch:0 Profile Guided Optimization (PGO) --------------------------------- /Qprof-dir specify directory for profiling output files (*.dyn and *.dpi) /Qprof-src-root specify project root directory for application source files to enable relative path resolution during profile feedback on sources below that directory /Qprof-src-root-cwd specify the current directory as the project root directory for application source files to enable relative path resolution during profile feedback on sources below that directory /Qprof-src-dir[-] specify whether directory names of sources should be considered when looking up profile records within the .dpi file /Qprof-file specify file name for profiling summary file /Qprof-data-order[-] enable/disable(DEFAULT) static data ordering with profiling /Qprof-func-order[-] enable/disable(DEFAULT) function ordering with profiling /Qprof-gen[:keyword] instrument program for profiling. 
Optional keyword may be srcpos or globdata /Qprof-gen- disable profiling instrumentation /Qprof-use[:] enable use of profiling information during optimization weighted - invokes profmerge with -weighted option to scale data based on run durations [no]merge - enable(default)/disable the invocation of the profmerge tool /Qprof-use- disable use of profiling information during optimization /Qcov-gen instrument program for profiling /Qcov-dir specify directory for profiling output files (*.dyn and *.dpi) /Qcov-file specify file name for profiling summary file /Qinstrument-functions[-] determine whether function entry and exit points are instrumented /Qprof-hotness-threshold: set the hotness threshold for function grouping and function ordering val indicates percentage of functions to be placed in hot region. This option requires /Qprof-use and /Qprof-func-order /Qprof-value-profiling:[,,...] limit value profiling none - inhibit all types of value profiling nodivide - inhibit value profiling of non-compile time constants used in division or remainder operations noindcall - inhibit value profiling of function addresses at indirect call sites /Qprofile-functions enable instrumentation in generated code for collecting function execution time profiles /Qprofile-loops: enable instrumentation in generated code for collecting loop execution time profiles inner - instrument inner loops outer - instrument outer loops all - instrument all loops /Qprofile-loops-report: Control the level of instrumentation inserted for reporting loop execution profiles 1 - report loop times 2 - report loop times and iteration counts Optimization Reports -------------------- /Qvec-report[n] control amount of vectorizer diagnostic information n=0 no diagnostic information n=1 indicate vectorized loops (DEFAULT when enabled) n=2 indicate vectorized/non-vectorized loops n=3 indicate vectorized/non-vectorized loops and prohibiting data dependence information n=4 indicate non-vectorized loops n=5 indicate non-vectorized loops and prohibiting data dependence information n=6 indicate vectorized/non-vectorized loops with greater details and prohibiting data dependence information n=7 indicate vector code quality message ids and data values for vectorized loops /Qopt-report[:n] generate an optimization report to stderr 0 disable optimization report output 1 minimum report output 2 medium output (DEFAULT when enabled) 3 maximum report output /Qopt-report-file: specify the filename for the generated report /Qopt-report-phase: specify the phase that reports are generated against /Qopt-report-routine: reports on routines containing the given name /Qopt-report-help display the optimization phases available for reporting /Qtcheck[:mode] enable analysis of threaded applications (requires Intel(R) Thread Checker; cannot be used with compiler alone) tci - instruments a program to perform a thread-count-independent analysis tcd - instruments a program to perform a thread-count-dependent analysis (DEFAULT when mode is not used) api - instruments a program at the api-imports level /Qtcollect[:] inserts instrumentation probes calling the Intel(R) Trace Collector API. The library .lib is linked in the default being VT.lib (requires Intel(R) Trace Collector) /Qtcollect-filter:file Enable or disable the instrumentation of specified functions. 
(requires Intel(R) Trace Collector) OpenMP* and Parallel Processing ------------------------------ /Qopenmp enable the compiler to generate multi-threaded code based on the OpenMP* directives (same as /openmp) /Qopenmp-stubs enables the user to compile OpenMP programs in sequential mode. The OpenMP directives are ignored and a stub OpenMP library is linked (sequential) /Qopenmp-report{0|1|2} control the OpenMP parallelizer diagnostic level /Qopenmp-lib: choose which OpenMP library version to link with compat - use the Microsoft compatible OpenMP run-time libraries (DEFAULT) /Qopenmp-threadprivate: choose which threadprivate implementation to use compat - use the Microsoft compatible thread local storage legacy - use the Intel compatible implementation (DEFAULT) /Qparallel enable the auto-parallelizer to generate multi-threaded code for loops that can be safely executed in parallel /Qpar-report{0|1|2|3} control the auto-parallelizer diagnostic level /Qpar-threshold[n] set threshold for the auto-parallelization of loops where n is an integer from 0 to 100 /Qpar-runtime-control[n] Control parallelizer to generate runtime check code for effective automatic parallelization. n=0 no runtime check based auto-parallelization n=1 generate runtime check code under conservative mode (DEFAULT when enabled) n=2 generate runtime check code under heuristic mode n=3 generate runtime check code under aggressive mode /Qpar-schedule-static[:n] Specifies a scheduling algorithm for DO loop iteration. Divides iterations into contiguous pieces. Size n if specified, equal sized pieces if not. /Qpar-schedule-static_balanced[:n] Divides iterations into even-sized chunks. Size n if specified, equal sized pieces if not. /Qpar-schedule-static-steal[:n] Divides iterations into even-sized chunks, but allows threads to steal parts of chunks from neighboring threads /Qpar-schedule-dynamic[:n] Specifies a scheduling algorithm for DO loop iteration. Assigns iterations to threads in chunks dynamically. Chunk size is n iterations if specified, otherwise 1. /Qpar-schedule-guided[:n] Specifies a scheduling algorithm for DO loop iteration. Indicates a minimum number of iterations. If specified, n is the minimum number, otherwise 1. /Qpar-schedule-guided-analytical[:n] Divides iterations by using exponential distribution or dynamic distributions. /Qpar-schedule-runtime Specifies a scheduling algorithm for DO loop iteration. Defers the scheduling decision until runtime. /Qpar-schedule-auto Lets the compiler or run-time system determine the scheduling algorithm. 
/Qpar-adjust-stack perform fiber-based main thread stack adjustment /Qpar-affinity=[,...][,][,] tune application performance by setting different thread affinity /Qpar-num-threads= tune application performance by setting different number of threads /Qparallel-source-info[:n] enable(DEFAULT)/disable the emission of source location information for parallel code generation with OpenMP and auto-parallelization 0 - disable (same as /Qparallel-source-info-) 1 - emit routine name and line information (DEFAULT) 2 - emit path, file, routine name and line information /Qpar same as /Qparallel Floating Point -------------- /fp: enable floating point model variation except[-] - enable/disable floating point semantics fast[=1|2] - enables more aggressive floating point optimizations precise - allows value-safe optimizations source - enables intermediates in source precision strict - enables /fp:precise /fp:except, disables contractions and enables pragma stdc fenv_access /Qfp-speculation: enable floating point speculations with the following conditions: fast - speculate floating point operations (DEFAULT) safe - speculate only when safe strict - same as off off - disables speculation of floating-point operations /Qpc32 set internal FPU precision to 24 bit significand /Qprec improve floating-point precision (speed impact less than /Op) /Qprec-sqrt[-] determine if certain square root optimizations are enabled /Qprec-div[-] improve precision of FP divides (some speed impact) /Qfast-transcendentals[-] generate a faster version of the transcendental functions /Qfp-port[-] round fp results at assignments and casts (some speed impact) /Qfp-stack-check enable fp stack checking after every function/procedure call /Qrcd rounding mode to enable fast float-to-int conversions /rounding-mode:chopped set internal FPU rounding control to truncate /Qftz[-] enable/disable flush denormal results to zero /fpe:{0|1|3} specifies program-wide behavior on floating point exceptions /fpe-all:{0|1|3} specifies floating point exception behavior on all functions and subroutines. 
Also sets /assume:ieee_fpe_flags /[no]fltconsistency specify that improved floating-point consistency should be used /Qfma[-] enable/disable the combining of floating point multiplies and add/subtract operations /[no]recursive compile all procedures for possible recursive execution Inlining -------- /Ob control inline expansion: n=0 disable inlining (same as /inline:none) n=1 inline functions declared with ATTRIBUTES INLINE or FORCEINLINE n=2 inline any function, at the compiler's discretion /Qinline-min-size: set size limit for inlining small routines /Qinline-min-size- no size limit for inlining small routines /Qinline-max-size: set size limit for inlining large routines /Qinline-max-size- no size limit for inlining large routines /Qinline-max-total-size: maximum increase in size for inline function expansion /Qinline-max-total-size- no size limit for inline function expansion /Qinline-max-per-routine: maximum number of inline instances in any function /Qinline-max-per-routine- no maximum number of inline instances in any function /Qinline-max-per-compile: maximum number of inline instances in the current compilation /Qinline-max-per-compile- no maximum number of inline instances in the current compilation /Qinline-factor: set inlining upper limits by n percentage /Qinline-factor- do not set set inlining upper limits /Qinline-forceinline treat inline routines as forceinline /Qinline-dllimport allow(DEFAULT)/disallow functions declared DEC$ ATTRIBUTES DLLIMPORT to be inlined /Qinline-calloc directs the compiler to inline calloc() calls as malloc()/memset() /inline[:keyword] Specifies the level of inline function expansion keywords: all (same as /Ob2 /Ot), size (same as /Ob2 /Os) speed (same as /Ob2 /Ot), none or manual (same as /Ob0) Output, Debug, PCH ------------------ /c compile to object (.obj) only, do not link /nolink, /compile-only same as /c /S compile to assembly (.asm) only, do not link /FAs produce assembly file with optional source annotations /FAc produce assembly file with optional code annotations /FA produce assembly file /Fa[file] name assembly file (or directory for multiple files; i.e. /FaMYDIR\) /Fo[file] name object file (or directory for multiple files; i.e. /FoMYDIR\) /Fe[file] name executable file or directory /object: specify the name of the object file, or the directory to which object file(s) should be written. (e.g. 
/object:MYOBJ or /object:MYDIR\) /exe: specifies the name to be used for the built program (.exe) or dynamic-link (.dll) library /map: specify that a link map file should be generated /list: specify that a listing file should be generated /list-line-len:# overrides the default line length (80) in a listing file /list-page-len:# overrides the default page length (66) in a listing file /show: controls the contents of the listing file keywords: all, none, [no]include, [no]map, [no]options /Zi, /ZI, /Z7 produce symbolic debug information in object file (implies /Od when another optimization option is not explicitly set) /debug[:keyword] enable debug information and control output of enhanced debug information keywords: all, full, minimal, none, [no]inline-debug-info /nodebug do not enable debug information /debug-parameters[:keyword] control output of debug information for PARAMETERS keywords: all, used, none (same as /nodebug-parameters) /nodebug-parameters do not output debug information for PARAMETERS /Qd-lines, /[no]d-lines compile debug statements (indicated by D in column 1) /pdbfile[:filename] specify that debug related information should be generated to a program database file /nopdbfile do not generate debug related information to a program database file /Qtrapuv trap uninitialized variables /RTCu report use of variable that was not initialized /Qmap-opts enable option mapping tool Preprocessor ------------ /D[{=|#}] define macro /define:symbol[=] same as /D /nodefines specifies that any /D macros go to the preprocessor only, and not to the compiler /U remove predefined macro /undefine: remove predefined macro (same as /U) /allow:nofpp-comments If a Fortran end-of-line comment is seen within a #define, treat it as part of the definition. Default is allow:fpp-comments /E preprocess to stdout /EP preprocess to stdout, omitting #line directives /EP /P preprocess to file, omitting #line directives /P preprocess to file /preprocess-only same as /P /[no]keep keep/remove preprocessed file generated by preprocessor as input to compiler stage. Not affected by /Qsave-temps. Default is /nokeep /fpp[n], /[no]fpp run Fortran preprocessor on source files prior to compilation n=0 disable running the preprocessor, equivalent to nofpp n=1,2,3 run preprocessor /module:path specify path where mod files should be placed and first location to look for mod files /u remove all predefined macros /I add directory to include file search path /[no]include: same as /I /X remove standard directories from include file search path /[no]gen-dep[:filename] generate dependency information. If no filename is specified, output to stdout /gen-depformat:keyword generate dependency information in the specified format. One of: make, nmake Component Control ----------------- /Qoption,, pass options to tool specified by /Qlocation,, set as the location of tool specified by Language -------- /[no]altparam specify if alternate form of parameter constant declarations (without parenthesis) is recognized. 
Default is to recognize /assume: specify assumptions made by the optimizer and code generator keywords: none, [no]byterecl, [no]buffered_io, [no]bscc (nobscc same as /nbs), [no]cc_omp, [no]minus0, [no]dummy_aliases (same as /Qcommon-args), [no]ieee_fpe_flags, [no]fpe_summary, [no]old_boz, [no]old_complex_align, [no]old_logical_ldio, [no]old_ldout_format, [no]old_maxminloc, [no]old_unit_star, [no]old_xor, [no]protect_constants, [no]protect_parens, [no]realloc_lhs, [no]2underscore, [no]underscore (same as /us), [no]std_intent_in, [no]std_mod_proc_name, [no]source_include, [no]split_common,[no]writeable_strings /ccdefault: specify default carriage control for units 6 and * keywords: default, fortran, list or none /[no]check: check run-time conditions. Default is /nocheck keywords: all (same as /4Yb, /C), none (same as /nocheck, /4Nb), [no]arg_temp_created, [no]bounds (same as /CB), [no]format, [no]output_conversion, [no]pointer (same as /CA), [no]uninit (same as /CU), [no]stack /Qcommon-args assume "by reference" subprogram arguments may alias one another. Same as /assume:dummy_aliases /[no]extend-source[:] specify rightmost column for fixed form sources keywords: 72 (same as /noextend-source and /4L72), 80 (same as /4L80), 132 (same as /4L132. Default if you specify /extend-source without a keyword.) /fixed specify source files are in fixed format. Same as /FI and /4Nf /nofixed indicates free format /free specify source files are in free format. Same as /FR and /4Yf /nofree indicates fixed format /names: specify how source code identifiers and external names are interpreted. keywords: as_is, lowercase, uppercase /[no]pad-source, /Qpad-source[-] make compiler acknowledge blanks at the end of a line /stand[:] specifies level of conformance with ANSI standard to check for. If keyword is not specified, level of conformance is f03 keywords: f90 (same as /4Ys), f95, f03, none (same as /nostand) /standard-semantics sets assume keywords to conform to the semantics of the f03 standard. May result in performance loss. assume keywords set by /standard-semantics: byterecl, fpe_summary, minus0, noold_maxminloc, noold_unit_star, noold_xor, protect_parens, realloc_lhs, std_intent_in, std_mod_proc_name, noold_ldout_format /syntax-only, /Zs perform syntax and semantic checking only (no object file produced) Compiler Diagnostics -------------------- /w disable all warnings /W disable warnings (n = 0) or show warnings (n = 1 DEFAULT, same as /warn:general) /warn: specifies the level of warning messages issued keywords: all, none (same as /nowarn) [no]alignments, [no]declarations, [no]errors, [no]general, [no]ignore_loc, [no]interfaces, [no]stderrors, [no]truncated_source, [no]uncalled, [no]unused, [no]usage /nowarn suppress all warning messages /WB turn a compile-time bounds check into a warning /[no]traceback specify whether the compiler generates PC correlation data used to display a symbolic traceback rather than a hexadecimal traceback at runtime failure /[no]gen-interfaces [[no]source] generate interface blocks for all routines in the file. Can be checked using -warn interfaces nosource indicates temporary source files should not be saved /error-limit: specify the maximum number of error-level or fatal-level compiler errors allowed /noerror-limit set no maximum number on error-level or fatal-level error messages /Qdiag-enable:[,,...] enable the specified diagnostics or diagnostic groups /Qdiag-disable:[,,...] 
disable the specified diagnostics or diagnostic groups where may be individual diagnostic numbers or group names. where group names include: sc[n] - perform source code analysis: n=1 for critical errors, n=2 for all errors and n=3 for all errors and warnings sc- {full|concise|precise} - perform static analysis and determine the analysis mode. Full mode - attempts to find all program weaknesses, even at the expense of more false positives. Concise mode - attempts to reduce false positives somewhat more than reducing false negatives. Precise mode - attempts to avoid all false positives Default: full if /Qdiag-enable:sc{[1|2|3]} is present; otherwise None (static analysis diagnostics are disabled). sc-include - perform source code analysis on include files sc-single-file - This option tells static analysis to process each file individually. Default: OFF sc-enums - This option tells static analysis to treat enumeration variables as known values equal to any one of the associated enumeration literals. Default: OFF sc-parallel[n] - perform analysis of parallelization in source code: n=1 for critical errors, n=2 for errors, n=3 for all errors and warnings warn - diagnostic messages that have "warning" severity level. error - diagnostic messages that have "error" severity level. remark - diagnostic messages that are remarks or comments. vec - diagnostic messages issued by the vectorizer. par - diagnostic messages issued by the auto-parallelizer openmp - diagnostic messages issued by the OpenMP* parallelizer. cpu-dispatch Specifies the CPU dispatch remarks. /Qdiag-error:[,,...] output the specified diagnostics or diagnostic groups as errors /Qdiag-warning:[,,...] output the specified diagnostics or diagnostic groups as warnings /Qdiag-remark:[,,...] output the the specified diagnostics or diagnostic groups as remarks /Qdiag-dump display the currently enabled diagnostic messages to stdout or to a specified diagnostic output file. /Qdiag-sc-dir: directory where diagnostics from static analysis are created, rather than current working directory. /Qdiag-file[:] where diagnostics are emitted to. Not specifying this causes messages to be output to stderr /Qdiag-file-append[:] where diagnostics are emitted to. When already exists, output is appended to the file /Qdiag-id-numbers[-] enable(DEFAULT)/disable the diagnostic specifiers to be output in numeric form /Qdiag-error-limit: specify the maximum number of errors emitted Miscellaneous ------------- /[no]logo display compiler version information. /nologo disables the output /Qsox[:[,keyword]] enable saving of compiler options, version and additional information in the executable. Use /Qsox- to disable(DEFAULT) profile - include profiling data inline - include inlining information /bintext: place the string specified into the object file and executable /Qsave-temps store the intermediate files in current directory and name them based on the source file. 
Only saves files that are generated by default /what display detailed compiler version information /watch: tells the driver to output processing information keywords: all, none (same as /nowatch), [no]source, [no]cmd [no]mic-cmd /nowatch suppress processing information output (DEFAULT) /Tf compile file as Fortran source /extfor: specify extension of file to be recognized as a Fortran file /extfpp: specify extension of file to be recognized as a preprocessor file /libdir[:keyword] control the library names that should be emitted into the object file keywords: all, none (same as /nolibdir), [no]automatic, [no]user /nolibdir no library names should be emitted into the object file /MP[] create multiple processes that can be used to compile large numbers of source files at the same time /bigobj generate objects with increased address capacity Data ---- /4I{2|4|8} set default KIND of integer and logical variables to 2, 4, or 8 /integer-size: specifies the default size of integer and logical variables size: 16, 32, 64 /4R{8|16} set default size of real to 8 or 16 bytes /real-size: specify the size of REAL and COMPLEX declarations, constants, functions, and intrinsics size: 32, 64, 128 /Qautodouble same as /real-size:64 or /4R8 /double-size: defines the size of DOUBLE PRECISION and DOUBLE COMPLEX declarations, constants, functions, and intrinsics size: 64, 128 /[no]fpconstant extends the precision of single precision constants assigned to double precision variables to double precision /[no]intconstant use Fortran 77 semantics, rather than Fortran 90/95, to determine kind of integer constants /auto make all local variables AUTOMATIC /Qauto-scalar make scalar local variables AUTOMATIC (DEFAULT) /Qsave save all variables (static allocation) (same as /noauto, opposite of /auto) /Qzero[-] enable/disable(DEFAULT) implicit initialization to zero of local scalar variables of intrinsic type INTEGER, REAL, COMPLEX, or LOGICAL that are saved and not initialized /Qdyncom make given common blocks dynamically-allocated /Zp[n] specify alignment constraint for structures (n=1,2,4,8,16 /Zp16 DEFAULT) /[no]align analyze and reorder memory layout for variables and arrays /align: specify how data items are aligned keywords: all (same as /align), none (same as /noalign), [no]commons, [no]dcommons, [no]qcommons, [no]zcommons, rec1byte, rec2byte, rec4byte, rec8byte, rec16byte, rec32byte, array8byte, array16byte, array32byte, array64byte, array128byte, array256byte, [no]records, [no]sequence /GS enable overflow security checks. /GS- disables (DEFAULT) /Qpatchable-addresses generate code such that references to statically assigned addresses can be patched with arbitrary 64-bit addresses. /Qfnalign[-] align the start of functions to an optimal machine-dependent value. 
When disabled (DEFAULT) align on a 2-byte boundary /Qfnalign:[2|16] align the start of functions on a 2 (DEFAULT) or 16 byte boundary /Qglobal-hoist[-] enable(DEFAULT)/disable external globals are load safe /Qkeep-static-consts[-] enable/disable(DEFAULT) emission of static const variables even when not referenced /Qnobss-init disable placement of zero-initialized variables in BSS (use DATA) /Qzero-initialized-in-bss[-] put explicitly zero initialized variables into the DATA section instead of the BSS section /convert: specify the format of unformatted files containing numeric data keywords: big_endian, cray, ibm, little_endian, native, vaxd, vaxg /Qimf-absolute-error:value[:funclist] define the maximum allowable absolute error for math library function results value - a positive, floating-point number conforming to the format [digits][.digits][{e|E}[sign]digits] funclist - optional comma separated list of one or more math library functions to which the attribute should be applied /Qimf-accuracy-bits:bits[:funclist] define the relative error, measured by the number of correct bits, for math library function results bits - a positive, floating-point number funclist - optional comma separated list of one or more math library functions to which the attribute should be applied /Qimf-arch-consistency:value[:funclist] ensures that the math library functions produce consistent results across different implementations of the same architecture value - true or false funclist - optional comma separated list of one or more math library functions to which the attribute should be applied /Qimf-max-error:ulps[:funclist] defines the maximum allowable relative error, measured in ulps, for math library function results ulps - a positive, floating-point number conforming to the format [digits][.digits][{e|E}[sign]digits] funclist - optional comma separated list of one or more math library functions to which the attribute should be applied /Qimf-precision:value[:funclist] defines the accuracy (precision) for math library functions value - defined as one of the following values high - equivalent to max-error = 0.6 medium - equivalent to max-error = 4 (DEFAULT) low - equivalent to accuracy-bits = 11 (single precision); accuracy-bits = 26 (double precision) funclist - optional comma separated list of one or more math library functions to which the attribute should be applied Compatibility ------------- /fpscomp[:] specify the level of compatibility to adhere to with Fortran PowerStation keywords: all, none (same as /nofpscomp), [no]filesfromcmd, [no]general, [no]ioformat, [no]ldio_spacing, [no]libs, [no]logicals /nofpscomp no specific level of compatibility with Fortran PowerStation /f66 allow extensions that enhance FORTRAN-66 compatibility /f77rtl specify that the Fortran 77 specific run-time support should be used /nof77rtl disables /vms enable VMS I/O statement extensions /Qvc enable compatibility with a specific Microsoft* Visual Studio version 9 - Microsoft* Visual Studio 2008 compatibility 10 - Microsoft* Visual Studio 2010 compatibility 11 - Microsoft* Visual Studio 2012 compatibility Linking/Linker -------------- /link specify that all options following '/link' are for the linker /extlnk: specify extension of file to be passed directly to linker /F set the stack reserve amount specified to the linker /dbglibs use the debug version of runtime libraries, when appropriate /libs: specifies which type of run-time library to link to. 
keywords: static, dll, qwin, qwins /LD[d] produce a DLL instead of an EXE ('d' = debug version) /dll same as /LD /MD[d] use dynamically-loaded, multithread C runtime /MDs[d] use dynamically-loaded, singlethread Fortran runtime, and multithread C runtime /MT[d] use statically-linked, multithread C runtime (DEFAULT with Microsoft Visual Studio 2005 and later) /ML[d] use statically-linked, single thread C runtime (only valid in Microsoft Visual Studio 2003 environment) /MG, /winapp use Windows API runtime libraries /Zl omit library names from object file /threads specify that multi-threaded libraries should be linked against /nothreads disables multi-threaded libraries Deprecated Options ------------------ /Qinline-debug-info use /debug:inline-debug-info /Gf use /GF /ML[d] upgrade to /MT[d] /Quse-asm No replacement /Qprof-genx use /Qprof-gen:srcpos /Qdiag-enable:sv[] use /Qdiag-enable:sc[] /Qdiag-enable:sv-include use /Qdiag-enable:sc-include /Qdiag-sv use /Qdiag-enable:sc[] /Qdiag-sv-error use /Qdiag-disable:warning /Qdiag-sv-include use /Qdiag-enable:sc-include /Qdiag-sv-level No replacement /Qdiag-sv-sup use /Qdiag-disable:[,,...] /Qtprofile No replacement /arch:SSE use /arch:IA32 /QxK upgrade to /arch:SSE2 /QaxK upgrade to /arch:SSE2 /QxW use /arch:SSE2 /QaxW use /arch:SSE2 /QxN use /QxSSE2 /QaxN use /QaxSSE2 /QxP use /QxSSE3 /QaxP use /QaxSSE3 /QxT use /QxSSSE3 /QaxT use /QaxSSSE3 /QxS use /QxSSE4.1 /QaxS use /QaxSSE4.1 /QxH use /QxSSE4.2 /QaxH use /QaxSSE4.2 /QxO use /arch:SSE3 /Qvc7.1 No replacement /QIfist use /Qrcd /QxSSE3_ATOM use /QxSSSE3_ATOM /Qrct No replacement /Op use /fltconsistency /debug:partial No replacement /tune: use /Qx /architecture: use /arch: /1, /Qonetrip use /f66 /Fm use /map /Qcpp, /Qfpp use /fpp /Qdps use /altparam /Qextend-source use /extend-source /Qlowercase use /names:lowercase /Quppercase use /names:uppercase /Qvms use /vms /asmattr:keyword use /FA[c|s|cs] /noasmattr,/asmattr:none use /FA /asmfile use /Fa /automatic use /auto /cm use /warn:nousage /optimize:0 use /Od /optimize:1,2 use /O1 /optimize:3,4 use /O2 /optimize:5 use /O3 /source use /Tf /unix No replacement /us use /assume:underscore /unroll use /Qunroll /w90, /w95 No replacement /Zd use /debug:minimal /help, /? [category] print full or category help message Valid categories include advanced - Advanced Optimizations codegen - Code Generation compatibility - Compatibility component - Component Control data - Data deprecated - Deprecated Options diagnostics - Compiler Diagnostics float - Floating Point help - Help inline - Inlining ipo - Interprocedural Optimization (IPO) language - Language link - Linking/Linker misc - Miscellaneous opt - Optimization output - Output pgo - Profile Guided Optimization (PGO) preproc - Preprocessor reports - Optimization Reports openmp - OpenMP and Parallel Processing Copyright (C) 1985-2013, Intel Corporation. All rights reserved. * Other names and brands may be claimed as the property of others. 
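The Data section of the listing above (/4R8, /real-size:, /Qautodouble, /fpconstant) is the part that most often matters for numeric codes, since it changes the default kind of REAL variables and constants. As a rough illustration only, assuming the behavior described in the help text (the file name kinds.f90 and the compile lines are hypothetical, not taken from the log):

! Sketch of the Data options' effect; compile lines are assumptions
! based on the help text above:
!   win32fe ifort /c kinds.f90                 default REAL is 4 bytes
!   win32fe ifort /c /real-size:64 kinds.f90   default REAL widened to 8
!   win32fe ifort /c /fpconstant kinds.f90     the literal 0.1 below is
!                                              extended to double precision
program kinds
  implicit none
  real :: x
  double precision :: d
  x = 1.0
  ! Without /fpconstant, the literal 0.1 is first rounded to single
  ! precision; with it, the constant keeps double precision on assignment.
  d = 0.1
  print *, 'kind(x) =', kind(x), ' storage bytes =', storage_size(x)/8
  print *, 'd =', d
end program kinds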
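For context on the step that follows: to test a candidate flag such as -MT, configure compiles a throwaway program with the flag added and accepts the flag only if the compile exits cleanly. A minimal sketch of reproducing that check by hand, assuming a scratch file mytest.F (the name is illustrative; the command mirrors the one recorded in the log below):

! Hand-run version of the conftest probe shown below (mytest.F is a
! hypothetical scratch file):
!   win32fe ifort -c -MT -o mytest.o mytest.F
! A clean exit means the flag is usable, and configure then reports
! 'Added FC compiler flag -MT' and appends -MT to the Fortran flags.
      program main
      end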
Trying FC compiler flag -MT
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
sh:
Successful compile:
Source:
      program main
      end
Added FC compiler flag -MT
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --help
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --help
sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011 Usage: win32fe -- - must be the first argument to win32fe : {cl,icl,df,f90,ifl,bcc32,lib,tlib} cl: Microsoft 32-bit C/C++ Optimizing Compiler icl: Intel C/C++ Optimizing Compiler df: Compaq Visual Fortran Optimizing Compiler f90: Compaq Visual Fortran90 Optimizing Compiler ifl: Intel Fortran Optimizing Compiler ifort: Intel Fortran Optimizing Compiler nvcc: NVIDIA CUDA Compiler Driver bcc32: Borland C++ for Win32 lib: Microsoft Library Manager tlib: Borland Library Manager : --help: Output this help message and help for --autodetect: Attempt automatic detection of installation --path : specifies an addition to the PATH that is required (ex. the location of a required .dll) --use : specifies the variant of to use --verbose: Echo to stdout the translated commandline and other diagnostic information --version: Output version info for win32fe and --wait_for_debugger: Inserts an infinite wait after creation of and outputs PID so one can manually attach a debugger to the current process. In the debugger, one must set: tool::waitfordebugger = 0 to continue the execution normally. --win-l: For compilers, define -lfoo to link foo.lib instead of libfoo.lib --woff: Suppress win32fe specific warning messages ================================================================================= For compilers: win32fe will map the following to their native options: -c: Compile Only, generates an object file with .o extension This will invoke the compiler once for each file listed. -l: Link the file lib.lib or if using --win-l also, .lib -o : Output= context dependent -D: Define -I: Add to the include path -L: Add to the link path -g: Generate debug symbols in objects when specified for compilation, and in executables when specified for linking (some compilers specification at both times for full debugging support). -O: Enable compiletime and/or linktime optimizations. Ex: win32fe cl -g -c foo.c --verbose -Iinclude Note: win32fe will automatically find the system library paths and system include paths, relieving the user of the need to invoke a particular shell. ========================================================================= icl specific help: win32fe uses -nologo by default for nonverbose output. Use the flag: -logo to disable this feature. -g is identical to -Z7. -O is identical to -O2. ========================================================================= Intel(R) Visual Fortran Intel(R) 64 Compiler XE for applications running on Intel(R) 64, Version 13.1.3.198 Build 20130607 Copyright (C) 1985-2013 Intel Corporation. All rights reserved.
Intel(R) Fortran Compiler Help ============================== Intel(R) Compiler includes compiler options that optimize for instruction sets that are available in both Intel(R) and non-Intel microprocessors, but may perform additional optimizations for Intel microprocessors than for non-Intel microprocessors. In addition, certain compiler options for Intel(R) Compiler are reserved for Intel microprocessors. For a detailed description of these compiler options, including the instructions they implicate, please refer to "Intel(R) Compiler User and Reference Guides > Compiler Options." usage: ifort [options] file1 [file2 ...] [/link linker_options] where options represents zero or more compiler options fileN is a Fortran source (.f .for .ftn .f90 .fpp .i .i90), assembly (.asm), object (.obj), static library (.lib), or other linkable file linker_options represents zero or more linker options Notes ----- 1. Many FL32 options are supported; a warning is printed for unsupported options. 2. Intel Fortran compiler options may be placed in your ifort.cfg file. Some options listed are only available on a specific system i32 indicates the feature is available on systems based on IA-32 architecture i64em indicates the feature is available on systems using Intel(R) 64 architecture Compiler Option List -------------------- Optimization ------------ /O1 optimize for maximum speed, but disable some optimizations which increase code size for a small speed benefit /O2 optimize for maximum speed (DEFAULT) /O3 optimize for maximum speed and enable more aggressive optimizations that may not improve performance on some programs /Ox enable maximum optimizations (same as /O2) /Os enable speed optimizations, but disable some optimizations which increase code size for small speed benefit (overrides /Ot) /Ot enable speed optimizations (overrides /Os) /Od disable optimizations /Oy[-] enable/disable using EBP as a general purpose register (no frame pointer) (i32 only) /fast enable /QxHOST /O3 /Qipo /Qprec-div- options set by /fast cannot be overridden with the exception of /QxHOST, list options separately to change behavior /Oa[-] assume no aliasing in program /Ow[-] assume no aliasing within functions, but assume aliasing across calls Code Generation --------------- /Qx generate specialized code to run exclusively on processors indicated by as described below SSE2 May generate Intel(R) SSE2 and SSE instructions for Intel processors. Optimizes for the Intel NetBurst(R) microarchitecture. SSE3 May generate Intel(R) SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the enhanced Pentium(R) M processor microarchitecture and Intel NetBurst(R) microarchitecture. SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the Intel(R) Core(TM) microarchitecture. SSE4.1 May generate Intel(R) SSE4 Vectorizing Compiler and Media Accelerator instructions for Intel processors. May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions and it may optimize for Intel(R) 45nm Hi-k next generation Intel Core(TM) microarchitecture. SSE4.2 May generate Intel(R) SSE4 Efficient Accelerated String and Text Processing instructions supported by Intel(R) Core(TM) i7 processors. May generate Intel(R) SSE4 Vectorizing Compiler and Media Accelerator, Intel(R) SSSE3, SSE3, SSE2, and SSE instructions and it may optimize for the Intel(R) Core(TM) processor family. 
AVX May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX2 May generate Intel(R) Advanced Vector Extensions 2 (Intel(R) AVX2), Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. Optimizes for a future Intel processor. CORE-AVX-I May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), including instructions in Intel(R) Core 2(TM) processors in process technology smaller than 32nm, Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. Optimizes for a future Intel processor. SSSE3_ATOM May generate MOVBE instructions for Intel processors, depending on the setting of option /Qinstruction. May also generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the Intel(R) Atom(TM) processor and Intel(R) Centrino(R) Atom(TM) Processor Technology. /QxHost generate instructions for the highest instruction set and processor available on the compilation host machine /Qax[,,...] generate code specialized for processors specified by while also generating generic IA-32 instructions. includes one or more of the following: SSE2 May generate Intel(R) SSE2 and SSE instructions for Intel processors. SSE3 May generate Intel(R) SSE3, SSE2, and SSE instructions for Intel processors. SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. SSE4.1 May generate Intel(R) SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. SSE4.2 May generate Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. AVX May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX2 May generate Intel(R) Advanced Vector Extensions 2 (Intel(R) AVX2), Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX-I May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), including instructions in Intel(R) Core 2(TM) processors in process technology smaller than 32nm, Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. /arch: generate specialized code to optimize for processors indicated by as described below SSE2 May generate Intel(R) SSE2 and SSE instructions SSE3 May generate Intel(R) SSE3, SSE2 and SSE instructions SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2 and SSE instructions SSE4.1 May generate Intel(R) SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions SSE4.2 May generate Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions AVX May generate Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions /Qinstruction: Refine instruction set output for the selected target processor [no]movbe - Do/do not generate MOVBE instructions with SSSE3_ATOM (requires /QxSSSE3_ATOM) /Qextend-arguments:[32|64] By default, unprototyped scalar integer arguments are passed in 32-bits (sign-extended if necessary). On Intel(R) 64, unprototyped scalar integer arguments may be extended to 64-bits. 
Interprocedural Optimization (IPO)
----------------------------------

/Qip[-]    enable(DEFAULT)/disable single-file IP optimization within files
/Qipo[n]   enable multi-file IP optimization between files
/Qipo-c    generate a multi-file object file (ipo_out.obj)
/Qipo-S    generate a multi-file assembly file (ipo_out.asm)
/Qip-no-inlining   disable full and partial inlining
/Qip-no-pinlining  disable partial inlining
/Qipo-separate     create one object file for every source file (overrides /Qipo[n])
/Qipo-jobs<n>      specify the number of jobs to be executed simultaneously during the IPO link phase

Advanced Optimizations
----------------------

/Qunroll[n]  set maximum number of times to unroll loops. Omit n to use default heuristics. Use n=0 to disable the loop unroller
/Qunroll-aggressive[-]  enables more aggressive unrolling heuristics
/Qscalar-rep[-]  enable(DEFAULT)/disable scalar replacement (requires /O3)
/Qpad[-]  enable/disable(DEFAULT) changing variable and array memory layout
/Qsafe-cray-ptr  Cray pointers do not alias with other variables
/Qansi-alias[-]  enable/disable(DEFAULT) use of ANSI aliasing rules optimizations; user asserts that the program adheres to these rules
/Qcomplex-limited-range[-]  enable/disable(DEFAULT) the use of the basic algebraic expansions of some complex arithmetic operations. This can allow for some performance improvement in programs which use a lot of complex arithmetic at the loss of some exponent range.
/reentrancy:<keyword>  specify whether the threaded, reentrant run-time support should be used
    Keywords: none (same as /noreentrancy), threaded, async
/noreentrancy  do not use threaded, reentrant run-time support
/heap-arrays[:n]  temporary arrays of minimum size n (in kilobytes) are allocated in heap memory rather than on the stack. If n is not specified, all temporary arrays are allocated in heap memory.
/heap-arrays-  temporary arrays are allocated on the stack (DEFAULT)
/Qopt-multi-version-aggressive[-]  enables more aggressive multi-versioning to check for pointer aliasing and scalar replacement
/Qopt-ra-region-strategy[:<keyword>]  select the method that the register allocator uses to partition each routine into regions
    routine - one region per routine
    block   - one region per block
    trace   - one region per trace
    loop    - one region per loop
    default - compiler selects best option
/Qvec[-]  enables(DEFAULT)/disables vectorization
/Qvec-guard-write[-]  enables cache/bandwidth optimization for stores under conditionals within vector loops
/Qvec-threshold[n]  sets a threshold for the vectorization of loops based on the probability of profitable execution of the vectorized loop in parallel
/Qopt-malloc-options:{0|1|2|3|4}  specify malloc configuration parameters.
    Specifying a non-zero value will cause alternate configuration parameters to be set for how malloc allocates and frees memory
/Qopt-jump-tables:<arg>  control the generation of jump tables
    default - let the compiler decide when a jump table, a series of if-then-else constructs or a combination is generated
    large   - generate jump tables up to a certain pre-defined size (64K entries)
    <n>     - generate jump tables up to <n> in size
    use /Qopt-jump-tables- to lower switch statements as chains of if-then-else constructs
/Qopt-block-factor:<n>  specify blocking factor for loop blocking
/Qopt-streaming-stores:<arg>  specifies whether streaming stores are generated
    always - enables generation of streaming stores under the assumption that the application is memory bound
    auto   - compiler decides when streaming stores are used (DEFAULT)
    never  - disables generation of streaming stores
/Qmkl[:<arg>]  link to the Intel(R) Math Kernel Library (Intel(R) MKL) and bring in the associated headers
    parallel   - link using the threaded Intel(R) MKL libraries. This is the default when /Qmkl is specified
    sequential - link using the non-threaded Intel(R) MKL libraries
    cluster    - link using the Intel(R) MKL Cluster libraries plus the sequential Intel(R) MKL libraries
/Qimsl  link to the International Mathematics and Statistics Library* (IMSL* library)
/Qopt-subscript-in-range[-]  assumes no overflows in the intermediate computation of the subscripts
/Qcoarray[:shared|distributed]  enable/disable(DEFAULT) coarray syntax for data parallel programming. The default is shared-memory; distributed memory is only valid with the Intel(R) Cluster Toolkit
/Qcoarray-num-images:n  set default number of coarray images
/Qopt-matmul[-]  replace matrix multiplication with calls to intrinsics and threading libraries for improved performance (DEFAULT at /O3 /Qparallel)
/Qsimd[-]  enables(DEFAULT)/disables vectorization using SIMD directive
/Qguide-opts:<arg>  tells the compiler to analyze certain code and generate recommendations that may improve optimizations
/Qguide-file[:<file>]  causes the results of guide to be output to a file
/Qguide-file-append[:<file>]  causes the results of guide to be appended to a file
/Qguide[:<level>]  lets you set a level (1 - 4) of guidance for auto-vectorization, auto-parallelization, and data transformation (DEFAULT is 4 when the option is specified)
/Qguide-data-trans[:<level>]  lets you set a level (1 - 4) of guidance for data transformation (DEFAULT is 4 when the option is specified)
/Qguide-par[:<level>]  lets you set a level (1 - 4) of guidance for auto-parallelization (DEFAULT is 4 when the option is specified)
/Qguide-vec[:<level>]  lets you set a level (1 - 4) of guidance for auto-vectorization (DEFAULT is 4 when the option is specified)
/Qguide-profile:<[file|dir]>[,[file|dir],...]  specify a loop profiler data file (or set of files in a directory) when using the /Qguide option
/Qopt-mem-layout-trans[:<level>]  controls the level of memory layout transformations performed by the compiler
    0 - disable memory layout transformations (same as /Qopt-mem-layout-trans-)
    1 - enable basic memory layout transformations
    2 - enable more memory layout transformations (DEFAULT when the option is specified)
    3 - enable aggressive memory layout transformations
/Qopt-prefetch[:n]  enable levels of prefetch insertion, where 0 disables. n may be 0 through 4 inclusive. Default is 2.
/Qopt-prefetch-  disable(DEFAULT) prefetch insertion.
    Equivalent to /Qopt-prefetch:0

Profile Guided Optimization (PGO)
---------------------------------

/Qprof-dir <dir>  specify directory for profiling output files (*.dyn and *.dpi)
/Qprof-src-root <dir>  specify project root directory for application source files to enable relative path resolution during profile feedback on sources below that directory
/Qprof-src-root-cwd  specify the current directory as the project root directory for application source files to enable relative path resolution during profile feedback on sources below that directory
/Qprof-src-dir[-]  specify whether directory names of sources should be considered when looking up profile records within the .dpi file
/Qprof-file <file>  specify file name for profiling summary file
/Qprof-data-order[-]  enable/disable(DEFAULT) static data ordering with profiling
/Qprof-func-order[-]  enable/disable(DEFAULT) function ordering with profiling
/Qprof-gen[:keyword]  instrument program for profiling. Optional keyword may be srcpos or globdata
/Qprof-gen-  disable profiling instrumentation
/Qprof-use[:<arg>]  enable use of profiling information during optimization
    weighted  - invokes profmerge with -weighted option to scale data based on run durations
    [no]merge - enable(default)/disable the invocation of the profmerge tool
/Qprof-use-  disable use of profiling information during optimization
/Qcov-gen  instrument program for profiling
/Qcov-dir <dir>  specify directory for profiling output files (*.dyn and *.dpi)
/Qcov-file <file>  specify file name for profiling summary file
/Qinstrument-functions[-]  determine whether function entry and exit points are instrumented
/Qprof-hotness-threshold:<val>  set the hotness threshold for function grouping and function ordering
    val indicates percentage of functions to be placed in hot region. This option requires /Qprof-use and /Qprof-func-order
/Qprof-value-profiling:<arg>[,<arg>,...]
    limit value profiling
    none      - inhibit all types of value profiling
    nodivide  - inhibit value profiling of non-compile time constants used in division or remainder operations
    noindcall - inhibit value profiling of function addresses at indirect call sites
/Qprofile-functions  enable instrumentation in generated code for collecting function execution time profiles
/Qprofile-loops:<arg>  enable instrumentation in generated code for collecting loop execution time profiles
    inner - instrument inner loops
    outer - instrument outer loops
    all   - instrument all loops
/Qprofile-loops-report:<arg>  Control the level of instrumentation inserted for reporting loop execution profiles
    1 - report loop times
    2 - report loop times and iteration counts

Optimization Reports
--------------------

/Qvec-report[n]  control amount of vectorizer diagnostic information
    n=0  no diagnostic information
    n=1  indicate vectorized loops (DEFAULT when enabled)
    n=2  indicate vectorized/non-vectorized loops
    n=3  indicate vectorized/non-vectorized loops and prohibiting data dependence information
    n=4  indicate non-vectorized loops
    n=5  indicate non-vectorized loops and prohibiting data dependence information
    n=6  indicate vectorized/non-vectorized loops with greater details and prohibiting data dependence information
    n=7  indicate vector code quality message ids and data values for vectorized loops
/Qopt-report[:n]  generate an optimization report to stderr
    0  disable optimization report output
    1  minimum report output
    2  medium output (DEFAULT when enabled)
    3  maximum report output
/Qopt-report-file:<file>  specify the filename for the generated report
/Qopt-report-phase:<phase>  specify the phase that reports are generated against
/Qopt-report-routine:<name>  reports on routines containing the given name
/Qopt-report-help  display the optimization phases available for reporting
/Qtcheck[:mode]  enable analysis of threaded applications (requires Intel(R) Thread Checker; cannot be used with compiler alone)
    tci - instruments a program to perform a thread-count-independent analysis
    tcd - instruments a program to perform a thread-count-dependent analysis (DEFAULT when mode is not used)
    api - instruments a program at the api-imports level
/Qtcollect[:<lib>]  inserts instrumentation probes calling the Intel(R) Trace Collector API. The library <lib>.lib is linked in, the default being VT.lib (requires Intel(R) Trace Collector)
/Qtcollect-filter:file  Enable or disable the instrumentation of specified functions. (requires Intel(R) Trace Collector)

OpenMP* and Parallel Processing
-------------------------------

/Qopenmp  enable the compiler to generate multi-threaded code based on the OpenMP* directives (same as /openmp)
/Qopenmp-stubs  enables the user to compile OpenMP programs in sequential mode.
    The OpenMP directives are ignored and a stub OpenMP library is linked (sequential)
/Qopenmp-report{0|1|2}  control the OpenMP parallelizer diagnostic level
/Qopenmp-lib:<ver>  choose which OpenMP library version to link with
    compat - use the Microsoft compatible OpenMP run-time libraries (DEFAULT)
/Qopenmp-threadprivate:<ver>  choose which threadprivate implementation to use
    compat - use the Microsoft compatible thread local storage
    legacy - use the Intel compatible implementation (DEFAULT)
/Qparallel  enable the auto-parallelizer to generate multi-threaded code for loops that can be safely executed in parallel
/Qpar-report{0|1|2|3}  control the auto-parallelizer diagnostic level
/Qpar-threshold[n]  set threshold for the auto-parallelization of loops where n is an integer from 0 to 100
/Qpar-runtime-control[n]  Control parallelizer to generate runtime check code for effective automatic parallelization.
    n=0  no runtime check based auto-parallelization
    n=1  generate runtime check code under conservative mode (DEFAULT when enabled)
    n=2  generate runtime check code under heuristic mode
    n=3  generate runtime check code under aggressive mode
/Qpar-schedule-static[:n]  Specifies a scheduling algorithm for DO loop iteration. Divides iterations into contiguous pieces. Size n if specified, equal sized pieces if not.
/Qpar-schedule-static_balanced[:n]  Divides iterations into even-sized chunks. Size n if specified, equal sized pieces if not.
/Qpar-schedule-static-steal[:n]  Divides iterations into even-sized chunks, but allows threads to steal parts of chunks from neighboring threads
/Qpar-schedule-dynamic[:n]  Specifies a scheduling algorithm for DO loop iteration. Assigns iterations to threads in chunks dynamically. Chunk size is n iterations if specified, otherwise 1.
/Qpar-schedule-guided[:n]  Specifies a scheduling algorithm for DO loop iteration. Indicates a minimum number of iterations. If specified, n is the minimum number, otherwise 1.
/Qpar-schedule-guided-analytical[:n]  Divides iterations by using exponential distribution or dynamic distributions.
/Qpar-schedule-runtime  Specifies a scheduling algorithm for DO loop iteration. Defers the scheduling decision until runtime.
/Qpar-schedule-auto  Lets the compiler or run-time system determine the scheduling algorithm.
/Qpar-adjust-stack  perform fiber-based main thread stack adjustment
/Qpar-affinity=[<modifier>,...]<type>[,<permute>][,<offset>]  tune application performance by setting different thread affinity
/Qpar-num-threads=<n>  tune application performance by setting different number of threads
/Qparallel-source-info[:n]  enable(DEFAULT)/disable the emission of source location information for parallel code generation with OpenMP and auto-parallelization
    0 - disable (same as /Qparallel-source-info-)
    1 - emit routine name and line information (DEFAULT)
    2 - emit path, file, routine name and line information
/Qpar  same as /Qparallel

Floating Point
--------------

/fp:<name>  enable floating point model variation
    except[-]  - enable/disable floating point semantics
    fast[=1|2] - enables more aggressive floating point optimizations
    precise    - allows value-safe optimizations
    source     - enables intermediates in source precision
    strict     - enables /fp:precise /fp:except, disables contractions and enables pragma stdc fenv_access
/Qfp-speculation:<mode>  enable floating point speculations with the following conditions:
    fast   - speculate floating point operations (DEFAULT)
    safe   - speculate only when safe
    strict - same as off
    off    - disables speculation of floating-point operations
/Qpc32  set internal FPU precision to 24 bit significand
/Qprec  improve floating-point precision (speed impact less than /Op)
/Qprec-sqrt[-]  determine if certain square root optimizations are enabled
/Qprec-div[-]  improve precision of FP divides (some speed impact)
/Qfast-transcendentals[-]  generate a faster version of the transcendental functions
/Qfp-port[-]  round fp results at assignments and casts (some speed impact)
/Qfp-stack-check  enable fp stack checking after every function/procedure call
/Qrcd  rounding mode to enable fast float-to-int conversions
/rounding-mode:chopped  set internal FPU rounding control to truncate
/Qftz[-]  enable/disable flush denormal results to zero
/fpe:{0|1|3}  specifies program-wide behavior on floating point exceptions
/fpe-all:{0|1|3}  specifies floating point exception behavior on all functions and subroutines.
    Also sets /assume:ieee_fpe_flags
/[no]fltconsistency  specify that improved floating-point consistency should be used
/Qfma[-]  enable/disable the combining of floating point multiplies and add/subtract operations
/[no]recursive  compile all procedures for possible recursive execution

Inlining
--------

/Ob<n>  control inline expansion:
    n=0  disable inlining (same as /inline:none)
    n=1  inline functions declared with ATTRIBUTES INLINE or FORCEINLINE
    n=2  inline any function, at the compiler's discretion
/Qinline-min-size:<n>  set size limit for inlining small routines
/Qinline-min-size-  no size limit for inlining small routines
/Qinline-max-size:<n>  set size limit for inlining large routines
/Qinline-max-size-  no size limit for inlining large routines
/Qinline-max-total-size:<n>  maximum increase in size for inline function expansion
/Qinline-max-total-size-  no size limit for inline function expansion
/Qinline-max-per-routine:<n>  maximum number of inline instances in any function
/Qinline-max-per-routine-  no maximum number of inline instances in any function
/Qinline-max-per-compile:<n>  maximum number of inline instances in the current compilation
/Qinline-max-per-compile-  no maximum number of inline instances in the current compilation
/Qinline-factor:<n>  set inlining upper limits by n percentage
/Qinline-factor-  do not set inlining upper limits
/Qinline-forceinline  treat inline routines as forceinline
/Qinline-dllimport  allow(DEFAULT)/disallow functions declared DEC$ ATTRIBUTES DLLIMPORT to be inlined
/Qinline-calloc  directs the compiler to inline calloc() calls as malloc()/memset()
/inline[:keyword]  Specifies the level of inline function expansion
    keywords: all (same as /Ob2 /Ot), size (same as /Ob2 /Os), speed (same as /Ob2 /Ot), none or manual (same as /Ob0)

Output, Debug, PCH
------------------

/c  compile to object (.obj) only, do not link
/nolink, /compile-only  same as /c
/S  compile to assembly (.asm) only, do not link
/FAs  produce assembly file with optional source annotations
/FAc  produce assembly file with optional code annotations
/FA  produce assembly file
/Fa[file]  name assembly file (or directory for multiple files; i.e. /FaMYDIR\)
/Fo[file]  name object file (or directory for multiple files; i.e. /FoMYDIR\)
/Fe[file]  name executable file or directory
/object:<file>  specify the name of the object file, or the directory to which object file(s) should be written. (e.g.
    /object:MYOBJ or /object:MYDIR\)
/exe:<file>  specifies the name to be used for the built program (.exe) or dynamic-link (.dll) library
/map:<file>  specify that a link map file should be generated
/list:<file>  specify that a listing file should be generated
/list-line-len:#  overrides the default line length (80) in a listing file
/list-page-len:#  overrides the default page length (66) in a listing file
/show:<keyword>  controls the contents of the listing file
    keywords: all, none, [no]include, [no]map, [no]options
/Zi, /ZI, /Z7  produce symbolic debug information in object file (implies /Od when another optimization option is not explicitly set)
/debug[:keyword]  enable debug information and control output of enhanced debug information
    keywords: all, full, minimal, none, [no]inline-debug-info
/nodebug  do not enable debug information
/debug-parameters[:keyword]  control output of debug information for PARAMETERS
    keywords: all, used, none (same as /nodebug-parameters)
/nodebug-parameters  do not output debug information for PARAMETERS
/Qd-lines, /[no]d-lines  compile debug statements (indicated by D in column 1)
/pdbfile[:filename]  specify that debug related information should be generated to a program database file
/nopdbfile  do not generate debug related information to a program database file
/Qtrapuv  trap uninitialized variables
/RTCu  report use of variable that was not initialized
/Qmap-opts  enable option mapping tool

Preprocessor
------------

/D<name>[{=|#}<text>]  define macro
/define:symbol[=<val>]  same as /D
/nodefines  specifies that any /D macros go to the preprocessor only, and not to the compiler
/U<name>  remove predefined macro
/undefine:<name>  remove predefined macro (same as /U)
/allow:nofpp-comments  If a Fortran end-of-line comment is seen within a #define, treat it as part of the definition. Default is allow:fpp-comments
/E  preprocess to stdout
/EP  preprocess to stdout, omitting #line directives
/EP /P  preprocess to file, omitting #line directives
/P  preprocess to file
/preprocess-only  same as /P
/[no]keep  keep/remove preprocessed file generated by preprocessor as input to compiler stage. Not affected by /Qsave-temps. Default is /nokeep
/fpp[n], /[no]fpp  run Fortran preprocessor on source files prior to compilation
    n=0  disable running the preprocessor, equivalent to nofpp
    n=1,2,3  run preprocessor
/module:path  specify path where mod files should be placed and first location to look for mod files
/u  remove all predefined macros
/I<dir>  add directory to include file search path
/[no]include:<dir>  same as /I
/X  remove standard directories from include file search path
/[no]gen-dep[:filename]  generate dependency information. If no filename is specified, output to stdout
/gen-depformat:keyword  generate dependency information in the specified format. One of: make, nmake

Component Control
-----------------

/Qoption,<tool>,<opts>  pass options <opts> to the tool specified by <tool>
/Qlocation,<tool>,<dir>  set <dir> as the location of the tool specified by <tool>

Language
--------

/[no]altparam  specify if alternate form of parameter constant declarations (without parenthesis) is recognized.
    Default is to recognize
/assume:<keyword>  specify assumptions made by the optimizer and code generator
    keywords: none, [no]byterecl, [no]buffered_io, [no]bscc (nobscc same as /nbs), [no]cc_omp, [no]minus0, [no]dummy_aliases (same as /Qcommon-args), [no]ieee_fpe_flags, [no]fpe_summary, [no]old_boz, [no]old_complex_align, [no]old_logical_ldio, [no]old_ldout_format, [no]old_maxminloc, [no]old_unit_star, [no]old_xor, [no]protect_constants, [no]protect_parens, [no]realloc_lhs, [no]2underscore, [no]underscore (same as /us), [no]std_intent_in, [no]std_mod_proc_name, [no]source_include, [no]split_common, [no]writeable_strings
/ccdefault:<keyword>  specify default carriage control for units 6 and *
    keywords: default, fortran, list or none
/[no]check:<keyword>  check run-time conditions. Default is /nocheck
    keywords: all (same as /4Yb, /C), none (same as /nocheck, /4Nb), [no]arg_temp_created, [no]bounds (same as /CB), [no]format, [no]output_conversion, [no]pointer (same as /CA), [no]uninit (same as /CU), [no]stack
/Qcommon-args  assume "by reference" subprogram arguments may alias one another. Same as /assume:dummy_aliases
/[no]extend-source[:<keyword>]  specify rightmost column for fixed form sources
    keywords: 72 (same as /noextend-source and /4L72), 80 (same as /4L80), 132 (same as /4L132. Default if you specify /extend-source without a keyword.)
/fixed  specify source files are in fixed format. Same as /FI and /4Nf
/nofixed  indicates free format
/free  specify source files are in free format. Same as /FR and /4Yf
/nofree  indicates fixed format
/names:<keyword>  specify how source code identifiers and external names are interpreted.
    keywords: as_is, lowercase, uppercase
/[no]pad-source, /Qpad-source[-]  make compiler acknowledge blanks at the end of a line
/stand[:<keyword>]  specifies level of conformance with ANSI standard to check for. If keyword is not specified, level of conformance is f03
    keywords: f90 (same as /4Ys), f95, f03, none (same as /nostand)
/standard-semantics  sets assume keywords to conform to the semantics of the f03 standard. May result in performance loss.
    assume keywords set by /standard-semantics: byterecl, fpe_summary, minus0, noold_maxminloc, noold_unit_star, noold_xor, protect_parens, realloc_lhs, std_intent_in, std_mod_proc_name, noold_ldout_format
/syntax-only, /Zs  perform syntax and semantic checking only (no object file produced)

Compiler Diagnostics
--------------------

/w  disable all warnings
/W<n>  disable warnings (n = 0) or show warnings (n = 1 DEFAULT, same as /warn:general)
/warn:<keyword>  specifies the level of warning messages issued
    keywords: all, none (same as /nowarn), [no]alignments, [no]declarations, [no]errors, [no]general, [no]ignore_loc, [no]interfaces, [no]stderrors, [no]truncated_source, [no]uncalled, [no]unused, [no]usage
/nowarn  suppress all warning messages
/WB  turn a compile-time bounds check into a warning
/[no]traceback  specify whether the compiler generates PC correlation data used to display a symbolic traceback rather than a hexadecimal traceback at runtime failure
/[no]gen-interfaces [[no]source]  generate interface blocks for all routines in the file. Can be checked using -warn interfaces
    nosource indicates temporary source files should not be saved
/error-limit:<n>  specify the maximum number of error-level or fatal-level compiler errors allowed
/noerror-limit  set no maximum number on error-level or fatal-level error messages
/Qdiag-enable:<v1>[,<v2>,...]  enable the specified diagnostics or diagnostic groups
/Qdiag-disable:<v1>[,<v2>,...]
    disable the specified diagnostics or diagnostic groups
    where <vN> may be individual diagnostic numbers or group names. where group names include:
    sc[n] - perform source code analysis: n=1 for critical errors, n=2 for all errors and n=3 for all errors and warnings
    sc- {full|concise|precise} - perform static analysis and determine the analysis mode.
        Full mode - attempts to find all program weaknesses, even at the expense of more false positives.
        Concise mode - attempts to reduce false positives somewhat more than reducing false negatives.
        Precise mode - attempts to avoid all false positives
        Default: full if /Qdiag-enable:sc{[1|2|3]} is present; otherwise None (static analysis diagnostics are disabled).
    sc-include - perform source code analysis on include files
    sc-single-file - This option tells static analysis to process each file individually. Default: OFF
    sc-enums - This option tells static analysis to treat enumeration variables as known values equal to any one of the associated enumeration literals. Default: OFF
    sc-parallel[n] - perform analysis of parallelization in source code: n=1 for critical errors, n=2 for errors, n=3 for all errors and warnings
    warn   - diagnostic messages that have "warning" severity level.
    error  - diagnostic messages that have "error" severity level.
    remark - diagnostic messages that are remarks or comments.
    vec    - diagnostic messages issued by the vectorizer.
    par    - diagnostic messages issued by the auto-parallelizer
    openmp - diagnostic messages issued by the OpenMP* parallelizer.
    cpu-dispatch - Specifies the CPU dispatch remarks.
/Qdiag-error:<v1>[,<v2>,...]  output the specified diagnostics or diagnostic groups as errors
/Qdiag-warning:<v1>[,<v2>,...]  output the specified diagnostics or diagnostic groups as warnings
/Qdiag-remark:<v1>[,<v2>,...]  output the specified diagnostics or diagnostic groups as remarks
/Qdiag-dump  display the currently enabled diagnostic messages to stdout or to a specified diagnostic output file.
/Qdiag-sc-dir:<dir>  directory where diagnostics from static analysis are created, rather than current working directory.
/Qdiag-file[:<file>]  where diagnostics are emitted to. Not specifying this causes messages to be output to stderr
/Qdiag-file-append[:<file>]  where diagnostics are emitted to. When <file> already exists, output is appended to the file
/Qdiag-id-numbers[-]  enable(DEFAULT)/disable the diagnostic specifiers to be output in numeric form
/Qdiag-error-limit:<n>  specify the maximum number of errors emitted

Miscellaneous
-------------

/[no]logo  display compiler version information. /nologo disables the output
/Qsox[:<keyword>[,keyword]]  enable saving of compiler options, version and additional information in the executable. Use /Qsox- to disable (DEFAULT)
    profile - include profiling data
    inline  - include inlining information
/bintext:<string>  place the string specified into the object file and executable
/Qsave-temps  store the intermediate files in current directory and name them based on the source file.
    Only saves files that are generated by default
/what  display detailed compiler version information
/watch:<keyword>  tells the driver to output processing information
    keywords: all, none (same as /nowatch), [no]source, [no]cmd, [no]mic-cmd
/nowatch  suppress processing information output (DEFAULT)
/Tf<file>  compile file as Fortran source
/extfor:<ext>  specify extension of file to be recognized as a Fortran file
/extfpp:<ext>  specify extension of file to be recognized as a preprocessor file
/libdir[:keyword]  control the library names that should be emitted into the object file
    keywords: all, none (same as /nolibdir), [no]automatic, [no]user
/nolibdir  no library names should be emitted into the object file
/MP[<n>]  create multiple processes that can be used to compile large numbers of source files at the same time
/bigobj  generate objects with increased address capacity

Data
----

/4I{2|4|8}  set default KIND of integer and logical variables to 2, 4, or 8
/integer-size:<size>  specifies the default size of integer and logical variables
    size: 16, 32, 64
/4R{8|16}  set default size of real to 8 or 16 bytes
/real-size:<size>  specify the size of REAL and COMPLEX declarations, constants, functions, and intrinsics
    size: 32, 64, 128
/Qautodouble  same as /real-size:64 or /4R8
/double-size:<size>  defines the size of DOUBLE PRECISION and DOUBLE COMPLEX declarations, constants, functions, and intrinsics
    size: 64, 128
/[no]fpconstant  extends the precision of single precision constants assigned to double precision variables to double precision
/[no]intconstant  use Fortran 77 semantics, rather than Fortran 90/95, to determine kind of integer constants
/auto  make all local variables AUTOMATIC
/Qauto-scalar  make scalar local variables AUTOMATIC (DEFAULT)
/Qsave  save all variables (static allocation) (same as /noauto, opposite of /auto)
/Qzero[-]  enable/disable(DEFAULT) implicit initialization to zero of local scalar variables of intrinsic type INTEGER, REAL, COMPLEX, or LOGICAL that are saved and not initialized
/Qdyncom  make given common blocks dynamically-allocated
/Zp[n]  specify alignment constraint for structures (n=1,2,4,8,16; /Zp16 DEFAULT)
/[no]align  analyze and reorder memory layout for variables and arrays
/align:<keyword>  specify how data items are aligned
    keywords: all (same as /align), none (same as /noalign), [no]commons, [no]dcommons, [no]qcommons, [no]zcommons, rec1byte, rec2byte, rec4byte, rec8byte, rec16byte, rec32byte, array8byte, array16byte, array32byte, array64byte, array128byte, array256byte, [no]records, [no]sequence
/GS  enable overflow security checks. /GS- disables (DEFAULT)
/Qpatchable-addresses  generate code such that references to statically assigned addresses can be patched with arbitrary 64-bit addresses.
/Qfnalign[-]  align the start of functions to an optimal machine-dependent value.
    When disabled (DEFAULT) align on a 2-byte boundary
/Qfnalign:[2|16]  align the start of functions on a 2 (DEFAULT) or 16 byte boundary
/Qglobal-hoist[-]  enable(DEFAULT)/disable external globals are load safe
/Qkeep-static-consts[-]  enable/disable(DEFAULT) emission of static const variables even when not referenced
/Qnobss-init  disable placement of zero-initialized variables in BSS (use DATA)
/Qzero-initialized-in-bss[-]  put explicitly zero initialized variables into the DATA section instead of the BSS section
/convert:<keyword>  specify the format of unformatted files containing numeric data
    keywords: big_endian, cray, ibm, little_endian, native, vaxd, vaxg
/Qimf-absolute-error:value[:funclist]  define the maximum allowable absolute error for math library function results
    value    - a positive, floating-point number conforming to the format [digits][.digits][{e|E}[sign]digits]
    funclist - optional comma separated list of one or more math library functions to which the attribute should be applied
/Qimf-accuracy-bits:bits[:funclist]  define the relative error, measured by the number of correct bits, for math library function results
    bits     - a positive, floating-point number
    funclist - optional comma separated list of one or more math library functions to which the attribute should be applied
/Qimf-arch-consistency:value[:funclist]  ensures that the math library functions produce consistent results across different implementations of the same architecture
    value    - true or false
    funclist - optional comma separated list of one or more math library functions to which the attribute should be applied
/Qimf-max-error:ulps[:funclist]  defines the maximum allowable relative error, measured in ulps, for math library function results
    ulps     - a positive, floating-point number conforming to the format [digits][.digits][{e|E}[sign]digits]
    funclist - optional comma separated list of one or more math library functions to which the attribute should be applied
/Qimf-precision:value[:funclist]  defines the accuracy (precision) for math library functions
    value - defined as one of the following values
        high   - equivalent to max-error = 0.6
        medium - equivalent to max-error = 4 (DEFAULT)
        low    - equivalent to accuracy-bits = 11 (single precision); accuracy-bits = 26 (double precision)
    funclist - optional comma separated list of one or more math library functions to which the attribute should be applied

Compatibility
-------------

/fpscomp[:<keyword>]  specify the level of compatibility to adhere to with Fortran PowerStation
    keywords: all, none (same as /nofpscomp), [no]filesfromcmd, [no]general, [no]ioformat, [no]ldio_spacing, [no]libs, [no]logicals
/nofpscomp  no specific level of compatibility with Fortran PowerStation
/f66  allow extensions that enhance FORTRAN-66 compatibility
/f77rtl  specify that the Fortran 77 specific run-time support should be used
/nof77rtl  disables
/vms  enable VMS I/O statement extensions
/Qvc<n>  enable compatibility with a specific Microsoft* Visual Studio version
    9  - Microsoft* Visual Studio 2008 compatibility
    10 - Microsoft* Visual Studio 2010 compatibility
    11 - Microsoft* Visual Studio 2012 compatibility

Linking/Linker
--------------

/link  specify that all options following '/link' are for the linker
/extlnk:<ext>  specify extension of file to be passed directly to linker
/F<n>  set the stack reserve amount specified to the linker
/dbglibs  use the debug version of runtime libraries, when appropriate
/libs:<keyword>  specifies which type of run-time library to link to.
    keywords: static, dll, qwin, qwins
/LD[d]  produce a DLL instead of an EXE ('d' = debug version)
/dll  same as /LD
/MD[d]  use dynamically-loaded, multithread C runtime
/MDs[d]  use dynamically-loaded, singlethread Fortran runtime, and multithread C runtime
/MT[d]  use statically-linked, multithread C runtime (DEFAULT with Microsoft Visual Studio 2005 and later)
/ML[d]  use statically-linked, single thread C runtime (only valid in Microsoft Visual Studio 2003 environment)
/MG, /winapp  use Windows API runtime libraries
/Zl  omit library names from object file
/threads  specify that multi-threaded libraries should be linked against
/nothreads  disables multi-threaded libraries

Deprecated Options
------------------

/Qinline-debug-info       use /debug:inline-debug-info
/Gf                       use /GF
/ML[d]                    upgrade to /MT[d]
/Quse-asm                 No replacement
/Qprof-genx               use /Qprof-gen:srcpos
/Qdiag-enable:sv[<n>]     use /Qdiag-enable:sc[<n>]
/Qdiag-enable:sv-include  use /Qdiag-enable:sc-include
/Qdiag-sv                 use /Qdiag-enable:sc[<n>]
/Qdiag-sv-error           use /Qdiag-disable:warning
/Qdiag-sv-include         use /Qdiag-enable:sc-include
/Qdiag-sv-level           No replacement
/Qdiag-sv-sup             use /Qdiag-disable:<v1>[,<v2>,...]
/Qtprofile                No replacement
/arch:SSE                 use /arch:IA32
/QxK                      upgrade to /arch:SSE2
/QaxK                     upgrade to /arch:SSE2
/QxW                      use /arch:SSE2
/QaxW                     use /arch:SSE2
/QxN                      use /QxSSE2
/QaxN                     use /QaxSSE2
/QxP                      use /QxSSE3
/QaxP                     use /QaxSSE3
/QxT                      use /QxSSSE3
/QaxT                     use /QaxSSSE3
/QxS                      use /QxSSE4.1
/QaxS                     use /QaxSSE4.1
/QxH                      use /QxSSE4.2
/QaxH                     use /QaxSSE4.2
/QxO                      use /arch:SSE3
/Qvc7.1                   No replacement
/QIfist                   use /Qrcd
/QxSSE3_ATOM              use /QxSSSE3_ATOM
/Qrct                     No replacement
/Op                       use /fltconsistency
/debug:partial            No replacement
/tune:<keyword>           use /Qx<keyword>
/architecture:<keyword>   use /arch:<keyword>
/1, /Qonetrip             use /f66
/Fm                       use /map
/Qcpp, /Qfpp              use /fpp
/Qdps                     use /altparam
/Qextend-source           use /extend-source
/Qlowercase               use /names:lowercase
/Quppercase               use /names:uppercase
/Qvms                     use /vms
/asmattr:keyword          use /FA[c|s|cs]
/noasmattr, /asmattr:none use /FA
/asmfile                  use /Fa
/automatic                use /auto
/cm                       use /warn:nousage
/optimize:0               use /Od
/optimize:1,2             use /O1
/optimize:3,4             use /O2
/optimize:5               use /O3
/source                   use /Tf
/unix                     No replacement
/us                       use /assume:underscore
/unroll                   use /Qunroll
/w90, /w95                No replacement
/Zd                       use /debug:minimal

/help, /? [category]  print full or category help message
    Valid categories include
        advanced      - Advanced Optimizations
        codegen       - Code Generation
        compatibility - Compatibility
        component     - Component Control
        data          - Data
        deprecated    - Deprecated Options
        diagnostics   - Compiler Diagnostics
        float         - Floating Point
        help          - Help
        inline        - Inlining
        ipo           - Interprocedural Optimization (IPO)
        language      - Language
        link          - Linking/Linker
        misc          - Miscellaneous
        opt           - Optimization
        output        - Output
        pgo           - Profile Guided Optimization (PGO)
        preproc       - Preprocessor
        reports       - Optimization Reports
        openmp        - OpenMP and Parallel Processing

Copyright (C) 1985-2013, Intel Corporation. All rights reserved.
* Other names and brands may be claimed as the property of others.
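(As a sketch of the instrument-run-recompile cycle the PGO options above describe; prog.f90 and prog.exe are hypothetical placeholders, not files from this log:

    ifort /Qprof-gen prog.f90        (build an instrumented executable)
    prog.exe                         (run representative inputs to produce *.dyn profile files)
    ifort /Qprof-use /O2 prog.f90    (recompile using the collected profile)

The same pattern applies whichever optional /Qprof-gen keyword is used.)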
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --help
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort --help
sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011
Usage: win32fe <tool> --<win32fe options> -<tool options> <files>
  <tool> must be the first argument to win32fe
  <tool>: {cl,icl,df,f90,ifl,bcc32,lib,tlib}
    cl:    Microsoft 32-bit C/C++ Optimizing Compiler
    icl:   Intel C/C++ Optimizing Compiler
    df:    Compaq Visual Fortran Optimizing Compiler
    f90:   Compaq Visual Fortran90 Optimizing Compiler
    ifl:   Intel Fortran Optimizing Compiler
    ifort: Intel Fortran Optimizing Compiler
    nvcc:  NVIDIA CUDA Compiler Driver
    bcc32: Borland C++ for Win32
    lib:   Microsoft Library Manager
    tlib:  Borland Library Manager
  <win32fe options>:
    --help: Output this help message and help for <tool>
    --autodetect: Attempt automatic detection of <tool> installation
    --path <dir>: specifies an addition to the PATH that is required (ex. the location of a required .dll)
    --use <tool>: specifies the variant of <tool> to use
    --verbose: Echo to stdout the translated commandline and other diagnostic information
    --version: Output version info for win32fe and <tool>
    --wait_for_debugger: Inserts an infinite wait after creation of the <tool> process and outputs the PID so one can manually attach a debugger to the current process. In the debugger, one must set: tool::waitfordebugger = 0 to continue the execution normally.
    --win-l: For compilers, define -lfoo to link foo.lib instead of libfoo.lib
    --woff: Suppress win32fe specific warning messages
=================================================================================
For compilers: win32fe will map the following to their native options:
    -c: Compile Only, generates an object file with .o extension. This will invoke the compiler once for each file listed.
    -l<lib>: Link the file lib<lib>.lib, or if using --win-l also, <lib>.lib
    -o <file>: Output=<file>, context dependent
    -D<macro>: Define <macro>
    -I<dir>: Add <dir> to the include path
    -L<dir>: Add <dir> to the link path
    -g: Generate debug symbols in objects when specified for compilation, and in executables when specified for linking (some compilers require specification at both times for full debugging support).
    -O: Enable compiletime and/or linktime optimizations.
    Ex: win32fe cl -g -c foo.c --verbose -Iinclude
Note: win32fe will automatically find the system library paths and system include paths, relieving the user of the need to invoke a particular shell.
=========================================================================
icl specific help:
    win32fe uses -nologo by default for nonverbose output. Use the flag: -logo to disable this feature.
    -g is identical to -Z7.
    -O is identical to -O2.
=========================================================================
Intel(R) Visual Fortran Intel(R) 64 Compiler XE for applications running on Intel(R) 64, Version 13.1.3.198 Build 20130607
Copyright (C) 1985-2013 Intel Corporation. All rights reserved.
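(By analogy with the 'Ex:' line in the win32fe help above, the same option mappings apply when the wrapped tool is ifort; foo.f is a hypothetical placeholder:

    win32fe ifort -g -c foo.f --verbose -Iinclude

Here win32fe translates -g, -c, and -Iinclude into the corresponding native ifort options before invoking the compiler.)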
[/link linker_options] where options represents zero or more compiler options fileN is a Fortran source (.f .for .ftn .f90 .fpp .i .i90), assembly (.asm), object (.obj), static library (.lib), or other linkable file linker_options represents zero or more linker options Notes ----- 1. Many FL32 options are supported; a warning is printed for unsupported options. 2. Intel Fortran compiler options may be placed in your ifort.cfg file. Some options listed are only available on a specific system i32 indicates the feature is available on systems based on IA-32 architecture i64em indicates the feature is available on systems using Intel(R) 64 architecture Compiler Option List -------------------- Optimization ------------ /O1 optimize for maximum speed, but disable some optimizations which increase code size for a small speed benefit /O2 optimize for maximum speed (DEFAULT) /O3 optimize for maximum speed and enable more aggressive optimizations that may not improve performance on some programs /Ox enable maximum optimizations (same as /O2) /Os enable speed optimizations, but disable some optimizations which increase code size for small speed benefit (overrides /Ot) /Ot enable speed optimizations (overrides /Os) /Od disable optimizations /Oy[-] enable/disable using EBP as a general purpose register (no frame pointer) (i32 only) /fast enable /QxHOST /O3 /Qipo /Qprec-div- options set by /fast cannot be overridden with the exception of /QxHOST, list options separately to change behavior /Oa[-] assume no aliasing in program /Ow[-] assume no aliasing within functions, but assume aliasing across calls Code Generation --------------- /Qx generate specialized code to run exclusively on processors indicated by as described below SSE2 May generate Intel(R) SSE2 and SSE instructions for Intel processors. Optimizes for the Intel NetBurst(R) microarchitecture. SSE3 May generate Intel(R) SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the enhanced Pentium(R) M processor microarchitecture and Intel NetBurst(R) microarchitecture. SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the Intel(R) Core(TM) microarchitecture. SSE4.1 May generate Intel(R) SSE4 Vectorizing Compiler and Media Accelerator instructions for Intel processors. May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions and it may optimize for Intel(R) 45nm Hi-k next generation Intel Core(TM) microarchitecture. SSE4.2 May generate Intel(R) SSE4 Efficient Accelerated String and Text Processing instructions supported by Intel(R) Core(TM) i7 processors. May generate Intel(R) SSE4 Vectorizing Compiler and Media Accelerator, Intel(R) SSSE3, SSE3, SSE2, and SSE instructions and it may optimize for the Intel(R) Core(TM) processor family. AVX May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX2 May generate Intel(R) Advanced Vector Extensions 2 (Intel(R) AVX2), Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. Optimizes for a future Intel processor. CORE-AVX-I May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), including instructions in Intel(R) Core 2(TM) processors in process technology smaller than 32nm, Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. Optimizes for a future Intel processor. 
SSSE3_ATOM May generate MOVBE instructions for Intel processors, depending on the setting of option /Qinstruction. May also generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. Optimizes for the Intel(R) Atom(TM) processor and Intel(R) Centrino(R) Atom(TM) Processor Technology. /QxHost generate instructions for the highest instruction set and processor available on the compilation host machine /Qax[,,...] generate code specialized for processors specified by while also generating generic IA-32 instructions. includes one or more of the following: SSE2 May generate Intel(R) SSE2 and SSE instructions for Intel processors. SSE3 May generate Intel(R) SSE3, SSE2, and SSE instructions for Intel processors. SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. SSE4.1 May generate Intel(R) SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. SSE4.2 May generate Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel processors. AVX May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX2 May generate Intel(R) Advanced Vector Extensions 2 (Intel(R) AVX2), Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. CORE-AVX-I May generate Intel(R) Advanced Vector Extensions (Intel(R) AVX), including instructions in Intel(R) Core 2(TM) processors in process technology smaller than 32nm, Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2, and SSE instructions for Intel(R) processors. /arch: generate specialized code to optimize for processors indicated by as described below SSE2 May generate Intel(R) SSE2 and SSE instructions SSE3 May generate Intel(R) SSE3, SSE2 and SSE instructions SSSE3 May generate Intel(R) SSSE3, SSE3, SSE2 and SSE instructions SSE4.1 May generate Intel(R) SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions SSE4.2 May generate Intel(R) SSE4.2, SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions AVX May generate Intel(R) AVX, SSE4.2, SSE4.1, SSSE3, SSE3, SSE2 and SSE instructions /Qinstruction: Refine instruction set output for the selected target processor [no]movbe - Do/do not generate MOVBE instructions with SSSE3_ATOM (requires /QxSSSE3_ATOM) /Qextend-arguments:[32|64] By default, unprototyped scalar integer arguments are passed in 32-bits (sign-extended if necessary). On Intel(R) 64, unprototyped scalar integer arguments may be extended to 64-bits. Interprocedural Optimization (IPO) ---------------------------------- /Qip[-] enable(DEFAULT)/disable single-file IP optimization within files /Qipo[n] enable multi-file IP optimization between files /Qipo-c generate a multi-file object file (ipo_out.obj) /Qipo-S generate a multi-file assembly file (ipo_out.asm) /Qip-no-inlining disable full and partial inlining /Qip-no-pinlining disable partial inlining /Qipo-separate create one object file for every source file (overrides /Qipo[n]) /Qipo-jobs specify the number of jobs to be executed simultaneously during the IPO link phase Advanced Optimizations ---------------------- /Qunroll[n] set maximum number of times to unroll loops. Omit n to use default heuristics. 
Use n=0 to disable the loop unroller /Qunroll-aggressive[-] enables more aggressive unrolling heuristics /Qscalar-rep[-] enable(DEFAULT)/disable scalar replacement (requires /O3) /Qpad[-] enable/disable(DEFAULT) changing variable and array memory layout /Qsafe-cray-ptr Cray pointers do not alias with other variables /Qansi-alias[-] enable/disable(DEFAULT) use of ANSI aliasing rules optimizations; user asserts that the program adheres to these rules /Qcomplex-limited-range[-] enable/disable(DEFAULT) the use of the basic algebraic expansions of some complex arithmetic operations. This can allow for some performance improvement in programs which use a lot of complex arithmetic at the loss of some exponent range. /reentrancy: specify whether the threaded, reentrant run-time support should be used Keywords: none (same as /noreentrancy), threaded, async /noreentrancy do not use threaded, reentrant run-time support /heap-arrays[:n] temporary arrays of minimum size n (in kilobytes) are allocated in heap memory rather than on the stack. If n is not specified, all temporary arrays are allocated in heap memory. /heap-arrays- temporary arrays are allocated on the stack (DEFAULT) /Qopt-multi-version-aggressive[-] enables more aggressive multi-versioning to check for pointer aliasing and scalar replacement /Qopt-ra-region-strategy[:] select the method that the register allocator uses to partition each routine into regions routine - one region per routine block - one region per block trace - one region per trace loop - one region per loop default - compiler selects best option /Qvec[-] enables(DEFAULT)/disables vectorization /Qvec-guard-write[-] enables cache/bandwidth optimization for stores under conditionals within vector loops /Qvec-threshold[n] sets a threshold for the vectorization of loops based on the probability of profitable execution of the vectorized loop in parallel /Qopt-malloc-options:{0|1|2|3|4} specify malloc configuration parameters. Specifying a non-zero value will cause alternate configuration parameters to be set for how malloc allocates and frees memory /Qopt-jump-tables: control the generation of jump tables default - let the compiler decide when a jump table, a series of if-then-else constructs or a combination is generated large - generate jump tables up to a certain pre-defined size (64K entries) - generate jump tables up to in size use /Qopt-jump-tables- to lower switch statements as chains of if-then-else constructs /Qopt-block-factor: specify blocking factor for loop blocking /Qopt-streaming-stores: specifies whether streaming stores are generated always - enables generation of streaming stores under the assumption that the application is memory bound auto - compiler decides when streaming stores are used (DEFAULT) never - disables generation of streaming stores /Qmkl[:] link to the Intel(R) Math Kernel Library (Intel(R) MKL) and bring in the associated headers parallel - link using the threaded Intel(R) MKL libraries. This is the default when /Qmkl is specified sequential - link using the non-threaded Intel(R) MKL libraries cluster - link using the Intel(R) MKL Cluster libraries plus the sequential Intel(R) MKL libraries /Qimsl link to the International Mathematics and Statistics Library* (IMSL* library) /Qopt-subscript-in-range[-] assumes no overflows in the intermediate computation of the subscripts /Qcoarray[:shared|distributed] enable/disable(DEFAULT) coarray syntax for data parallel programming. 
The default is shared-memory; distributed memory is only valid with the Intel(R) Cluster Toolkit /Qcoarray-num-images:n set default number of coarray images /Qopt-matmul[-] replace matrix multiplication with calls to intrinsics and threading libraries for improved performance (DEFAULT at /O3 /Qparallel) /Qsimd[-] enables(DEFAULT)/disables vectorization using SIMD directive /Qguide-opts: tells the compiler to analyze certain code and generate recommendations that may improve optimizations /Qguide-file[:] causes the results of guide to be output to a file /Qguide-file-append[:] causes the results of guide to be appended to a file /Qguide[:] lets you set a level (1 - 4) of guidance for auto-vectorization, auto-parallelization, and data transformation (DEFAULT is 4 when the option is specified) /Qguide-data-trans[:] lets you set a level (1 - 4) of guidance for data transformation (DEFAULT is 4 when the option is specified) /Qguide-par[:] lets you set a level (1 - 4) of guidance for auto-parallelization (DEFAULT is 4 when the option is specified) /Qguide-vec[:] lets you set a level (1 - 4) of guidance for auto-vectorization (DEFAULT is 4 when the option is specified) /Qguide-profile:<[file|dir]>[,[file|dir],...] specify a loop profiler data file (or set of files in a directory) when using the /Qguide option /Qopt-mem-layout-trans[:] controls the level of memory layout transformations performed by the compiler 0 - disable memory layout transformations (same as /Qopt-mem-layout-trans-) 1 - enable basic memory layout transformations 2 - enable more memory layout transformations (DEFAULT when the option is specified) 3 - enable aggressive memory layout transformations /Qopt-prefetch[:n] enable levels of prefetch insertion, where 0 disables. n may be 0 through 4 inclusive. Default is 2. /Qopt-prefetch- disable(DEFAULT) prefetch insertion. Equivalent to /Qopt-prefetch:0 Profile Guided Optimization (PGO) --------------------------------- /Qprof-dir specify directory for profiling output files (*.dyn and *.dpi) /Qprof-src-root specify project root directory for application source files to enable relative path resolution during profile feedback on sources below that directory /Qprof-src-root-cwd specify the current directory as the project root directory for application source files to enable relative path resolution during profile feedback on sources below that directory /Qprof-src-dir[-] specify whether directory names of sources should be considered when looking up profile records within the .dpi file /Qprof-file specify file name for profiling summary file /Qprof-data-order[-] enable/disable(DEFAULT) static data ordering with profiling /Qprof-func-order[-] enable/disable(DEFAULT) function ordering with profiling /Qprof-gen[:keyword] instrument program for profiling. 
Optional keyword may be srcpos or globdata /Qprof-gen- disable profiling instrumentation /Qprof-use[:] enable use of profiling information during optimization weighted - invokes profmerge with -weighted option to scale data based on run durations [no]merge - enable(default)/disable the invocation of the profmerge tool /Qprof-use- disable use of profiling information during optimization /Qcov-gen instrument program for profiling /Qcov-dir specify directory for profiling output files (*.dyn and *.dpi) /Qcov-file specify file name for profiling summary file /Qinstrument-functions[-] determine whether function entry and exit points are instrumented /Qprof-hotness-threshold: set the hotness threshold for function grouping and function ordering val indicates percentage of functions to be placed in hot region. This option requires /Qprof-use and /Qprof-func-order /Qprof-value-profiling:[,,...] limit value profiling none - inhibit all types of value profiling nodivide - inhibit value profiling of non-compile time constants used in division or remainder operations noindcall - inhibit value profiling of function addresses at indirect call sites /Qprofile-functions enable instrumentation in generated code for collecting function execution time profiles /Qprofile-loops: enable instrumentation in generated code for collecting loop execution time profiles inner - instrument inner loops outer - instrument outer loops all - instrument all loops /Qprofile-loops-report: Control the level of instrumentation inserted for reporting loop execution profiles 1 - report loop times 2 - report loop times and iteration counts Optimization Reports -------------------- /Qvec-report[n] control amount of vectorizer diagnostic information n=0 no diagnostic information n=1 indicate vectorized loops (DEFAULT when enabled) n=2 indicate vectorized/non-vectorized loops n=3 indicate vectorized/non-vectorized loops and prohibiting data dependence information n=4 indicate non-vectorized loops n=5 indicate non-vectorized loops and prohibiting data dependence information n=6 indicate vectorized/non-vectorized loops with greater details and prohibiting data dependence information n=7 indicate vector code quality message ids and data values for vectorized loops /Qopt-report[:n] generate an optimization report to stderr 0 disable optimization report output 1 minimum report output 2 medium output (DEFAULT when enabled) 3 maximum report output /Qopt-report-file: specify the filename for the generated report /Qopt-report-phase: specify the phase that reports are generated against /Qopt-report-routine: reports on routines containing the given name /Qopt-report-help display the optimization phases available for reporting /Qtcheck[:mode] enable analysis of threaded applications (requires Intel(R) Thread Checker; cannot be used with compiler alone) tci - instruments a program to perform a thread-count-independent analysis tcd - instruments a program to perform a thread-count-dependent analysis (DEFAULT when mode is not used) api - instruments a program at the api-imports level /Qtcollect[:] inserts instrumentation probes calling the Intel(R) Trace Collector API. The library .lib is linked in the default being VT.lib (requires Intel(R) Trace Collector) /Qtcollect-filter:file Enable or disable the instrumentation of specified functions. 
(requires Intel(R) Trace Collector) OpenMP* and Parallel Processing ------------------------------ /Qopenmp enable the compiler to generate multi-threaded code based on the OpenMP* directives (same as /openmp) /Qopenmp-stubs enables the user to compile OpenMP programs in sequential mode. The OpenMP directives are ignored and a stub OpenMP library is linked (sequential) /Qopenmp-report{0|1|2} control the OpenMP parallelizer diagnostic level /Qopenmp-lib: choose which OpenMP library version to link with compat - use the Microsoft compatible OpenMP run-time libraries (DEFAULT) /Qopenmp-threadprivate: choose which threadprivate implementation to use compat - use the Microsoft compatible thread local storage legacy - use the Intel compatible implementation (DEFAULT) /Qparallel enable the auto-parallelizer to generate multi-threaded code for loops that can be safely executed in parallel /Qpar-report{0|1|2|3} control the auto-parallelizer diagnostic level /Qpar-threshold[n] set threshold for the auto-parallelization of loops where n is an integer from 0 to 100 /Qpar-runtime-control[n] Control parallelizer to generate runtime check code for effective automatic parallelization. n=0 no runtime check based auto-parallelization n=1 generate runtime check code under conservative mode (DEFAULT when enabled) n=2 generate runtime check code under heuristic mode n=3 generate runtime check code under aggressive mode /Qpar-schedule-static[:n] Specifies a scheduling algorithm for DO loop iteration. Divides iterations into contiguous pieces. Size n if specified, equal sized pieces if not. /Qpar-schedule-static_balanced[:n] Divides iterations into even-sized chunks. Size n if specified, equal sized pieces if not. /Qpar-schedule-static-steal[:n] Divides iterations into even-sized chunks, but allows threads to steal parts of chunks from neighboring threads /Qpar-schedule-dynamic[:n] Specifies a scheduling algorithm for DO loop iteration. Assigns iterations to threads in chunks dynamically. Chunk size is n iterations if specified, otherwise 1. /Qpar-schedule-guided[:n] Specifies a scheduling algorithm for DO loop iteration. Indicates a minimum number of iterations. If specified, n is the minimum number, otherwise 1. /Qpar-schedule-guided-analytical[:n] Divides iterations by using exponential distribution or dynamic distributions. /Qpar-schedule-runtime Specifies a scheduling algorithm for DO loop iteration. Defers the scheduling decision until runtime. /Qpar-schedule-auto Lets the compiler or run-time system determine the scheduling algorithm. 
/Qpar-adjust-stack perform fiber-based main thread stack adjustment /Qpar-affinity=[,...][,][,] tune application performance by setting different thread affinity /Qpar-num-threads= tune application performance by setting different number of threads /Qparallel-source-info[:n] enable(DEFAULT)/disable the emission of source location information for parallel code generation with OpenMP and auto-parallelization 0 - disable (same as /Qparallel-source-info-) 1 - emit routine name and line information (DEFAULT) 2 - emit path, file, routine name and line information /Qpar same as /Qparallel Floating Point -------------- /fp: enable floating point model variation except[-] - enable/disable floating point semantics fast[=1|2] - enables more aggressive floating point optimizations precise - allows value-safe optimizations source - enables intermediates in source precision strict - enables /fp:precise /fp:except, disables contractions and enables pragma stdc fenv_access /Qfp-speculation: enable floating point speculations with the following conditions: fast - speculate floating point operations (DEFAULT) safe - speculate only when safe strict - same as off off - disables speculation of floating-point operations /Qpc32 set internal FPU precision to 24 bit significand /Qprec improve floating-point precision (speed impact less than /Op) /Qprec-sqrt[-] determine if certain square root optimizations are enabled /Qprec-div[-] improve precision of FP divides (some speed impact) /Qfast-transcendentals[-] generate a faster version of the transcendental functions /Qfp-port[-] round fp results at assignments and casts (some speed impact) /Qfp-stack-check enable fp stack checking after every function/procedure call /Qrcd rounding mode to enable fast float-to-int conversions /rounding-mode:chopped set internal FPU rounding control to truncate /Qftz[-] enable/disable flush denormal results to zero /fpe:{0|1|3} specifies program-wide behavior on floating point exceptions /fpe-all:{0|1|3} specifies floating point exception behavior on all functions and subroutines. 
Also sets /assume:ieee_fpe_flags
/[no]fltconsistency        specify that improved floating-point consistency should be used
/Qfma[-]                   enable/disable the combining of floating point multiplies and add/subtract operations
/[no]recursive             compile all procedures for possible recursive execution

Inlining
--------
/Ob   control inline expansion:
         n=0  disable inlining (same as /inline:none)
         n=1  inline functions declared with ATTRIBUTES INLINE or FORCEINLINE
         n=2  inline any function, at the compiler's discretion
/Qinline-min-size:         set size limit for inlining small routines
/Qinline-min-size-         no size limit for inlining small routines
/Qinline-max-size:         set size limit for inlining large routines
/Qinline-max-size-         no size limit for inlining large routines
/Qinline-max-total-size:   maximum increase in size for inline function expansion
/Qinline-max-total-size-   no size limit for inline function expansion
/Qinline-max-per-routine:  maximum number of inline instances in any function
/Qinline-max-per-routine-  no maximum number of inline instances in any function
/Qinline-max-per-compile:  maximum number of inline instances in the current compilation
/Qinline-max-per-compile-  no maximum number of inline instances in the current compilation
/Qinline-factor:           set inlining upper limits by n percentage
/Qinline-factor-           do not set inlining upper limits
/Qinline-forceinline       treat inline routines as forceinline
/Qinline-dllimport         allow(DEFAULT)/disallow functions declared DEC$ ATTRIBUTES DLLIMPORT to be inlined
/Qinline-calloc            directs the compiler to inline calloc() calls as malloc()/memset()
/inline[:keyword]          Specifies the level of inline function expansion
         keywords: all (same as /Ob2 /Ot), size (same as /Ob2 /Os),
                   speed (same as /Ob2 /Ot), none or manual (same as /Ob0)

Output, Debug, PCH
------------------
/c             compile to object (.obj) only, do not link
/nolink, /compile-only   same as /c
/S             compile to assembly (.asm) only, do not link
/FAs           produce assembly file with optional source annotations
/FAc           produce assembly file with optional code annotations
/FA            produce assembly file
/Fa[file]      name assembly file (or directory for multiple files; i.e. /FaMYDIR\)
/Fo[file]      name object file (or directory for multiple files; i.e. /FoMYDIR\)
/Fe[file]      name executable file or directory
/object:       specify the name of the object file, or the directory to which object file(s) should be written. (e.g.
/object:MYOBJ or /object:MYDIR\) /exe: specifies the name to be used for the built program (.exe) or dynamic-link (.dll) library /map: specify that a link map file should be generated /list: specify that a listing file should be generated /list-line-len:# overrides the default line length (80) in a listing file /list-page-len:# overrides the default page length (66) in a listing file /show: controls the contents of the listing file keywords: all, none, [no]include, [no]map, [no]options /Zi, /ZI, /Z7 produce symbolic debug information in object file (implies /Od when another optimization option is not explicitly set) /debug[:keyword] enable debug information and control output of enhanced debug information keywords: all, full, minimal, none, [no]inline-debug-info /nodebug do not enable debug information /debug-parameters[:keyword] control output of debug information for PARAMETERS keywords: all, used, none (same as /nodebug-parameters) /nodebug-parameters do not output debug information for PARAMETERS /Qd-lines, /[no]d-lines compile debug statements (indicated by D in column 1) /pdbfile[:filename] specify that debug related information should be generated to a program database file /nopdbfile do not generate debug related information to a program database file /Qtrapuv trap uninitialized variables /RTCu report use of variable that was not initialized /Qmap-opts enable option mapping tool Preprocessor ------------ /D[{=|#}] define macro /define:symbol[=] same as /D /nodefines specifies that any /D macros go to the preprocessor only, and not to the compiler /U remove predefined macro /undefine: remove predefined macro (same as /U) /allow:nofpp-comments If a Fortran end-of-line comment is seen within a #define, treat it as part of the definition. Default is allow:fpp-comments /E preprocess to stdout /EP preprocess to stdout, omitting #line directives /EP /P preprocess to file, omitting #line directives /P preprocess to file /preprocess-only same as /P /[no]keep keep/remove preprocessed file generated by preprocessor as input to compiler stage. Not affected by /Qsave-temps. Default is /nokeep /fpp[n], /[no]fpp run Fortran preprocessor on source files prior to compilation n=0 disable running the preprocessor, equivalent to nofpp n=1,2,3 run preprocessor /module:path specify path where mod files should be placed and first location to look for mod files /u remove all predefined macros /I add directory to include file search path /[no]include: same as /I /X remove standard directories from include file search path /[no]gen-dep[:filename] generate dependency information. If no filename is specified, output to stdout /gen-depformat:keyword generate dependency information in the specified format. One of: make, nmake Component Control ----------------- /Qoption,, pass options to tool specified by /Qlocation,, set as the location of tool specified by Language -------- /[no]altparam specify if alternate form of parameter constant declarations (without parenthesis) is recognized. 
Default is to recognize /assume: specify assumptions made by the optimizer and code generator keywords: none, [no]byterecl, [no]buffered_io, [no]bscc (nobscc same as /nbs), [no]cc_omp, [no]minus0, [no]dummy_aliases (same as /Qcommon-args), [no]ieee_fpe_flags, [no]fpe_summary, [no]old_boz, [no]old_complex_align, [no]old_logical_ldio, [no]old_ldout_format, [no]old_maxminloc, [no]old_unit_star, [no]old_xor, [no]protect_constants, [no]protect_parens, [no]realloc_lhs, [no]2underscore, [no]underscore (same as /us), [no]std_intent_in, [no]std_mod_proc_name, [no]source_include, [no]split_common,[no]writeable_strings /ccdefault: specify default carriage control for units 6 and * keywords: default, fortran, list or none /[no]check: check run-time conditions. Default is /nocheck keywords: all (same as /4Yb, /C), none (same as /nocheck, /4Nb), [no]arg_temp_created, [no]bounds (same as /CB), [no]format, [no]output_conversion, [no]pointer (same as /CA), [no]uninit (same as /CU), [no]stack /Qcommon-args assume "by reference" subprogram arguments may alias one another. Same as /assume:dummy_aliases /[no]extend-source[:] specify rightmost column for fixed form sources keywords: 72 (same as /noextend-source and /4L72), 80 (same as /4L80), 132 (same as /4L132. Default if you specify /extend-source without a keyword.) /fixed specify source files are in fixed format. Same as /FI and /4Nf /nofixed indicates free format /free specify source files are in free format. Same as /FR and /4Yf /nofree indicates fixed format /names: specify how source code identifiers and external names are interpreted. keywords: as_is, lowercase, uppercase /[no]pad-source, /Qpad-source[-] make compiler acknowledge blanks at the end of a line /stand[:] specifies level of conformance with ANSI standard to check for. If keyword is not specified, level of conformance is f03 keywords: f90 (same as /4Ys), f95, f03, none (same as /nostand) /standard-semantics sets assume keywords to conform to the semantics of the f03 standard. May result in performance loss. assume keywords set by /standard-semantics: byterecl, fpe_summary, minus0, noold_maxminloc, noold_unit_star, noold_xor, protect_parens, realloc_lhs, std_intent_in, std_mod_proc_name, noold_ldout_format /syntax-only, /Zs perform syntax and semantic checking only (no object file produced) Compiler Diagnostics -------------------- /w disable all warnings /W disable warnings (n = 0) or show warnings (n = 1 DEFAULT, same as /warn:general) /warn: specifies the level of warning messages issued keywords: all, none (same as /nowarn) [no]alignments, [no]declarations, [no]errors, [no]general, [no]ignore_loc, [no]interfaces, [no]stderrors, [no]truncated_source, [no]uncalled, [no]unused, [no]usage /nowarn suppress all warning messages /WB turn a compile-time bounds check into a warning /[no]traceback specify whether the compiler generates PC correlation data used to display a symbolic traceback rather than a hexadecimal traceback at runtime failure /[no]gen-interfaces [[no]source] generate interface blocks for all routines in the file. Can be checked using -warn interfaces nosource indicates temporary source files should not be saved /error-limit: specify the maximum number of error-level or fatal-level compiler errors allowed /noerror-limit set no maximum number on error-level or fatal-level error messages /Qdiag-enable:[,,...] enable the specified diagnostics or diagnostic groups /Qdiag-disable:[,,...] 
disable the specified diagnostics or diagnostic groups
where may be individual diagnostic numbers or group names, where group names include:
   sc[n]        - perform source code analysis: n=1 for critical errors,
                  n=2 for all errors and n=3 for all errors and warnings
   sc- {full|concise|precise} - perform static analysis and determine the analysis mode.
                  Full mode - attempts to find all program weaknesses, even at the expense of more false positives.
                  Concise mode - attempts to reduce false positives somewhat more than reducing false negatives.
                  Precise mode - attempts to avoid all false positives
                  Default: full if /Qdiag-enable:sc{[1|2|3]} is present; otherwise None (static analysis diagnostics are disabled).
   sc-include   - perform source code analysis on include files
   sc-single-file - This option tells static analysis to process each file individually. Default: OFF
   sc-enums     - This option tells static analysis to treat enumeration variables as known values
                  equal to any one of the associated enumeration literals. Default: OFF
   sc-parallel[n] - perform analysis of parallelization in source code: n=1 for critical errors,
                  n=2 for errors, n=3 for all errors and warnings
   warn         - diagnostic messages that have "warning" severity level.
   error        - diagnostic messages that have "error" severity level.
   remark       - diagnostic messages that are remarks or comments.
   vec          - diagnostic messages issued by the vectorizer.
   par          - diagnostic messages issued by the auto-parallelizer
   openmp       - diagnostic messages issued by the OpenMP* parallelizer.
   cpu-dispatch - Specifies the CPU dispatch remarks.
/Qdiag-error:[,,...]     output the specified diagnostics or diagnostic groups as errors
/Qdiag-warning:[,,...]   output the specified diagnostics or diagnostic groups as warnings
/Qdiag-remark:[,,...]    output the specified diagnostics or diagnostic groups as remarks
/Qdiag-dump              display the currently enabled diagnostic messages to stdout or to a specified diagnostic output file.
/Qdiag-sc-dir:           directory where diagnostics from static analysis are created, rather than current working directory.
/Qdiag-file[:]           where diagnostics are emitted to. Not specifying this causes messages to be output to stderr
/Qdiag-file-append[:]    where diagnostics are emitted to. When already exists, output is appended to the file
/Qdiag-id-numbers[-]     enable(DEFAULT)/disable the diagnostic specifiers to be output in numeric form
/Qdiag-error-limit:      specify the maximum number of errors emitted

Miscellaneous
-------------
/[no]logo        display compiler version information. /nologo disables the output
/Qsox[:[,keyword]]   enable saving of compiler options, version and additional information in the
                 executable. Use /Qsox- to disable(DEFAULT)
         profile - include profiling data
         inline  - include inlining information
/bintext:        place the string specified into the object file and executable
/Qsave-temps     store the intermediate files in current directory and name them based on the source file.
Only saves files that are generated by default /what display detailed compiler version information /watch: tells the driver to output processing information keywords: all, none (same as /nowatch), [no]source, [no]cmd [no]mic-cmd /nowatch suppress processing information output (DEFAULT) /Tf compile file as Fortran source /extfor: specify extension of file to be recognized as a Fortran file /extfpp: specify extension of file to be recognized as a preprocessor file /libdir[:keyword] control the library names that should be emitted into the object file keywords: all, none (same as /nolibdir), [no]automatic, [no]user /nolibdir no library names should be emitted into the object file /MP[] create multiple processes that can be used to compile large numbers of source files at the same time /bigobj generate objects with increased address capacity Data ---- /4I{2|4|8} set default KIND of integer and logical variables to 2, 4, or 8 /integer-size: specifies the default size of integer and logical variables size: 16, 32, 64 /4R{8|16} set default size of real to 8 or 16 bytes /real-size: specify the size of REAL and COMPLEX declarations, constants, functions, and intrinsics size: 32, 64, 128 /Qautodouble same as /real-size:64 or /4R8 /double-size: defines the size of DOUBLE PRECISION and DOUBLE COMPLEX declarations, constants, functions, and intrinsics size: 64, 128 /[no]fpconstant extends the precision of single precision constants assigned to double precision variables to double precision /[no]intconstant use Fortran 77 semantics, rather than Fortran 90/95, to determine kind of integer constants /auto make all local variables AUTOMATIC /Qauto-scalar make scalar local variables AUTOMATIC (DEFAULT) /Qsave save all variables (static allocation) (same as /noauto, opposite of /auto) /Qzero[-] enable/disable(DEFAULT) implicit initialization to zero of local scalar variables of intrinsic type INTEGER, REAL, COMPLEX, or LOGICAL that are saved and not initialized /Qdyncom make given common blocks dynamically-allocated /Zp[n] specify alignment constraint for structures (n=1,2,4,8,16 /Zp16 DEFAULT) /[no]align analyze and reorder memory layout for variables and arrays /align: specify how data items are aligned keywords: all (same as /align), none (same as /noalign), [no]commons, [no]dcommons, [no]qcommons, [no]zcommons, rec1byte, rec2byte, rec4byte, rec8byte, rec16byte, rec32byte, array8byte, array16byte, array32byte, array64byte, array128byte, array256byte, [no]records, [no]sequence /GS enable overflow security checks. /GS- disables (DEFAULT) /Qpatchable-addresses generate code such that references to statically assigned addresses can be patched with arbitrary 64-bit addresses. /Qfnalign[-] align the start of functions to an optimal machine-dependent value. 
When disabled (DEFAULT) align on a 2-byte boundary /Qfnalign:[2|16] align the start of functions on a 2 (DEFAULT) or 16 byte boundary /Qglobal-hoist[-] enable(DEFAULT)/disable external globals are load safe /Qkeep-static-consts[-] enable/disable(DEFAULT) emission of static const variables even when not referenced /Qnobss-init disable placement of zero-initialized variables in BSS (use DATA) /Qzero-initialized-in-bss[-] put explicitly zero initialized variables into the DATA section instead of the BSS section /convert: specify the format of unformatted files containing numeric data keywords: big_endian, cray, ibm, little_endian, native, vaxd, vaxg /Qimf-absolute-error:value[:funclist] define the maximum allowable absolute error for math library function results value - a positive, floating-point number conforming to the format [digits][.digits][{e|E}[sign]digits] funclist - optional comma separated list of one or more math library functions to which the attribute should be applied /Qimf-accuracy-bits:bits[:funclist] define the relative error, measured by the number of correct bits, for math library function results bits - a positive, floating-point number funclist - optional comma separated list of one or more math library functions to which the attribute should be applied /Qimf-arch-consistency:value[:funclist] ensures that the math library functions produce consistent results across different implementations of the same architecture value - true or false funclist - optional comma separated list of one or more math library functions to which the attribute should be applied /Qimf-max-error:ulps[:funclist] defines the maximum allowable relative error, measured in ulps, for math library function results ulps - a positive, floating-point number conforming to the format [digits][.digits][{e|E}[sign]digits] funclist - optional comma separated list of one or more math library functions to which the attribute should be applied /Qimf-precision:value[:funclist] defines the accuracy (precision) for math library functions value - defined as one of the following values high - equivalent to max-error = 0.6 medium - equivalent to max-error = 4 (DEFAULT) low - equivalent to accuracy-bits = 11 (single precision); accuracy-bits = 26 (double precision) funclist - optional comma separated list of one or more math library functions to which the attribute should be applied Compatibility ------------- /fpscomp[:] specify the level of compatibility to adhere to with Fortran PowerStation keywords: all, none (same as /nofpscomp), [no]filesfromcmd, [no]general, [no]ioformat, [no]ldio_spacing, [no]libs, [no]logicals /nofpscomp no specific level of compatibility with Fortran PowerStation /f66 allow extensions that enhance FORTRAN-66 compatibility /f77rtl specify that the Fortran 77 specific run-time support should be used /nof77rtl disables /vms enable VMS I/O statement extensions /Qvc enable compatibility with a specific Microsoft* Visual Studio version 9 - Microsoft* Visual Studio 2008 compatibility 10 - Microsoft* Visual Studio 2010 compatibility 11 - Microsoft* Visual Studio 2012 compatibility Linking/Linker -------------- /link specify that all options following '/link' are for the linker /extlnk: specify extension of file to be passed directly to linker /F set the stack reserve amount specified to the linker /dbglibs use the debug version of runtime libraries, when appropriate /libs: specifies which type of run-time library to link to. 
keywords: static, dll, qwin, qwins /LD[d] produce a DLL instead of an EXE ('d' = debug version) /dll same as /LD /MD[d] use dynamically-loaded, multithread C runtime /MDs[d] use dynamically-loaded, singlethread Fortran runtime, and multithread C runtime /MT[d] use statically-linked, multithread C runtime (DEFAULT with Microsoft Visual Studio 2005 and later) /ML[d] use statically-linked, single thread C runtime (only valid in Microsoft Visual Studio 2003 environment) /MG, /winapp use Windows API runtime libraries /Zl omit library names from object file /threads specify that multi-threaded libraries should be linked against /nothreads disables multi-threaded libraries Deprecated Options ------------------ /Qinline-debug-info use /debug:inline-debug-info /Gf use /GF /ML[d] upgrade to /MT[d] /Quse-asm No replacement /Qprof-genx use /Qprof-gen:srcpos /Qdiag-enable:sv[] use /Qdiag-enable:sc[] /Qdiag-enable:sv-include use /Qdiag-enable:sc-include /Qdiag-sv use /Qdiag-enable:sc[] /Qdiag-sv-error use /Qdiag-disable:warning /Qdiag-sv-include use /Qdiag-enable:sc-include /Qdiag-sv-level No replacement /Qdiag-sv-sup use /Qdiag-disable:[,,...] /Qtprofile No replacement /arch:SSE use /arch:IA32 /QxK upgrade to /arch:SSE2 /QaxK upgrade to /arch:SSE2 /QxW use /arch:SSE2 /QaxW use /arch:SSE2 /QxN use /QxSSE2 /QaxN use /QaxSSE2 /QxP use /QxSSE3 /QaxP use /QaxSSE3 /QxT use /QxSSSE3 /QaxT use /QaxSSSE3 /QxS use /QxSSE4.1 /QaxS use /QaxSSE4.1 /QxH use /QxSSE4.2 /QaxH use /QaxSSE4.2 /QxO use /arch:SSE3 /Qvc7.1 No replacement /QIfist use /Qrcd /QxSSE3_ATOM use /QxSSSE3_ATOM /Qrct No replacement /Op use /fltconsistency /debug:partial No replacement /tune: use /Qx /architecture: use /arch: /1, /Qonetrip use /f66 /Fm use /map /Qcpp, /Qfpp use /fpp /Qdps use /altparam /Qextend-source use /extend-source /Qlowercase use /names:lowercase /Quppercase use /names:uppercase /Qvms use /vms /asmattr:keyword use /FA[c|s|cs] /noasmattr,/asmattr:none use /FA /asmfile use /Fa /automatic use /auto /cm use /warn:nousage /optimize:0 use /Od /optimize:1,2 use /O1 /optimize:3,4 use /O2 /optimize:5 use /O3 /source use /Tf /unix No replacement /us use /assume:underscore /unroll use /Qunroll /w90, /w95 No replacement /Zd use /debug:minimal /help, /? [category] print full or category help message Valid categories include advanced - Advanced Optimizations codegen - Code Generation compatibility - Compatibility component - Component Control data - Data deprecated - Deprecated Options diagnostics - Compiler Diagnostics float - Floating Point help - Help inline - Inlining ipo - Interprocedural Optimization (IPO) language - Language link - Linking/Linker misc - Miscellaneous opt - Optimization output - Output pgo - Profile Guided Optimization (PGO) preproc - Preprocessor reports - Optimization Reports openmp - OpenMP and Parallel Processing Copyright (C) 1985-2013, Intel Corporation. All rights reserved. * Other names and brands may be claimed as the property of others. 
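An aside on the /Qpar-schedule-* options listed above: they mirror the schedule kinds that OpenMP itself defines, so the same behavior can also be requested explicitly in source code rather than left to the auto-parallelizer. A minimal C sketch, assuming a compiler invoked with /Qopenmp (ifort/icl) or /openmp (cl, OpenMP 2.0); the loop body and the chunk size of 4 are illustrative only, not taken from this log:

#include <stdio.h>
#include <omp.h>

int main(void)
{
  double sum = 0.0;
  int i;
  /* schedule(dynamic,4): hand chunks of 4 iterations to threads on demand,
   * the same policy /Qpar-schedule-dynamic:4 requests from the
   * auto-parallelizer; reduction(+:sum) keeps the accumulation race-free */
  #pragma omp parallel for schedule(dynamic,4) reduction(+:sum)
  for (i = 0; i < 1000; i++)
    sum += 1.0/(i + 1);
  printf("sum = %f using up to %d threads\n", sum, omp_get_max_threads());
  return 0;
}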
Trying FC compiler flag -Z7 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Added FC compiler flag -Z7 Popping language FC ================================================================================ TEST configureDebugging from PETSc.utilities.debugging(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/debugging.py:25) TESTING: configureDebugging from PETSc.utilities.debugging(config/PETSc/utilities/debugging.py:25) Defined "USE_ERRORCHECKING" to "1" ================================================================================ TEST configureArchitecture from PETSc.utilities.arch(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/arch.py:32) TESTING: configureArchitecture from PETSc.utilities.arch(config/PETSc/utilities/arch.py:32) Checks PETSC_ARCH and sets if not set Defined "ARCH" to ""arch-mswin-c-debug"" ================================================================================ TEST checkSharedDynamicPicOptions from PETSc.utilities.sharedLibraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/sharedLibraries.py:42) TESTING: checkSharedDynamicPicOptions from PETSc.utilities.sharedLibraries(config/PETSc/utilities/sharedLibraries.py:42) ================================================================================ TEST configureSharedLibraries from PETSc.utilities.sharedLibraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/sharedLibraries.py:63) TESTING: configureSharedLibraries from PETSc.utilities.sharedLibraries(config/PETSc/utilities/sharedLibraries.py:63) Checks whether shared libraries should be used, for which you must - Specify --with-shared-libraries - Have found a working shared linker Defines PETSC_USE_SHARED_LIBRARIES if they are used Defined make rule "shared_arch" with dependencies "" and code [] Defined make macro "BUILDSHAREDLIB" to "no" Defined "HAVE_SHARED_LIBRARIES" to "1" Shared libraries - disabled ================================================================================ TEST configureDynamicLibraries from PETSc.utilities.sharedLibraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/sharedLibraries.py:102) TESTING: configureDynamicLibraries from PETSc.utilities.sharedLibraries(config/PETSc/utilities/sharedLibraries.py:102) Checks whether dynamic loading should be used, for which you must - Specify --with-dynamic-loading - Have found a working dynamic linker (with dlfcn.h and libdl) Defines PETSC_USE_DYNAMIC_LIBRARIES if they are used Dynamic loading - disabled ================================================================================ TEST configureBmakeDir from PETSc.utilities.bmakeDir(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/bmakeDir.py:26) TESTING: configureBmakeDir from PETSc.utilities.bmakeDir(config/PETSc/utilities/bmakeDir.py:26) Makes $PETSC_ARCH and subdirectories if it does not exist Changed persistence directory to arch-mswin-c-debug/conf sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl 
--help
sh: Win32 Development Tool Front End, version 1.10.1 Wed Oct 19 20:45:06 CDT 2011
Usage: win32fe --
 - must be the first argument to win32fe
 : {cl,icl,df,f90,ifl,bcc32,lib,tlib}
      cl:    Microsoft 32-bit C/C++ Optimizing Compiler
      icl:   Intel C/C++ Optimizing Compiler
      df:    Compaq Visual Fortran Optimizing Compiler
      f90:   Compaq Visual Fortran90 Optimizing Compiler
      ifl:   Intel Fortran Optimizing Compiler
      ifort: Intel Fortran Optimizing Compiler
      nvcc:  NVIDIA CUDA Compiler Driver
      bcc32: Borland C++ for Win32
      lib:   Microsoft Library Manager
      tlib:  Borland Library Manager
 :
      --help:       Output this help message and help for
      --autodetect: Attempt automatic detection of installation
      --path :      specifies an addition to the PATH that is required (ex. the location of a required .dll)
      --use :       specifies the variant of to use
      --verbose:    Echo to stdout the translated commandline and other diagnostic information
      --version:    Output version info for win32fe and
      --wait_for_debugger: Inserts an infinite wait after creation of and outputs PID so one can
                    manually attach a debugger to the current process. In the debugger, one must set:
                    tool::waitfordebugger = 0 to continue the execution normally.
      --win-l:      For compilers, define -lfoo to link foo.lib instead of libfoo.lib
      --woff:       Suppress win32fe specific warning messages
=================================================================================
For compilers: win32fe will map the following to their native options:
      -c:  Compile Only, generates an object file with .o extension.
           This will invoke the compiler once for each file listed.
      -l:  Link the file lib.lib or if using --win-l also, .lib
      -o : Output= context dependent
      -D:  Define
      -I:  Add to the include path
      -L:  Add to the link path
      -g:  Generate debug symbols in objects when specified for compilation, and in executables when
           specified for linking (some compilers require specification at both times for full debugging support).
      -O:  Enable compiletime and/or linktime optimizations.
Ex: win32fe cl -g -c foo.c --verbose -Iinclude
Note: win32fe will automatically find the system library paths and system include paths,
relieving the user of the need to invoke a particular shell.
=========================================================================
cl specific help:
win32fe uses -nologo by default for nonverbose output. Use the flag: -logo to disable this feature.
-g is identical to -Z7.  -O is identical to -O2.
=========================================================================
Microsoft (R) C/C++ Optimizing Compiler Version 16.00.30319.01 for x64
Copyright (C) Microsoft Corporation.  All rights reserved.
C/C++ COMPILER OPTIONS -OPTIMIZATION- /O1 minimize space /O2 maximize speed /Ob inline expansion (default n=0) /Od disable optimizations (default) /Og enable global optimization /Oi[-] enable intrinsic functions /Os favor code space /Ot favor code speed /Ox maximum optimizations /favor: select processor to optimize for, one of: blend - a combination of optimizations for several different x64 processors AMD64 - 64-bit AMD processors INTEL64 - Intel(R)64 architecture processors -CODE GENERATION- /GF enable read-only string pooling /Gm[-] enable minimal rebuild /Gy[-] separate functions for linker /GS[-] enable security checks /GR[-] enable C++ RTTI /GX[-] enable C++ EH (same as /EHsc) /EHs enable C++ EH (no SEH exceptions) /EHa enable C++ EH (w/ SEH exceptions) /EHc extern "C" defaults to nothrow /fp: choose floating-point model: except[-] - consider floating-point exceptions when generating code fast - "fast" floating-point model; results are less predictable precise - "precise" floating-point model; results are predictable strict - "strict" floating-point model (implies /fp:except) /Qfast_transcendentals generate inline FP intrinsics even with /fp:except /GL[-] enable link-time code generation /GA optimize for Windows Application /Ge force stack checking for all funcs /Gs[num] control stack checking calls /Gh enable _penter function call /GH enable _pexit function call /GT generate fiber-safe TLS accesses /RTC1 Enable fast checks (/RTCsu) /RTCc Convert to smaller type checks /RTCs Stack Frame runtime checking /RTCu Uninitialized local usage checks /clr[:option] compile for common language runtime, where option is: pure - produce IL-only output file (no native executable code) safe - produce IL-only verifiable output file oldSyntax - accept the Managed Extensions syntax from Visual C++ 2002/2003 initialAppDomain - enable initial AppDomain behavior of Visual C++ 2002 noAssembly - do not produce an assembly /homeparams Force parameters passed in registers to be written to the stack /GZ Enable stack checks (/RTCs) /arch:AVX enable use of Intel(R) Advanced Vector Extensions instructions -OUTPUT FILES- /Fa[file] name assembly listing file /FA[scu] configure assembly listing /Fd[file] name .PDB file /Fe name executable file /Fm[file] name map file /Fo name object file /Fp name precompiled header file /Fr[file] name source browser file /FR[file] name extended .SBR file /Fi[file] name preprocessed file /doc[file] process XML documentation comments and optionally name the .xdc file -PREPROCESSOR- /AI add to assembly search path /FU forced using assembly/module /C don't strip comments /D{=|#} define macro /E preprocess to stdout /EP preprocess to stdout, no #line /P preprocess to file /Fx merge injected code to file /FI name forced include file /U remove predefined macro /u remove all predefined macros /I add to include search path /X ignore "standard places" -LANGUAGE- /Zi enable debugging information /Z7 enable old-style debug info /Zp[n] pack structs on n-byte boundary /Za disable extensions /Ze enable extensions (default) /Zl omit default library name in .OBJ /Zg generate function prototypes /Zs syntax check only /vd{0|1|2} disable/enable vtordisp /vm type of pointers to members /Zc:arg1[,arg2] C++ language conformance, where arguments can be: forScope[-] - enforce Standard C++ for scoping rules wchar_t[-] - wchar_t is the native type, not a typedef auto[-] - enforce the new Standard C++ meaning for auto trigraphs[-] - enable trigraphs (off by default) /openmp enable OpenMP 2.0 language 
extensions -MISCELLANEOUS- @ options response file /?, /help print this help message /bigobj generate extended object format /c compile only, no link /errorReport:option Report internal compiler errors to Microsoft none - do not send report prompt - prompt to immediately send report queue - at next admin logon, prompt to send report (default) send - send report automatically /FC use full pathnames in diagnostics /H max external name length /J default char type is unsigned /MP[n] use up to 'n' processes for compilation /nologo suppress copyright message /showIncludes show include file names /Tc compile file as .c /Tp compile file as .cpp /TC compile all files as .c /TP compile all files as .cpp /V set version string /w disable all warnings /wd disable warning n /we treat warning n as an error /wo issue warning n once /w set warning level 1-4 for n /W set warning level (default n=1) /Wall enable all warnings /WL enable one line diagnostics /WX treat warnings as errors /Yc[file] create .PCH file /Yd put debug info in every .OBJ /Yl[sym] inject .PCH ref for debug lib /Yu[file] use .PCH file /Y- disable all PCH options /Zm max memory alloc (% of default) /Wp64 enable 64 bit porting warnings -LINKING- /LD Create .DLL /LDd Create .DLL debug library /LN Create a .netmodule /F set stack size /link [linker options and libraries] /MD link with MSVCRT.LIB /MT link with LIBCMT.LIB /MDd link with MSVCRTD.LIB debug lib /MTd link with LIBCMTD.LIB debug lib ================================================================================ TEST checkRestrict from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:134) TESTING: checkRestrict from config.compilers(config/BuildSystem/config/compilers.py:134) Check for the C/CXX restrict keyword Pushing language C All intermediate test results are stored in /tmp/petsc-1nzsmm/config.compilers sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c sh: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'x' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(5) : error C2065: 'x' : undeclared identifier Possible ERROR while running compiler: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'x' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(5) : error C2065: 'x' : undeclared identifier ret = 512 Source: #include "confdefs.h" #include "conffix.h" int main() { float * restrict x;; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c sh: conftest.c 
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'x' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(5) : error C2065: 'x' : undeclared identifier Possible ERROR while running compiler: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'x' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(5) : error C2065: 'x' : undeclared identifier ret = 512 Source: #include "confdefs.h" #include "conffix.h" int main() { float * __restrict__ x;; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { float * __restrict x;; return 0; } compilers: Set C restrict keyword to __restrict Defined "C_RESTRICT" to "__restrict" Popping language C ================================================================================ TEST checkCFormatting from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:306) TESTING: checkCFormatting from config.compilers(config/BuildSystem/config/compilers.py:306) Activate format string checking if using the GNU compilers ================================================================================ TEST checkCStaticInline from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:105) TESTING: checkCStaticInline from config.compilers(config/BuildSystem/config/compilers.py:105) Check for C keyword: static inline Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c sh: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(3) : error C2054: expected '(' to follow 'inline' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(3) : error C2085: 'foo' : not in formal parameter list C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(3) : error C2143: syntax error : missing ';' before '{' Possible ERROR while running compiler: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(3) : error C2054: expected '(' to follow 'inline' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(3) : error C2085: 'foo' : not in formal parameter list C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.c(3) : error C2143: syntax error : missing ';' before '{' ret = 512 Source: #include "confdefs.h" #include "conffix.h" static inline int foo(int a) {return a;} int main() { int i = foo(1);; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers 
-I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" static __inline int foo(int a) {return a;} int main() { int i = foo(1);; return 0; } compilers: Set C StaticInline keyword to static __inline Popping language C Defined "C_STATIC_INLINE" to "static __inline" ================================================================================ TEST checkDynamicLoadFlag from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:317) TESTING: checkDynamicLoadFlag from config.compilers(config/BuildSystem/config/compilers.py:317) Checks that dlopen() takes RTLD_XXX, and defines PETSC_HAVE_RTLD_XXX if it does ================================================================================ TEST checkCLibraries from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:161) TESTING: checkCLibraries from config.compilers(config/BuildSystem/config/compilers.py:161) Determines the libraries needed to link with C Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -v -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -v -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-v' Popping language C compilers: Checking arg cl compilers: Unknown arg cl compilers: Checking arg : compilers: Unknown arg : compilers: Checking arg Command compilers: Unknown arg Command compilers: Checking arg line compilers: Unknown arg line compilers: Checking arg warning compilers: Unknown arg warning compilers: Checking arg D9002 compilers: Unknown arg D9002 compilers: Checking arg : compilers: Unknown arg : compilers: Checking arg ignoring compilers: Unknown arg ignoring compilers: Checking arg unknown compilers: Unknown arg unknown compilers: Checking arg option compilers: Unknown arg option compilers: Libraries needed to link C code with another linker: [] compilers: Check that C libraries can be used from Fortran Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: 
/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: Popping language FC ================================================================================ TEST checkDependencyGenerationFlag from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:1282) TESTING: checkDependencyGenerationFlag from config.compilers(config/BuildSystem/config/compilers.py:1282) Check if -MMD works for dependency generation, and add it if it does Pushing language C Trying C compiler flag -MMD sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language 
C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MMD -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MMD -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-MMD' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejecting C linker flag -MMD due to cl : Command line warning D9002 : ignoring unknown option '-MMD' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejected C compiler flag -MMD because linker cannot handle it Trying C compiler flag -M sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -M -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -M -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-M' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejecting C linker flag -M due to cl : Command line warning D9002 : ignoring unknown option '-M' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejected C compiler flag -M because linker cannot handle it Popping language C Pushing language Cxx Trying Cxx compiler flag -MMD sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MMD -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MMD -MT -GR -EHsc -Z7 -Zm200 
/tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-MMD' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejecting Cxx linker flag -MMD due to cl : Command line warning D9002 : ignoring unknown option '-MMD' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejected Cxx compiler flag -MMD because linker cannot handle it Trying Cxx compiler flag -M sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -M -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -M -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-M' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejecting Cxx linker flag -M due to cl : Command line warning D9002 : ignoring unknown option '-M' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejected Cxx compiler flag -M because linker cannot handle it Popping language Cxx Pushing language FC Trying FC compiler flag -MMD sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MMD -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MMD -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: ifort: command line warning #10006: ignoring unknown option '/MMD' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejecting FC linker flag -MMD due to ifort: command line warning #10006: ignoring unknown option '/MMD' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Rejected FC compiler flag 
-MMD because linker cannot handle it
Trying FC compiler flag -M
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F
sh:
Successful compile:
Source:
      program main
      end
Pushing language FC
Popping language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -M -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -M -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
sh: ifort: command line error: option '/M' is ambiguous
Possible ERROR while running linker: output: ifort: command line error: option '/M' is ambiguous
ret = 256
Pushing language FC
Popping language FC
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -M -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o
Source:
      program main
      end
Rejecting linker flag -M due to nonzero status from link
Rejected FC compiler flag -M because linker cannot handle it
Popping language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl --help
sh: [win32fe and cl --help output omitted: byte-for-byte identical to the help listing shown above]
================================================================================
TEST checkRestrict from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:134)
TESTING: checkRestrict from config.compilers(config/BuildSystem/config/compilers.py:134)
  Check for the C/CXX restrict keyword
Pushing language Cxx
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc
sh: conftest.cc
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.cc(5) : error C2146: syntax error : missing ';' before identifier 'x'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.cc(5) : error C2065: 'x' : undeclared identifier Possible ERROR while running compiler: conftest.cc C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.cc(5) : error C2146: syntax error : missing ';' before identifier 'x' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.cc(5) : error C2065: 'x' : undeclared identifier ret = 512 Source: #include "confdefs.h" #include "conffix.h" int main() { float * restrict x;; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc sh: conftest.cc C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.cc(5) : error C2146: syntax error : missing ';' before identifier 'x' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.cc(5) : error C2065: 'x' : undeclared identifier Possible ERROR while running compiler: conftest.cc C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.cc(5) : error C2146: syntax error : missing ';' before identifier 'x' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.cc(5) : error C2065: 'x' : undeclared identifier ret = 512 Source: #include "confdefs.h" #include "conffix.h" int main() { float * __restrict__ x;; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { float * __restrict x;; return 0; } compilers: Set Cxx restrict keyword to __restrict Defined "CXX_RESTRICT" to "__restrict" Popping language Cxx ================================================================================ TEST checkCxxNamespace from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:355) TESTING: checkCxxNamespace from config.compilers(config/BuildSystem/config/compilers.py:355) Checks that C++ compiler supports namespaces, and if it does defines HAVE_CXX_NAMESPACE Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" namespace petsc {int dummy;} int main() { ; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl 
-c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" template struct a {}; namespace trouble{ template struct a : public ::a {}; } trouble::a uugh; int main() { ; return 0; } Popping language Cxx compilers: C++ has namespaces Defined "HAVE_CXX_NAMESPACE" to "1" ================================================================================ TEST checkCxxOptionalExtensions from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:330) TESTING: checkCxxOptionalExtensions from config.compilers(config/BuildSystem/config/compilers.py:330) Check whether the C++ compiler (IBM xlC, OSF5) need special flag for .c files which contain C++ Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' Possible ERROR while running compiler: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' ret = 512 Source: #include "confdefs.h" #include "conffix.h" int main() { class somename { int i; };; return 0; } Rejecting compiler flag due to nonzero status from link Rejecting compiler flag due to conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc 
-Z7 -Zm200 -+ /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -+ /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: cl : Command line warning D9002 : ignoring unknown option '-+' conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' Possible ERROR while running compiler: cl : Command line warning D9002 : ignoring unknown option '-+' conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' ret = 512 Source: #include "confdefs.h" #include "conffix.h" int main() { class somename { int i; };; return 0; } Rejecting compiler flag -+ due to nonzero status from link Rejecting compiler flag -+ due to cl : Command line warning D9002 : ignoring unknown option '-+' conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -x cxx -tlocal /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -x cxx -tlocal /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: Error: win32fe: Input File Not Found: C:\cygwin\packages\PETSC-~1.2\cxx Possible ERROR while running compiler: Error: win32fe: Input File Not Found: C:\cygwin\packages\PETSC-~1.2\cxx ret = 32512 Source: #include "confdefs.h" #include "conffix.h" int main() { class somename { int i; };; return 0; } Rejecting compiler flag -x cxx -tlocal due to nonzero status from link Rejecting compiler flag -x cxx -tlocal due to Error: win32fe: Input File Not Found: C:\cygwin\packages\PETSC-~1.2\cxx PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -Kc++ /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: 
/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -Kc++ /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: cl : Command line warning D9002 : ignoring unknown option '-Kc++' conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' Possible ERROR while running compiler: cl : Command line warning D9002 : ignoring unknown option '-Kc++' conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' ret = 512 Source: #include "confdefs.h" #include "conffix.h" int main() { class somename { int i; };; return 0; } Rejecting compiler flag -Kc++ due to nonzero status from link Rejecting compiler flag -Kc++ due to cl : Command line warning D9002 : ignoring unknown option '-Kc++' conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'class' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2146: syntax error : missing ';' before identifier 'somename' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2065: 'somename' : undeclared identifier C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.c(5) : error C2143: syntax error : missing ';' before '{' PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { class somename { int i; };; return 0; } Added Cxx compiler flag -TP Popping language Cxx ================================================================================ TEST checkCxxStaticInline from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:119) TESTING: checkCxxStaticInline from config.compilers(config/BuildSystem/config/compilers.py:119) Check for C++ keyword: static inline Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.compilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o 
/tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.compilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" static inline int foo(int a) {return a;} int main() { int i = foo(1);; return 0; } compilers: Set Cxx StaticInline keyword to static inline Popping language Cxx Defined "CXX_STATIC_INLINE" to "static inline" ================================================================================ TEST checkCxxLibraries from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:370) TESTING: checkCxxLibraries from config.compilers(config/BuildSystem/config/compilers.py:370) Determines the libraries needed to link with C++ Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.compilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.compilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -v -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -v -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.compilers/conftest.o sh: cl : Command line warning D9002 : ignoring unknown option '-v' LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link Popping language Cxx compilers: Checking arg cl compilers: Unknown arg cl compilers: Checking arg : compilers: Unknown arg : compilers: Checking arg Command compilers: Unknown arg Command compilers: Checking arg line compilers: Unknown arg line compilers: Checking arg warning compilers: Unknown arg warning compilers: Checking arg D9002 compilers: Unknown arg D9002 compilers: Checking arg : compilers: Unknown arg : compilers: Checking arg ignoring compilers: Unknown arg ignoring compilers: Checking arg unknown compilers: Unknown arg unknown compilers: Checking arg option compilers: Unknown arg option compilers: Checking arg LINK compilers: Unknown arg LINK compilers: Checking arg : compilers: Unknown arg : compilers: Checking arg C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe compilers: Unknown arg C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe compilers: Checking arg not compilers: Unknown arg not compilers: Checking arg found compilers: Unknown arg found compilers: Checking arg or compilers: Unknown arg or compilers: Checking arg not compilers: Unknown arg not compilers: Checking arg built compilers: Unknown arg built compilers: Checking arg by compilers: Unknown arg by compilers: Checking arg the compilers: Unknown arg the compilers: Checking arg last compilers: Unknown arg last compilers: Checking arg incremental compilers: Unknown arg incremental compilers: 
Checking arg link; compilers: Unknown arg link; compilers: Checking arg performing compilers: Unknown arg performing compilers: Checking arg full compilers: Unknown arg full compilers: Checking arg link compilers: Unknown arg link compilers: Libraries needed to link Cxx code with another linker: [] compilers: Check that Cxx libraries can be used from C Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe Executing: 
/tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: Popping language C compilers: Check that Cxx libraries can be used from Fortran Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: Popping language FC ================================================================================ TEST checkFortranTypeSizes from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:528) TESTING: checkFortranTypeSizes from config.compilers(config/BuildSystem/config/compilers.py:528) Check whether real*8 is supported and suggest flags which will allow support Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT 
-Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main real*8 variable end Popping language FC ================================================================================ TEST checkFortranNameMangling from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:587) TESTING: checkFortranNameMangling from config.compilers(config/BuildSystem/config/compilers.py:587) Checks Fortran name mangling, and defines HAVE_FORTRAN_UNDERSCORE, HAVE_FORTRAN_NOUNDERSCORE, HAVE_FORTRAN_CAPS, or HAVE_FORTRAN_STDCALL Testing Fortran mangling type underscore with code void d1chk_(void){return;} Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" void d1chk_(void){return;} Popping language C Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main call d1chk() end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol D1CHK referenced in function MAIN__ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol D1CHK referenced in function MAIN__ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 24576 Pushing language FC Popping language FC in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 
/tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o
Source:
program main
      call d1chk()
      end
Popping language FC
Testing Fortran mangling type unchanged with code void d1chk(void){return;}
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
void d1chk(void){return;}
Popping language C
Pushing language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F
sh:
Successful compile:
Source:
program main
      call d1chk()
      end
Pushing language FC
Popping language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o
sh: conftest.obj : error LNK2019: unresolved external symbol D1CHK referenced in function MAIN__
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol D1CHK referenced in function MAIN__
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 24576
Pushing language FC
Popping language FC
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o
Source:
program main
      call d1chk()
      end
Popping language FC
Testing Fortran mangling type caps with code void D1CHK(void){return;}
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
void D1CHK(void){return;}
Popping language C
Pushing language FC
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe
ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main call d1chk() end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o sh: Popping language FC compilers: Fortran name mangling is caps Defined "HAVE_FORTRAN_CAPS" to "1" ================================================================================ TEST checkFortranNameManglingDouble from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:624) TESTING: checkFortranNameManglingDouble from config.compilers(config/BuildSystem/config/compilers.py:624) Checks if symbols containing an underscore append an extra underscore, and defines HAVE_FORTRAN_UNDERSCORE_UNDERSCORE if necessary Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" void d1_chk__(void){return;} Popping language C Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main call d1_chk() end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol D1_CHK referenced in function MAIN__ 
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol D1_CHK referenced in function MAIN__ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 24576 Pushing language FC Popping language FC in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/confc.o Source: program main call d1_chk() end Popping language FC ================================================================================ TEST checkFortranPreprocessor from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:634) TESTING: checkFortranPreprocessor from config.compilers(config/BuildSystem/config/compilers.py:634) Determine if Fortran handles preprocessing properly Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) Possible ERROR while running compiler: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. 
fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) ret = 256 Source: program main #define dummy dummy #ifndef dummy fooey #endif end Rejecting compiler flag due to nonzero status from link Rejecting compiler flag due to C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -cpp /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -cpp /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) Possible ERROR while running compiler: ifort: command line warning #10006: ignoring unknown option '/cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. 
fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) ret = 256 Source: program main #define dummy dummy #ifndef dummy fooey #endif end Rejecting compiler flag -cpp due to nonzero status from link Rejecting compiler flag -cpp due to ifort: command line warning #10006: ignoring unknown option '/cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -xpp=cpp /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -xpp=cpp /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/xpp=cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) Possible ERROR while running compiler: ifort: command line warning #10006: ignoring unknown option '/xpp=cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . 
= => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) ret = 256 Source: program main #define dummy dummy #ifndef dummy fooey #endif end Rejecting compiler flag -xpp=cpp due to nonzero status from link Rejecting compiler flag -xpp=cpp due to ifort: command line warning #10006: ignoring unknown option '/xpp=cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -F /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -F /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: ifort: command line error: option '/F' is ambiguous Possible ERROR while running compiler: ifort: command line error: option '/F' is ambiguous ret = 256 Source: program main #define dummy dummy #ifndef dummy fooey #endif end Rejecting compiler flag -F due to nonzero status from link Rejecting compiler flag -F due to ifort: command line error: option '/F' is ambiguous PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -Cpp /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -Cpp /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/Cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . 
= => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) Possible ERROR while running compiler: ifort: command line warning #10006: ignoring unknown option '/Cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) ret = 256 Source: program main #define dummy dummy #ifndef dummy fooey #endif end Rejecting compiler flag -Cpp due to nonzero status from link Rejecting compiler flag -Cpp due to ifort: command line warning #10006: ignoring unknown option '/Cpp' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(2): warning #5117: Bad # preprocessor line #define dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(4): warning #5117: Bad # preprocessor line #ifndef dummy -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(3): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => dummy ----------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(6): warning #5117: Bad # preprocessor line #endif -^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #5082: Syntax error, found END-OF-STATEMENT when expecting one of: ( % [ : . = => fooey ------------^ C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F(5): error #6218: This statement is positioned incorrectly and/or has syntax errors. 
fooey ------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.F (code 1) PETSc Error: No output file produced sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main #define dummy dummy #ifndef dummy fooey #endif end Added FC compiler flag -fpp Popping language FC compilers: Fortran uses CPP preprocessor ================================================================================ TEST checkFortranDefineCompilerOption from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:655) TESTING: checkFortranDefineCompilerOption from config.compilers(config/BuildSystem/config/compilers.py:655) Check if -WF,-Dfoobar or -Dfoobar is the compiler option to define a macro Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp -DTesting /tmp/petsc-1nzsmm/config.setCompilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp -DTesting /tmp/petsc-1nzsmm/config.setCompilers/conftest.F sh: Successful compile: Source: program main #define dummy dummy #ifndef Testing fooey #endif end Popping language FC compilers: Fortran uses -D for defining macro ================================================================================ TEST checkFortranLibraries from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:671) TESTING: checkFortranLibraries from config.compilers(config/BuildSystem/config/compilers.py:671) Substitutes for FLIBS the libraries needed to link with Fortran This macro is intended to be used in those situations when it is necessary to mix, e.g. C++ and Fortran 77, source code into a single program or shared library. For example, if object files from a C++ and Fortran 77 compiler must be linked together, then the C++ compiler/linker must be used for linking (since special C++-ish things need to happen at link time like calling global constructors, instantiating templates, enabling exception support, etc.). However, the Fortran 77 intrinsic and run-time libraries must be linked in as well, but the C++ compiler/linker does not know how to add these Fortran 77 libraries. This code was translated from the autoconf macro which was packaged in its current form by Matthew D. Langston . However, nearly all of this macro came from the OCTAVE_FLIBS macro in octave-2.0.13/aclocal.m4, and full credit should go to John W. Eaton for writing this extremely useful macro. 
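(Editor's note on the check above: it boils down to linking a trivial Fortran program with the compiler's verbose flag, capturing the link line the compiler echoes, and keeping only the arguments that look like library paths or libraries -- the long run of "Checking arg ... Unknown arg ..." lines that follows is exactly this sifting. A minimal sketch of the idea in Python follows; the compiler name "ifort" and the Unix-style "-v" flag are illustrative assumptions, and this is not PETSc's actual BuildSystem code.)

# Sketch of the idea only -- not PETSc's BuildSystem implementation.
import shlex
import subprocess

def fortran_link_libs(fc="ifort", verbose_flag="-v"):
    # Link a do-nothing Fortran program verbosely so the compiler echoes
    # the full link line it constructs internally.
    with open("conftest.f90", "w") as f:
        f.write("program main\nend program main\n")
    out = subprocess.run([fc, verbose_flag, "conftest.f90"],
                         capture_output=True, text=True)
    # Sift the echoed arguments, keeping only library paths and libraries;
    # everything else is an "Unknown arg", as in the log above. (Windows
    # backslash paths would need extra care before shlex sees them.)
    return [a for a in shlex.split(out.stdout + out.stderr)
            if a.startswith(("-L", "-l"))]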
Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -V Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -V sh: ifort: command line error: no files specified; for help type "ifort /help" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -v -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -v -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o sh: ifort: command line warning #10006: ignoring unknown option '/v' C:\PROGRA~2\Intel\COMPOS~1\bin\intel64\fortcom -mP1OPT_version=13.1-intel64 -mGLOB_diag_file=C:\cygwin\tmp\conftest.diag -mGLOB_source_language=GLOB_SOURCE_LANGUAGE_F90 -mGLOB_tune_for_fort -mGLOB_use_fort_dope_vector -mP2OPT_static_promotion -mP1OPT_print_version=FALSE -mGLOB_microsoft_version=1600 "-mGLOB_options_string=-v -MT -Z7 -fpp -nologo -FeC:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe" -mGLOB_cxx_limited_range=FALSE -mCG_extend_parms=FALSE -mGLOB_compiler_bin_directory=C:\PROGRA~2\Intel\COMPOS~1\bin\intel64 -mP3OPT_defaultlibs_omit=FALSE -mP3OPT_defaultlibs=P3OPT_DEFAULTLIBS_STATIC_MULTITHREAD -mP3OPT_defaultlibs_select=P3OPT_DEFAULTLIBS_SELECT_F90_IVF -mP3OPT_defaultlibs_f90_port -mP1OPT_check_stack -mP3OPT_emit_line_numbers -mGLOB_debug_format=GLOB_DEBUG_FORMAT_CV10 -mDEBUG_no_pdb=TRUE -mP3OPT_inline_alloca -mGLOB_routine_pointer_size_64 -mGLOB_split_functions=0 -mIPOPT_activate -mGLOB_machine_model=GLOB_MACHINE_MODEL_EFI2 -mGLOB_product_id_code=0x22006d92 -mCG_bnl_movbe=T -mP3OPT_use_mspp_call_convention -mP2OPT_subs_out_of_bound=FALSE -mGLOB_ansi_alias -mPGOPTI_value_profile_use=T -mP2OPT_il0_array_sections=TRUE -mGLOB_offload_mode=0 -mP2OPT_offload_unique_var_string=7048200512 -mP2OPT_hlo -mP2OPT_hpo_rtt_control=0 -mIPOPT_args_in_regs=0 -mP2OPT_disam_assume_nonstd_intent_in=FALSE -mGLOB_imf_mapping_library=C:\PROGRA~2\Intel\COMPOS~1\bin\intel64\libiml_attr.dll -mIPOPT_link -mIPOPT_ipo_activate -mIPOPT_mo_activate -mIPOPT_source_files_list=C:\cygwin\tmp\7048slis4 -mIPOPT_mo_global_data "-mIPOPT_cmdline_link="-out:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe" "-debug" "-pdb:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.pdb" "-subsystem:console" "-nologo" "C:\cygwin\tmp\conftest.obj"" -mIPOPT_il_in_obj -mIPOPT_ipo_activate_warn=FALSE -mIPOPT_obj_output_file_name=C:\cygwin\tmp\ipo_7048.obj -mGLOB_routine_pointer_size_64 -mGLOB_driver_tempfile_name=C:\cygwin\tmp\7048tempfile2 -mGLOB_os_target=GLOB_OS_TARGET_WINNT -mP3OPT_asm_target=P3OPT_ASM_TARGET_MASM5 -mP3OPT_obj_target=P3OPT_OBJ_TARGET_NTCOFF -mGLOB_obj_output_file=C:\cygwin\tmp\ipo_7048.obj -mGLOB_source_dialect=GLOB_SOURCE_DIALECT_NONE -mP1OPT_source_file_name=ipo_out.f -mP2OPT_symtab_type_copy=true C:\cygwin\tmp\conftest.obj 
-mIPOPT_object_files=T -mIPOPT_assembly_files=C:\cygwin\tmp\7048alis3 link -out:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe -debug -pdb:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.pdb -subsystem:console -nologo C:\cygwin\tmp\conftest.obj Popping language FC compilers: Checking arg ifort: compilers: Unknown arg ifort: compilers: Checking arg command compilers: Unknown arg command compilers: Checking arg line compilers: Unknown arg line compilers: Checking arg warning compilers: Unknown arg warning compilers: Checking arg #10006: compilers: Unknown arg #10006: compilers: Checking arg ignoring compilers: Unknown arg ignoring compilers: Checking arg unknown compilers: Unknown arg unknown compilers: Checking arg option compilers: Unknown arg option compilers: Checking arg C:\PROGRA~2\Intel\COMPOS~1\bin\intel64\fortcom compilers: Unknown arg C:\PROGRA~2\Intel\COMPOS~1\bin\intel64\fortcom compilers: Checking arg -mP1OPT_version=13.1-intel64 compilers: Unknown arg -mP1OPT_version=13.1-intel64 compilers: Checking arg -mGLOB_diag_file=C:\cygwin\tmp\conftest.diag compilers: Unknown arg -mGLOB_diag_file=C:\cygwin\tmp\conftest.diag compilers: Checking arg -mGLOB_source_language=GLOB_SOURCE_LANGUAGE_F90 compilers: Unknown arg -mGLOB_source_language=GLOB_SOURCE_LANGUAGE_F90 compilers: Checking arg -mGLOB_tune_for_fort compilers: Unknown arg -mGLOB_tune_for_fort compilers: Checking arg -mGLOB_use_fort_dope_vector compilers: Unknown arg -mGLOB_use_fort_dope_vector compilers: Checking arg -mP2OPT_static_promotion compilers: Unknown arg -mP2OPT_static_promotion compilers: Checking arg -mP1OPT_print_version=FALSE compilers: Unknown arg -mP1OPT_print_version=FALSE compilers: Checking arg -mGLOB_microsoft_version=1600 compilers: Unknown arg -mGLOB_microsoft_version=1600 compilers: Checking arg "-mGLOB_options_string=-v compilers: Unknown arg "-mGLOB_options_string=-v compilers: Checking arg -MT compilers: Unknown arg -MT compilers: Checking arg -Z7 compilers: Unknown arg -Z7 compilers: Checking arg -fpp compilers: Unknown arg -fpp compilers: Checking arg -nologo compilers: Unknown arg -nologo compilers: Checking arg -FeC:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe" compilers: Unknown arg -FeC:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe compilers: Checking arg -mGLOB_cxx_limited_range=FALSE compilers: Unknown arg -mGLOB_cxx_limited_range=FALSE compilers: Checking arg -mCG_extend_parms=FALSE compilers: Unknown arg -mCG_extend_parms=FALSE compilers: Checking arg -mGLOB_compiler_bin_directory=C:\PROGRA~2\Intel\COMPOS~1\bin\intel64 compilers: Handling HPUX list of directories: \PROGRA~2\Intel\COMPOS~1\bin\intel64 compilers: Checking arg -mP3OPT_defaultlibs_omit=FALSE compilers: Unknown arg -mP3OPT_defaultlibs_omit=FALSE compilers: Checking arg -mP3OPT_defaultlibs=P3OPT_DEFAULTLIBS_STATIC_MULTITHREAD compilers: Unknown arg -mP3OPT_defaultlibs=P3OPT_DEFAULTLIBS_STATIC_MULTITHREAD compilers: Checking arg -mP3OPT_defaultlibs_select=P3OPT_DEFAULTLIBS_SELECT_F90_IVF compilers: Unknown arg -mP3OPT_defaultlibs_select=P3OPT_DEFAULTLIBS_SELECT_F90_IVF compilers: Checking arg -mP3OPT_defaultlibs_f90_port compilers: Unknown arg -mP3OPT_defaultlibs_f90_port compilers: Checking arg -mP1OPT_check_stack compilers: Unknown arg -mP1OPT_check_stack compilers: Checking arg -mP3OPT_emit_line_numbers compilers: Unknown arg -mP3OPT_emit_line_numbers compilers: Checking arg -mGLOB_debug_format=GLOB_DEBUG_FORMAT_CV10 compilers: Unknown arg -mGLOB_debug_format=GLOB_DEBUG_FORMAT_CV10 compilers: Checking arg -mDEBUG_no_pdb=TRUE 
compilers: Unknown arg -mDEBUG_no_pdb=TRUE compilers: Checking arg -mP3OPT_inline_alloca compilers: Unknown arg -mP3OPT_inline_alloca compilers: Checking arg -mGLOB_routine_pointer_size_64 compilers: Unknown arg -mGLOB_routine_pointer_size_64 compilers: Checking arg -mGLOB_split_functions=0 compilers: Unknown arg -mGLOB_split_functions=0 compilers: Checking arg -mIPOPT_activate compilers: Unknown arg -mIPOPT_activate compilers: Checking arg -mGLOB_machine_model=GLOB_MACHINE_MODEL_EFI2 compilers: Unknown arg -mGLOB_machine_model=GLOB_MACHINE_MODEL_EFI2 compilers: Checking arg -mGLOB_product_id_code=0x22006d92 compilers: Unknown arg -mGLOB_product_id_code=0x22006d92 compilers: Checking arg -mCG_bnl_movbe=T compilers: Unknown arg -mCG_bnl_movbe=T compilers: Checking arg -mP3OPT_use_mspp_call_convention compilers: Unknown arg -mP3OPT_use_mspp_call_convention compilers: Checking arg -mP2OPT_subs_out_of_bound=FALSE compilers: Unknown arg -mP2OPT_subs_out_of_bound=FALSE compilers: Checking arg -mGLOB_ansi_alias compilers: Unknown arg -mGLOB_ansi_alias compilers: Checking arg -mPGOPTI_value_profile_use=T compilers: Unknown arg -mPGOPTI_value_profile_use=T compilers: Checking arg -mP2OPT_il0_array_sections=TRUE compilers: Unknown arg -mP2OPT_il0_array_sections=TRUE compilers: Checking arg -mGLOB_offload_mode=0 compilers: Unknown arg -mGLOB_offload_mode=0 compilers: Checking arg -mP2OPT_offload_unique_var_string=7048200512 compilers: Unknown arg -mP2OPT_offload_unique_var_string=7048200512 compilers: Checking arg -mP2OPT_hlo compilers: Unknown arg -mP2OPT_hlo compilers: Checking arg -mP2OPT_hpo_rtt_control=0 compilers: Unknown arg -mP2OPT_hpo_rtt_control=0 compilers: Checking arg -mIPOPT_args_in_regs=0 compilers: Unknown arg -mIPOPT_args_in_regs=0 compilers: Checking arg -mP2OPT_disam_assume_nonstd_intent_in=FALSE compilers: Unknown arg -mP2OPT_disam_assume_nonstd_intent_in=FALSE compilers: Checking arg -mGLOB_imf_mapping_library=C:\PROGRA~2\Intel\COMPOS~1\bin\intel64\libiml_attr.dll compilers: Unknown arg -mGLOB_imf_mapping_library=C:\PROGRA~2\Intel\COMPOS~1\bin\intel64\libiml_attr.dll compilers: Checking arg -mIPOPT_link compilers: Unknown arg -mIPOPT_link compilers: Checking arg -mIPOPT_ipo_activate compilers: Unknown arg -mIPOPT_ipo_activate compilers: Checking arg -mIPOPT_mo_activate compilers: Unknown arg -mIPOPT_mo_activate compilers: Checking arg -mIPOPT_source_files_list=C:\cygwin\tmp\7048slis4 compilers: Unknown arg -mIPOPT_source_files_list=C:\cygwin\tmp\7048slis4 compilers: Checking arg -mIPOPT_mo_global_data compilers: Unknown arg -mIPOPT_mo_global_data compilers: Checking arg "-mIPOPT_cmdline_link="-out:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe" compilers: Unknown arg -mIPOPT_cmdline_link="-out:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe compilers: Checking arg "-debug" compilers: Unknown arg -debug compilers: Checking arg "-pdb:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.pdb" compilers: Unknown arg -pdb:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.pdb compilers: Checking arg "-subsystem:console" compilers: Unknown arg -subsystem:console compilers: Checking arg "-nologo" compilers: Unknown arg -nologo compilers: Checking arg "C:\cygwin\tmp\conftest.obj"" compilers: Unknown arg C:\cygwin\tmp\conftest.obj compilers: Checking arg -mIPOPT_il_in_obj compilers: Unknown arg -mIPOPT_il_in_obj compilers: Checking arg -mIPOPT_ipo_activate_warn=FALSE compilers: Unknown arg -mIPOPT_ipo_activate_warn=FALSE compilers: Checking arg -mIPOPT_obj_output_file_name=C:\cygwin\tmp\ipo_7048.obj 
compilers: Unknown arg -mIPOPT_obj_output_file_name=C:\cygwin\tmp\ipo_7048.obj compilers: Checking arg -mGLOB_routine_pointer_size_64 compilers: Unknown arg -mGLOB_routine_pointer_size_64 compilers: Checking arg -mGLOB_driver_tempfile_name=C:\cygwin\tmp\7048tempfile2 compilers: Unknown arg -mGLOB_driver_tempfile_name=C:\cygwin\tmp\7048tempfile2 compilers: Checking arg -mGLOB_os_target=GLOB_OS_TARGET_WINNT compilers: Unknown arg -mGLOB_os_target=GLOB_OS_TARGET_WINNT compilers: Checking arg -mP3OPT_asm_target=P3OPT_ASM_TARGET_MASM5 compilers: Unknown arg -mP3OPT_asm_target=P3OPT_ASM_TARGET_MASM5 compilers: Checking arg -mP3OPT_obj_target=P3OPT_OBJ_TARGET_NTCOFF compilers: Unknown arg -mP3OPT_obj_target=P3OPT_OBJ_TARGET_NTCOFF compilers: Checking arg -mGLOB_obj_output_file=C:\cygwin\tmp\ipo_7048.obj compilers: Unknown arg -mGLOB_obj_output_file=C:\cygwin\tmp\ipo_7048.obj compilers: Checking arg -mGLOB_source_dialect=GLOB_SOURCE_DIALECT_NONE compilers: Unknown arg -mGLOB_source_dialect=GLOB_SOURCE_DIALECT_NONE compilers: Checking arg -mP1OPT_source_file_name=ipo_out.f compilers: Unknown arg -mP1OPT_source_file_name=ipo_out.f compilers: Checking arg -mP2OPT_symtab_type_copy=true compilers: Unknown arg -mP2OPT_symtab_type_copy=true compilers: Checking arg C:\cygwin\tmp\conftest.obj compilers: Unknown arg C:\cygwin\tmp\conftest.obj compilers: Checking arg -mIPOPT_object_files=T compilers: Unknown arg -mIPOPT_object_files=T compilers: Checking arg -mIPOPT_assembly_files=C:\cygwin\tmp\7048alis3 compilers: Unknown arg -mIPOPT_assembly_files=C:\cygwin\tmp\7048alis3 compilers: Checking arg link compilers: Unknown arg link compilers: Checking arg -out:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe compilers: Unknown arg -out:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe compilers: Checking arg -debug compilers: Unknown arg -debug compilers: Checking arg -pdb:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.pdb compilers: Unknown arg -pdb:C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.pdb compilers: Checking arg -subsystem:console compilers: Unknown arg -subsystem:console compilers: Checking arg -nologo compilers: Unknown arg -nologo compilers: Checking arg C:\cygwin\tmp\conftest.obj compilers: Unknown arg C:\cygwin\tmp\conftest.obj compilers: Libraries needed to link Fortran code with the C linker: ['-L/cygdrive/c/cygwin/packages/petsc-3.4.2/\\PROGRA~2\\Intel\\COMPOS~1\\bin\\intel64'] compilers: Libraries needed to link Fortran main with the C linker: [] compilers: Check that Fortran libraries can be used from C Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o 
/tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: Popping language C compilers: Check that Fortran libraries can be used from C++ Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 
0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.setCompilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.setCompilers/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.SET\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe Executing: /tmp/petsc-1nzsmm/config.setCompilers/conftest.exe sh: Popping language Cxx compilers: Fortran libraries can be used from C++ ================================================================================ TEST checkFortranLinkingCxx from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:1016) TESTING: checkFortranLinkingCxx 
from config.compilers(config/BuildSystem/config/compilers.py:1016) Check that Fortran can be linked against C++ Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.compilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.compilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" extern "C" void D1CHK(void); void foo(void){D1CHK();} Popping language Cxx Pushing language Cxx sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.compilers/conftest.cc Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.compilers/conftest.cc sh: conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" extern "C" void D1CHK(void); void D1CHK(void){return;} Popping language Cxx Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main call d1chk() end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/cxxobj.o /tmp/petsc-1nzsmm/config.compilers/confc.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/cxxobj.o /tmp/petsc-1nzsmm/config.compilers/confc.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link Popping language FC compilers: Fortran can link C++ functions ================================================================================ TEST checkFortran90 from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:1051) TESTING: checkFortran90 from config.compilers(config/BuildSystem/config/compilers.py:1051) Determine whether the Fortran compiler handles F90 Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp 
/tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main INTEGER, PARAMETER :: int = SELECTED_INT_KIND(8) INTEGER (KIND=int) :: ierr ierr = 1 end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link Defined "USING_F90" to "1" Fortran compiler supports F90 Popping language FC ================================================================================ TEST checkFortran2003 from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:1064) TESTING: checkFortran2003 from config.compilers(config/BuildSystem/config/compilers.py:1064) Determine whether the Fortran compiler handles F2003 Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main use,intrinsic :: iso_c_binding Type(C_Ptr),Dimension(:),Pointer :: CArray character(kind=c_char),pointer :: nullc => null() character(kind=c_char,len=5),dimension(:),pointer::list1 allocate(list1(5)) CArray = (/(c_loc(list1(i)),i=1,5),c_loc(nullc)/) end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link Defined "USING_F2003" to "1" Fortran compiler supports F2003 Popping language FC ================================================================================ TEST checkFortran90Array from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:1084) TESTING: checkFortran90Array from config.compilers(config/BuildSystem/config/compilers.py:1084) Check for F90 array interfaces sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64 Cygwin detected: ignoring HAVE_F90_2PTR_ARG test ================================================================================ TEST checkFortranModuleInclude from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:1171) 
TESTING: checkFortranModuleInclude from config.compilers(config/BuildSystem/config/compilers.py:1171) Figures out what flag is used to specify the include path for Fortran modules Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: Successful compile: Source: program main use configtest write(*,*) testint end Pushing language FC Popping language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -I/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/configtest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.compilers/conftest.exe -I/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.o /tmp/petsc-1nzsmm/config.compilers/configtest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.exe not found or not built by the last incremental link; performing full link compilers: Fortran module include flag -I found Popping language FC ================================================================================ TEST checkFortranModuleOutput from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:1237) TESTING: checkFortranModuleOutput from config.compilers(config/BuildSystem/config/compilers.py:1237) Figures out what flag is used to specify the include path for Fortran modules Pushing language FC sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -module /tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -module /tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: ifort: command line warning #10155: ignoring option '/module'; argument required ifort: command line warning #10161: unrecognized source type 
'C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\confdir'; object file assumed ifort: warning #10145: no action performed for file 'C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\confdir' ifort: command line warning #10155: ignoring option '/module'; argument required Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -module failed sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -module:/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -module:/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.F(1): error #7001: Error in creating the compiled module file. [CONFIGTEST] module configtest -------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.F (code 1) Possible ERROR while running compiler: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.F(1): error #7001: Error in creating the compiled module file. [CONFIGTEST] module configtest -------------^ compilation aborted for C:\cygwin\tmp\PE3DC2~1\CONFIG~1.COM\conftest.F (code 1) ret = 256 Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -module: compile failed sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -fmod=/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -fmod=/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/fmod=/tmp/petsc-1nzsmm/config.compilers/confdir' Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -fmod= failed sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -J/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -J/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/J/tmp/petsc-1nzsmm/config.compilers/confdir' Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -J failed sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort 
-c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -M/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -M/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/M/tmp/petsc-1nzsmm/config.compilers/confdir' Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -M failed sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -p/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -p/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/p/tmp/petsc-1nzsmm/config.compilers/confdir' Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -p failed sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -qmoddir=/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -qmoddir=/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/qmoddir=/tmp/petsc-1nzsmm/config.compilers/confdir' Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -qmoddir= failed sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -moddir=/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.compilers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -moddir=/tmp/petsc-1nzsmm/config.compilers/confdir -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.compilers/conftest.F sh: ifort: command line warning #10006: ignoring unknown option '/moddir=/tmp/petsc-1nzsmm/config.compilers/confdir' Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -moddir= failed Popping language FC 
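(Editor's note: the sequence above is a straightforward probe -- compile a one-line module with each candidate module-output flag and accept the first flag that both compiles cleanly and actually drops the module file in the requested directory. On Windows ifort every Unix-style spelling is ignored or rejected, which is why each candidate "failed". A rough Python sketch of such a probe follows; the compiler name, flag spellings, and lower-case "configtest.mod" file name are assumptions for illustration, not BuildSystem's real logic.)

# Sketch only -- not PETSc's BuildSystem implementation.
import os
import subprocess
import tempfile

CANDIDATES = ["-module ", "-module:", "-fmod=", "-J", "-M", "-p",
              "-qmoddir=", "-moddir="]
SOURCE = ("module configtest\n"
          "  integer, parameter :: testint = 42\n"
          "end module configtest\n")

def find_module_output_flag(fc="ifort"):
    for flag in CANDIDATES:
        with tempfile.TemporaryDirectory() as confdir:
            src = os.path.join(confdir, "conftest.f90")
            with open(src, "w") as f:
                f.write(SOURCE)
            if flag.endswith(" "):   # separate argument form: -module <dir>
                args = [flag.strip(), confdir]
            else:                    # fused form: -module:<dir>, -J<dir>, ...
                args = [flag + confdir]
            r = subprocess.run([fc, "-c", src] + args, capture_output=True)
            # Accept the flag only if the compile succeeded AND the module
            # file really landed in the requested directory.
            if r.returncode == 0 and \
               os.path.exists(os.path.join(confdir, "configtest.mod")):
                return flag
    return None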
================================================================================ TEST setupFrameworkCompilers from config.compilers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/compilers.py:1364) TESTING: setupFrameworkCompilers from config.compilers(config/BuildSystem/config/compilers.py:1364) ================================================================================ TEST configureFortranCPP from PETSc.utilities.fortranCPP(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/fortranCPP.py:27) TESTING: configureFortranCPP from PETSc.utilities.fortranCPP(config/PETSc/utilities/fortranCPP.py:27) Handle case where Fortran cannot preprocess properly Defined make rule ".f.o .f90.o .f95.o" with dependencies "" and code ['${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FFLAGS} ${FC_FLAGS} -o $@ $<'] Defined make rule ".f.a" with dependencies "" and code ['${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FFLAGS} ${FC_FLAGS} $<', '-${AR} ${AR_FLAGS} ${LIBNAME} $*.o', '-${RM} $*.o'] Defined make rule ".F.o .F90.o .F95.o" with dependencies "" and code ['${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FFLAGS} ${FC_FLAGS} ${FCPPFLAGS} -o $@ $<'] Defined make rule ".F.a" with dependencies "" and code ['${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FFLAGS} ${FC_FLAGS} ${FCPPFLAGS} $<', '-${AR} ${AR_FLAGS} ${LIBNAME} $*.o', '-${RM} $*.o'] ================================================================================ TEST checkStdC from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:105) TESTING: checkStdC from config.headers(config/BuildSystem/config/headers.py:105) sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/config.setCompilers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdlib.h> #include <stdarg.h> #include <string.h> #include <float.h> int main() { ; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 65 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\string.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* 
WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 65 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\stdlib.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum 
SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual 
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdlib.h>
#include <ctype.h>
#define ISLOWER(c) ('a' <= (c) && (c) <= 'z')
#define TOUPPER(c) (ISLOWER(c) ? 'A' + ((c) - 'a') : (c))
#define XOR(e, f) (((e) && !(f)) || (!(e) && (f)))
int main() {
int i;
for(i = 0; i < 256; i++) if (XOR(islower(i), ISLOWER(i)) || toupper(i) != TOUPPER(i)) exit(2);
exit(0);
; return 0; }
Pushing language C
Popping language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.headers/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.o
Executing: /tmp/petsc-1nzsmm/config.headers/conftest.exe
Defined "STDC_HEADERS" to "1"
================================================================================
TEST checkStat from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:138)
TESTING: checkStat from config.headers(config/BuildSystem/config/headers.py:138)
  Checks whether stat file-mode macros are broken, and defines STAT_MACROS_BROKEN if they are
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
(preprocessor output omitted: #line directives from confdefs.h and conffix.h plus the expansion of MSVC's <sys/types.h>, <sys/stat.h>, crtdefs.h, sal.h, and sourceannotations.h)
================================================================================
TEST checkSysWait from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:173)
TESTING: checkSysWait from config.headers(config/BuildSystem/config/headers.py:173)
  Check for POSIX.1 compatible sys/wait.h, and defines HAVE_SYS_WAIT_H if found
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c
Possible ERROR while running compiler:
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(5) : fatal error C1083: Cannot open include file: 'sys/wait.h': No such file or directory
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#include <sys/wait.h>
#ifndef WEXITSTATUS
#define WEXITSTATUS(stat_val) ((unsigned)(stat_val) >> 8)
#endif
#ifndef WIFEXITED
#define WIFEXITED(stat_val) (((stat_val) & 255) == 0)
#endif
int main() {
int s;
wait (&s);
s = WIFEXITED (s) ? WEXITSTATUS (s) : 1;
; return 0; }
================================================================================
TEST checkTime from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:195)
TESTING: checkTime from config.headers(config/BuildSystem/config/headers.py:195)
  Checks if you can safely include both <sys/time.h> and <time.h>, and if so defines TIME_WITH_SYS_TIME
Checking for header: time.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
(preprocessor output omitted: #line directives from confdefs.h and conffix.h plus the expansion of MSVC's <time.h>, crtdefs.h, sal.h, and sourceannotations.h)
Defined "HAVE_TIME_H" to "1"
Checking for header: sys/time.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
Possible ERROR while running preprocessor:
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/time.h': No such file or directory
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/time.h>
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c
Possible ERROR while running compiler:
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(4) : fatal error C1083: Cannot open include file: 'sys/time.h': No such file or directory
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#include <sys/time.h>
#include <time.h>
int main() {
struct tm *tp = 0;
if (tp);
; return 0; }
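For reference, the consumer-side idiom these two probes feed is the classic autoconf TIME_WITH_SYS_TIME dance. A minimal sketch, assuming the conventional HAVE_SYS_TIME_H macro name; on this MSVC box only the plain <time.h> branch would be taken, since sys/time.h is absent:

#ifdef TIME_WITH_SYS_TIME
# include <sys/time.h>    /* both headers can coexist */
# include <time.h>
#else
# ifdef HAVE_SYS_TIME_H
#  include <sys/time.h>
# else
#  include <time.h>       /* the branch MSVC ends up on */
# endif
#endif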
"C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\math.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe 
NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Defined "HAVE_MATH_H" to "1" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(6) : error C2065: 'M_PI' : undeclared identifier Possible ERROR while running compiler: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(6) : error C2065: 'M_PI' : undeclared identifier ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include int main() { double pi = M_PI; if (pi); ; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o 
-I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #define _USE_MATH_DEFINES 1 #include int main() { double pi = M_PI; if (pi); ; return 0; } Defined "_USE_MATH_DEFINES" to "1" Activated Windows math #defines, like M_PI ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/socket.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 81 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/socket.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from 
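The retry above is the whole trick: MSVC's <math.h> only exposes M_PI and the other M_* constants when _USE_MATH_DEFINES is defined before the header is included. A minimal standalone sketch of the same fix:

#define _USE_MATH_DEFINES 1   /* must come before <math.h> on MSVC */
#include <math.h>
#include <stdio.h>
int main(void)
{
  printf("pi = %.15f\n", M_PI);   /* M_PI is now visible */
  return 0;
}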
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: sys/socket.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
Possible ERROR while running preprocessor:
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/socket.h': No such file or directory
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/socket.h>
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: sys/types.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
(preprocessor output omitted: #line directives from confdefs.h and conffix.h plus the expansion of MSVC's <sys/types.h>)
Defined "HAVE_SYS_TYPES_H" to "1"
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: malloc.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
(preprocessor output omitted: #line directives from confdefs.h and conffix.h plus the expansion of MSVC's <malloc.h>, crtdefs.h, sal.h, and sourceannotations.h)
Defined "HAVE_MALLOC_H" to "1"
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: time.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
(preprocessor output omitted: #line directives from confdefs.h and conffix.h plus the expansion of MSVC's <time.h>, crtdefs.h, sal.h, and sourceannotations.h)
Defined "HAVE_TIME_H" to "1"
config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: Direct.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 89 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\Direct.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files 
(x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* 
MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Defined "HAVE_DIRECT_H" to "1" ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: Ws2tcpip.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" 
#line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 93 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\Ws2tcpip.h" #pragma once #line 25 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\Ws2tcpip.h" #line 1 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\winsock2.h" #pragma once #line 49 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\winsock2.h" #line 57 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\winsock2.h" #line 61 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\winsock2.h" #line 1 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 1 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #pragma warning(push) #line 22 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #pragma warning(disable:4001) #line 24 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #pragma once #line 181 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 194 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 195 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 199 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 207 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 208 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 216 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 217 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 224 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 226 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 228 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 230 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 232 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 235 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 236 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 245 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 249 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 253 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 257 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 261 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 265 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 267 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" 
#pragma warning(pop) #line 274 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 275 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 277 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\sdkddkver.h" #line 22 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #pragma once #line 29 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 79 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 100 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 104 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 108 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 112 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 116 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 122 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 127 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 128 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #pragma warning(disable:4116) #line 135 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 136 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 137 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #pragma warning(disable:4514) #pragma warning(disable:4103) #line 144 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #pragma warning(push) #line 147 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #pragma warning(disable:4001) #pragma warning(disable:4201) #pragma warning(disable:4214) #line 151 "C:\\Program Files (x86)\\Microsoft SDKs\\Windows\\v7.0A\\include\\windows.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\excpt.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 
10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; Defined "HAVE_WS2TCPIP_H" to "1" ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: endian.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 97 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'endian.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: ieeefp.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 97 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'ieeefp.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: strings.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 97 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'strings.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sched.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 97 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sched.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: cxxabi.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 97 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : 
fatal error C1083: Cannot open include file: 'cxxabi.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/systeminfo.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 97 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/systeminfo.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from 
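Every test in this log follows the same recipe: configure writes a three-line conftest.c that includes confdefs.h, conffix.h, and the header being probed, pushes it through the preprocessor (win32fe cl -E), and defines HAVE_<HEADER> only when the preprocessor exits cleanly. The preprocessed output itself is never inspected, only logged; the "ret = 512" on failures is most likely the raw wait status, i.e. cl's fatal-error exit code 2 shifted left eight bits. A minimal Python sketch of that loop follows. It is an illustration under stated assumptions, not BuildSystem's actual API: check_header is a hypothetical name, the stub confdefs.h/conffix.h files are empty here, and a cl-compatible compiler is assumed to be on PATH.

import os
import subprocess
import tempfile

def check_header(header, include_dirs=()):
    """Sketch of the config.headers check: preprocess a tiny conftest.c
    that includes `header`; exit status 0 means the header exists."""
    with tempfile.TemporaryDirectory() as tmp:
        # confdefs.h/conffix.h normally hold the #defines accumulated by
        # earlier tests; empty stubs are enough for this sketch.
        for stub in ("confdefs.h", "conffix.h"):
            open(os.path.join(tmp, stub), "w").close()
        conftest = os.path.join(tmp, "conftest.c")
        with open(conftest, "w") as src:
            src.write('#include "confdefs.h"\n'
                      '#include "conffix.h"\n'
                      '#include <%s>\n' % header)
        cmd = ["cl", "-E"] + ["-I%s" % d for d in include_dirs] + ["-I" + tmp, conftest]
        ret = subprocess.call(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    if ret == 0:
        # e.g. "sys/sysinfo.h" -> HAVE_SYS_SYSINFO_H, matching the log's naming
        macro = header.upper().replace(".", "_").replace("/", "_")
        print('Defined "HAVE_%s" to "1"' % macro)
    return ret == 0

On this MSVC/Windows SDK toolchain that is exactly why the Microsoft headers succeed while the POSIX- and GCC-only headers (endian.h, ieeefp.h, strings.h, sched.h, cxxabi.h, sys/systeminfo.h, sys/sysinfo.h, sys/wait.h) fail with C1083, as the remaining checks below show.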
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: dos.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
[preprocessor output elided: confdefs.h, conffix.h, then dos.h via crtdefs.h, sal.h, sourceannotations.h]
Defined "HAVE_DOS_H" to "1"
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: WindowsX.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
[preprocessor output elided: confdefs.h, conffix.h, WindowsX.h]
Defined "HAVE_WINDOWSX_H" to "1"
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: sys/sysinfo.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
[confdefs.h/conffix.h preprocessor output elided]
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/sysinfo.h': No such file or directory
Possible ERROR while running preprocessor: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/sysinfo.h>
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: sys/wait.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
[confdefs.h/conffix.h preprocessor output elided]
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/wait.h': No such file or directory
Possible ERROR while running preprocessor: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/wait.h>
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: stdlib.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
[preprocessor output elided: confdefs.h, conffix.h, then stdlib.h via crtdefs.h, sal.h, sourceannotations.h]
Defined "HAVE_STDLIB_H" to "1"
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: pthread.h
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers
-I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 109 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'pthread.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: setjmp.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E 
-I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 109 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\setjmp.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft 
visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct 
PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Defined "HAVE_SETJMP_H" to "1" ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/utsname.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 113 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/utsname.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: machine/endian.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 113 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'machine/endian.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: limits.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 113 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\limits.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; 
[source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Defined "HAVE_LIMITS_H" to "1" ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: fcntl.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c 
#line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 117 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\fcntl.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files 
(x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef 
struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Defined "HAVE_FCNTL_H" to "1" ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: fenv.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 121 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'fenv.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: string.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 121 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\string.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; 
[remainder of preprocessor output elided -- the verbatim MSVC dump of crtdefs.h, sal.h, and codeanalysis\sourceannotations.h that cl -E emits for every successful probe; the dump is identical each time and is elided below as well]
Defined "HAVE_STRING_H" to "1"
================================================================================
TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: memory.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
[preprocessor output elided -- identical dump]
Defined "HAVE_MEMORY_H" to "1"
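For reference when reading the probes above and below: every check preprocesses a three-line conftest.c of the following shape. This is a sketch reconstructed from the "Source:" blocks that the failing checks print (not copied from BuildSystem itself); only the final include line changes from probe to probe.

    /* conftest.c as generated for the memory.h probe (reconstructed sketch,
     * inferred from the "Source:" blocks in this log). win32fe runs "cl -E"
     * on it; a clean preprocessor exit means the header exists, and
     * HAVE_MEMORY_H is then recorded in confdefs.h for later tests. */
    #include "confdefs.h"   /* macros accumulated by earlier configure tests */
    #include "conffix.h"    /* compiler-specific fixup header */
    #include <memory.h>     /* the header under test; varies per probe */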
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 129 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/times.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: io.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 129 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\io.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef 
unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 
"c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Defined "HAVE_IO_H" to "1" ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: stdint.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 133 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\stdint.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\yvals.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe 
Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Defined "HAVE_STDINT_H" to "1" ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: pwd.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 137 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'pwd.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: float.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers 
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
[preprocessor output elided -- identical dump]
Defined "HAVE_FLOAT_H" to "1"
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 141 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/param.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: netdb.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 141 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'netdb.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: search.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 141 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\search.h" #pragma once #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 22 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 23 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 42 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 46 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\crtdefs.h" #line 1 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #pragma once #line 145 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 148 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" 
#line 154 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 158 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 1 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #pragma once #line 21 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 23 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 24 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned __int64 size_t; #line 31 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 33 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef unsigned short wchar_t; #line 38 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 50 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" enum SA_YesNoMaybe { SA_No = 0x0fff0001, SA_Maybe = 0x0fff0010, SA_Yes = 0x0fff0100 }; typedef enum SA_YesNoMaybe SA_YesNoMaybe; enum SA_AccessType { SA_NoAccess = 0, SA_Read = 1, SA_Write = 2, SA_ReadWrite = 3 }; typedef enum SA_AccessType SA_AccessType; [source_annotation_attribute( SA_Parameter )] struct PreAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostAttribute { unsigned int Deref; SA_YesNoMaybe Valid; SA_YesNoMaybe Null; SA_YesNoMaybe Tainted; SA_AccessType Access; size_t ValidElementsConst; size_t ValidBytesConst; const wchar_t* ValidElements; const wchar_t* ValidBytes; const wchar_t* ValidElementsLength; const wchar_t* ValidBytesLength; size_t WritableElementsConst; size_t WritableBytesConst; const wchar_t* WritableElements; const wchar_t* WritableBytes; const wchar_t* WritableElementsLength; const wchar_t* WritableBytesLength; size_t ElementSizeConst; const wchar_t* ElementSize; SA_YesNoMaybe NullTerminated; SA_YesNoMaybe MustCheck; const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct FormatStringAttribute { const wchar_t* Style; const wchar_t* UnformattedAlternative; }; [source_annotation_attribute( SA_ReturnValue )] struct InvalidCheckAttribute { long Value; }; [source_annotation_attribute( SA_Method )] struct SuccessAttribute { const wchar_t* Condition; }; [source_annotation_attribute( SA_Parameter )] struct PreBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostBoundAttribute { unsigned int Deref; }; [source_annotation_attribute( SA_Parameter )] struct PreRangeAttribute { unsigned int Deref; const char* MinVal; const char* MaxVal; }; [source_annotation_attribute( SA_Parameter|SA_ReturnValue )] struct PostRangeAttribute { unsigned int Deref; 
const char* MinVal; const char* MaxVal; }; #line 218 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" typedef struct PreAttribute SA_Pre; typedef struct PreAttribute PreAttribute; typedef struct PostAttribute SA_Post; typedef struct PostAttribute PostAttribute; typedef struct FormatStringAttribute SA_FormatString; typedef struct InvalidCheckAttribute SA_InvalidCheck; typedef struct SuccessAttribute SA_Success; typedef struct PreBoundAttribute SA_PreBound; typedef struct PostBoundAttribute SA_PostBound; typedef struct PreRangeAttribute SA_PreRange; typedef struct PostRangeAttribute SA_PostRange; #line 282 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 284 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 305 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 308 "c:\\program files (x86)\\microsoft visual studio 10.0\\vc\\include\\codeanalysis\\sourceannotations.h" #line 161 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" #line 162 "C:\\Program Files (x86)\\Microsoft Visual Studio 10.0\\VC\\INCLUDE\\sal.h" Defined "HAVE_SEARCH_H" to "1" ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: mathimf.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 143 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 145 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'mathimf.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/procfs.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 143 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 145 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/procfs.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/resource.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 143 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 145 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/resource.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: unistd.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 
"C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 143 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 145 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'unistd.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and 
defines HAVE_"header" if found Checking for header: netinet/in.h sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 143 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 145 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: 
Cannot open include file: 'netinet/in.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST checkRecursiveMacros from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:218) TESTING: checkRecursiveMacros from config.headers(config/BuildSystem/config/headers.py:218) Checks that the preprocessor allows recursive macros, and if not defines HAVE_BROKEN_RECURSIVE_MACRO sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.headers/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" void a(int i, int j) {} #define a(b) a(b,__LINE__) int main() { a(0); ; return 0; } ================================================================================ TEST configureCacheDetails from PETSc.utilities.cacheDetails(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/cacheDetails.py:78) TESTING: configureCacheDetails from PETSc.utilities.cacheDetails(config/PETSc/utilities/cacheDetails.py:78) Try to determine the size and associativity of the cache. Pushing language C All intermediate test results are stored in /tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.headers -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails/conftest.c sh: conftest.c C:\cygwin\tmp\PE3DC2~1\PETSCU~1.CAC\conftest.c(3) : fatal error C1083: Cannot open include file: 'unistd.h': No such file or directory Possible ERROR while running compiler: conftest.c C:\cygwin\tmp\PE3DC2~1\PETSCU~1.CAC\conftest.c(3) : fatal error C1083: Cannot open include file: 'unistd.h': No such file or directory ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include long getconf_LEVEL1_DCACHE_SIZE() { long val = sysconf(_SC_LEVEL1_DCACHE_SIZE); return (16 <= val && val <= 2147483647) ? 
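The recursive-macro probe exploits a guarantee of the C preprocessor: a macro name that occurs inside its own expansion is not expanded again, so a function and a function-like macro may share a name. A minimal self-contained sketch of the idiom, with illustrative names that are not from PETSc:

    #include <stdio.h>

    void report(const char *msg, int line) { printf("%s at line %d\n", msg, line); }
    /* Every call report(x) is rewritten to report(x, __LINE__); the "report"
       inside the expansion is not re-expanded, so it names the function. */
    #define report(msg) report(msg, __LINE__)

    int main(void) {
      report("checkpoint");   /* expands to report("checkpoint", __LINE__) */
      return 0;
    }

A preprocessor that rejected this pattern would cause HAVE_BROKEN_RECURSIVE_MACRO to be defined; here the compile succeeds, so it is not.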
================================================================================
TEST configureCacheDetails from PETSc.utilities.cacheDetails(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/cacheDetails.py:78)
TESTING: configureCacheDetails from PETSc.utilities.cacheDetails(config/PETSc/utilities/cacheDetails.py:78)
  Try to determine the size and associativity of the cache.
Pushing language C
All intermediate test results are stored in /tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails
The first probe uses sysconf() and fails to compile:
C:\cygwin\tmp\PE3DC2~1\PETSCU~1.CAC\conftest.c(3) : fatal error C1083: Cannot open include file: 'unistd.h': No such file or directory
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <unistd.h>
long getconf_LEVEL1_DCACHE_SIZE() {
  long val = sysconf(_SC_LEVEL1_DCACHE_SIZE);
  return (16 <= val && val <= 2147483647) ? val : 32768;
}
int main() { ; return 0; }
The fallback probe shells out to getconf instead. It compiles (with warning C4047: 'initializing' : 'FILE *' differs in levels of indirection from 'int', since popen is unprototyped here), as does a variant that writes the result to conftestval:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdio.h>
long getconf_LEVEL1_DCACHE_SIZE() {
  long val = -1;
  FILE *f = popen("getconf LEVEL1_DCACHE_SIZE","r");
  fscanf(f,"%ld",&val);
  pclose(f);
  return (16 <= val && val <= 2147483647) ? val : 32768;
}
int main() { ; return 0; }
But the link fails:
conftest.obj : error LNK2019: unresolved external symbol pclose referenced in function getconf_LEVEL1_DCACHE_SIZE
conftest.obj : error LNK2019: unresolved external symbol popen referenced in function getconf_LEVEL1_DCACHE_SIZE
C:\cygwin\tmp\PE3DC2~1\PETSCU~1.CAC\conftest.exe : fatal error LNK1120: 2 unresolved externals
ret = 512
Could not determine LEVEL1_DCACHE_SIZE, using default 32768
Defined "LEVEL1_DCACHE_SIZE" to "32768"
The same getconf probe is repeated for the cache line size and the associativity, failing to link on popen/pclose in exactly the same way both times:
Could not determine LEVEL1_DCACHE_LINESIZE, using default 32
Defined "LEVEL1_DCACHE_LINESIZE" to "32"
Could not determine LEVEL1_DCACHE_ASSOC, using default 2
Defined "LEVEL1_DCACHE_ASSOC" to "2"
Popping language C
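The popen/pclose link failures are expected with the Microsoft C runtime, which exports these functions under the underscored names _popen and _pclose (declared in stdio.h). A hedged sketch of the usual portability shim follows; it is an illustration, not part of PETSc's configure, and note that even a successful link would not help here, since plain Windows has no getconf command to run:

    #include <stdio.h>
    #ifdef _MSC_VER
    #define popen  _popen    /* MSVC spells the POSIX pair with a leading underscore */
    #define pclose _pclose
    #endif

    int main(void) {
      long val = -1;
      FILE *f = popen("getconf LEVEL1_DCACHE_SIZE", "r");
      if (f) {                                    /* on Windows the command itself fails */
        if (fscanf(f, "%ld", &val) != 1) val = -1;
        pclose(f);
      }
      printf("LEVEL1_DCACHE_SIZE = %ld\n", val);
      return 0;
    }

So falling back to the defaults (32768 / 32 / 2) is the right outcome on this platform.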
================================================================================
TEST checkMemcmp from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:78)
TESTING: checkMemcmp from config.functions(config/BuildSystem/config/functions.py:78)
  Check for 8-bit clean memcmp
All intermediate test results are stored in /tmp/petsc-1nzsmm/config.functions
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <string.h>
void exit(int);
int main() {
  char c0 = 0x40;
  char c1 = (char) 0x80;
  char c2 = (char) 0x81;
  exit(memcmp(&c0, &c2, 1) < 0 && memcmp(&c1, &c2, 1) < 0 ? 0 : 1);
  ; return 0; }
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /tmp/petsc-1nzsmm/config.functions/conftest.exe
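The probe exits 0 only if memcmp is "8-bit clean", i.e. compares bytes as unsigned char as the C standard requires; a broken implementation comparing as signed char would treat 0x80 and 0x81 as negative and order them wrongly. A minimal self-contained sketch of what is being tested (illustrative, not PETSc code):

    #include <stdio.h>
    #include <string.h>

    int main(void) {
      char c0 = 0x40;         /* 64                                        */
      char c1 = (char)0x80;   /* -128 as signed char, 128 as unsigned char */
      char c2 = (char)0x81;   /* -127 as signed char, 129 as unsigned char */
      /* The standard requires memcmp to compare as unsigned char, so both
         results must be negative: 0x40 < 0x81 and 0x80 < 0x81. */
      printf("memcmp(&c0,&c2,1) %s 0\n", memcmp(&c0, &c2, 1) < 0 ? "<" : ">=");
      printf("memcmp(&c1,&c2,1) %s 0\n", memcmp(&c1, &c2, 1) < 0 ? "<" : ">=");
      return 0;
    }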
================================================================================
TEST checkSysinfo from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:102)
TESTING: checkSysinfo from config.functions(config/BuildSystem/config/functions.py:102)
  Check whether sysinfo takes three arguments, and if it does define HAVE_SYSINFO_3ARG
Checking for function sysinfo
The stub probe below compiles, but the link fails, so sysinfo is not available:
LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol sysinfo referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char sysinfo(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char sysinfo();
int main() {
/* The GNU C library defines this for functions which it implements to always
   fail with ENOSYS. Some functions are actually named something starting with
   __ and the normal name is an alias. */
#if defined (__stub_sysinfo) || defined (__stub___sysinfo)
choke me
#else
sysinfo();
#endif
; return 0; }
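This is the classic autoconf-style existence probe: declare the function with a deliberately bogus prototype (char sysinfo();) so that no system header is required, call it, and let the linker answer the question; only the symbol name matters. A minimal sketch of the pattern, with a placeholder symbol name:

    /* Link probe: does the C library export "some_function"?            */
    /* The dummy "char" prototype avoids needing the real header at all. */
    char some_function();

    int main(void) {
      some_function();   /* the link succeeds iff the symbol exists */
      return 0;
    }

Here the MSVC linker cannot resolve sysinfo (a Linux-specific call), so the test correctly reports it missing.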
================================================================================
TEST checkVPrintf from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:125)
TESTING: checkVPrintf from config.functions(config/BuildSystem/config/functions.py:125)
  Checks whether vprintf requires a char * last argument, and if it does defines HAVE_VPRINTF_CHAR
Checking for function vprintf
The same stub link probe, this time with char vprintf();, compiles and links:
Defined "HAVE_VPRINTF" to "1"
A second probe then calls vprintf with a va_list; it compiles (warning C4700: uninitialized local variable 'Argp' used) and links, so no char * cast is needed:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdio.h>
#include <stdarg.h>
int main() {
  va_list Argp;
  vprintf( "%d", Argp );
  ; return 0; }
================================================================================
TEST checkVFPrintf from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:132)
TESTING: checkVFPrintf from config.functions(config/BuildSystem/config/functions.py:132)
  Checks whether vfprintf requires a char * last argument, and if it does defines HAVE_VFPRINTF_CHAR
Checking for function vfprintf
The stub probe links:
Defined "HAVE_VFPRINTF" to "1"
The va_list probe, vfprintf(stdout, "%d", Argp );, likewise compiles (same C4700 warning) and links.
================================================================================
TEST checkVSNPrintf from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:139)
TESTING: checkVSNPrintf from config.functions(config/BuildSystem/config/functions.py:139)
  Checks whether vsnprintf requires a char * last argument, and if it does defines HAVE_VSNPRINTF_CHAR
Checking for function _vsnprintf
The stub probe with char _vsnprintf(); compiles and links:
Defined "HAVE__VSNPRINTF" to "1"
Pushing language Cxx
A C++ probe then calls _vsnprintf with a va_list; it compiles (warning C4700) and links:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdio.h>
#include <stdarg.h>
int main() {
  va_list Argp; char str[6];
  _vsnprintf(str, 5, "%d", Argp );
  ; return 0; }
Popping language Cxx
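Note that configure probes _vsnprintf rather than vsnprintf: Microsoft runtimes of this era predate C99 and export only the underscored spelling, hence HAVE__VSNPRINTF. A hedged sketch of the usual compatibility mapping (illustrative, not PETSc's own macro):

    #include <stdio.h>
    #include <stdarg.h>

    #if defined(_MSC_VER) && _MSC_VER < 1900   /* pre-VS2015 C runtimes */
    /* Caveat: unlike C99 vsnprintf, _vsnprintf does not NUL-terminate on
       truncation and returns -1 instead of the needed length. */
    #define vsnprintf _vsnprintf
    #endif

    static void fmt(char *buf, size_t n, const char *f, ...) {
      va_list ap;
      va_start(ap, f);
      vsnprintf(buf, n, f, ap);   /* resolves to _vsnprintf on old MSVC */
      va_end(ap);
    }

    int main(void) {
      char buf[16];
      fmt(buf, sizeof buf, "%d", 42);
      puts(buf);
      return 0;
    }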
================================================================================
TEST checkNanosleep from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:183)
TESTING: checkNanosleep from config.functions(config/BuildSystem/config/functions.py:183)
  Check for functional nanosleep() - as time.h behaves differently for different compiler flags - like -std=c89
The probe fails to compile, so nanosleep is not available:
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(6) : error C2079: 'tp' uses undefined struct 'timespec'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(7) : error C2224: left of '.tv_sec' must have struct/union type
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(8) : error C2224: left of '.tv_nsec' must have struct/union type
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <time.h>
int main() {
  struct timespec tp;
  tp.tv_sec = 0;
  tp.tv_nsec = (long)(1e9);
  nanosleep(&tp,0);
  ; return 0; }
Compile failed inside link
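The failure is expected: the VS2010 time.h defines neither struct timespec nor nanosleep, both of which are POSIX. The customary Windows substitute is Sleep() from windows.h, with millisecond resolution. A hedged sketch of such a shim, illustrative only:

    #ifdef _WIN32
    #include <windows.h>
    static void sleep_seconds(double s) { Sleep((DWORD)(s * 1000.0)); }
    #else
    #include <time.h>
    static void sleep_seconds(double s) {
      struct timespec tp;
      tp.tv_sec  = (long)s;
      tp.tv_nsec = (long)((s - (long)s) * 1e9);
      nanosleep(&tp, 0);
    }
    #endif

    int main(void) { sleep_seconds(0.25); return 0; }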
================================================================================
TEST checkSignalHandlerType from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:159)
TESTING: checkSignalHandlerType from config.functions(config/BuildSystem/config/functions.py:159)
  Checks the type of C++ signals handlers, and defines SIGNAL_CAST to the correct value
Pushing language Cxx
The probe compiles and links, so no cast is required:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <signal.h>
static void myhandler(int sig) {}
int main() {
  signal(SIGFPE, myhandler);
  ; return 0; }
Defined "SIGNAL_CAST" to " "
Popping language Cxx
================================================================================
TEST checkFreeReturnType from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:169)
TESTING: checkFreeReturnType from config.functions(config/BuildSystem/config/functions.py:169)
  Checks whether free returns void or int, and defines HAVE_FREE_RETURN_INT
The probe fails to compile, i.e. free returns void, so HAVE_FREE_RETURN_INT is not defined:
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(6) : error C2120: 'void' illegal with all types
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdlib.h>
int main() {
  int ierr;
  void *p;
  ierr = free(p);
  return 0;
  ; return 0; }
Compile failed inside link
main() { va_list l1, l2; va_copy(l1, l2); return 0; ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol va_copy referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol va_copy referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" #include int main() { va_list l1, l2; va_copy(l1, l2); return 0; ; return 0; } sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c c:\cygwin\tmp\petsc-1nzsmm\config.functions\conftest.c(7) : warning C4700: uninitialized local variable 'l2' used c:\cygwin\tmp\petsc-1nzsmm\config.functions\conftest.c(7) : warning C4700: uninitialized local variable 'l1' used Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { va_list l1, l2; __va_copy(l1, l2); return 0; ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol __va_copy referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol __va_copy referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C 
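(The two unresolved externals above show that this MSVC runtime exports neither va_copy nor __va_copy, which is exactly why configure probes both spellings; as with the free()-return probe earlier, a failed compile or link is itself the answer configure wants. Below is a minimal sketch of the usual fallback chain such results force. The macro chain is an illustration, not PETSc's actual output; plain assignment is only valid on ABIs where va_list is a scalar, as on 32-bit MSVC of this era.)

#include <stdarg.h>
#include <stdio.h>

/* Fallback for compilers lacking C99 va_copy (illustration only):
 * prefer va_copy, then __va_copy, else plain assignment. */
#ifndef va_copy
# ifdef __va_copy
#  define va_copy(dst, src) __va_copy(dst, src)
# else
#  define va_copy(dst, src) ((dst) = (src))
# endif
#endif

static int sum_twice(int n, ...)
{
  va_list ap, aq;
  int i, s = 0;
  va_start(ap, n);
  va_copy(aq, ap);                 /* duplicate before consuming ap */
  for (i = 0; i < n; i++) s += va_arg(ap, int);
  for (i = 0; i < n; i++) s += va_arg(aq, int);
  va_end(aq);
  va_end(ap);
  return s;
}

int main(void)
{
  printf("%d\n", sum_twice(2, 3, 4));   /* prints 14 */
  return 0;
}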
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" #include int main() { va_list l1, l2; __va_copy(l1, l2); return 0; ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function rand sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char rand(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char rand(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub_rand) || defined (__stub___rand) choke me #else rand(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: Defined "HAVE_RAND" to "1" ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function getdomainname sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char getdomainname(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char getdomainname(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
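(Every "Checking for function ..." test from here on instantiates the same autoconf-style template just used for rand: the function is declared with a deliberately wrong prototype, char funcName();, so that only link-time existence of the symbol is tested, independent of the real signature, while the __stub_ guards reject glibc entry points that exist but always fail with ENOSYS. Reduced to its essentials it looks like the sketch below; the archive ate the angle-bracket system include, and <assert.h> is what this template conventionally pulls in - an assumption, not recovered from the log.)

/* Skeleton of the link-existence probe used for rand, getdomainname,
 * _sleep, snprintf, ... below.  The bogus "char rand();" declaration
 * is intentional: only the linker's symbol lookup is being tested. */
#include <assert.h>

char rand();

int main(void)
{
#if defined(__stub_rand) || defined(__stub___rand)
  choke me   /* forces a compile error when glibc marks rand as an ENOSYS stub */
#else
  rand();
#endif
  return 0;
}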
*/ #if defined (__stub_getdomainname) || defined (__stub___getdomainname) choke me #else getdomainname(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol getdomainname referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol getdomainname referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char getdomainname(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char getdomainname(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_getdomainname) || defined (__stub___getdomainname) choke me #else getdomainname(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function _sleep sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char _sleep(); below. 
*/ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char _sleep(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub__sleep) || defined (__stub____sleep) choke me #else _sleep(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: Defined "HAVE__SLEEP" to "1" ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function snprintf sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char snprintf(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char snprintf(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub_snprintf) || defined (__stub___snprintf) choke me #else snprintf(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol snprintf referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol snprintf referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char snprintf(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char snprintf(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_snprintf) || defined (__stub___snprintf) choke me #else snprintf(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function realpath sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char realpath(); below. 
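(snprintf failing to link is expected with this toolchain: Microsoft's C runtime before MSVC 2015 ships only the underscore-prefixed _snprintf, which additionally does not guarantee NUL termination on truncation. A hedged portability sketch follows; my_snprintf is a hypothetical helper of my own, not something PETSc defines, and it is simplified to a single string argument.)

#include <stdio.h>

/* Illustration only: map onto MSVC's _snprintf and repair its missing
 * NUL termination on truncation; matches the link failure above. */
#if defined(_MSC_VER) && _MSC_VER < 1900
static int my_snprintf(char *buf, size_t n, const char *fmt, const char *s)
{
  int ret = _snprintf(buf, n, fmt, s);
  if (n > 0) buf[n - 1] = '\0';    /* _snprintf may leave buf unterminated */
  return ret;
}
#else
#define my_snprintf snprintf
#endif

int main(void)
{
  char buf[8];
  my_snprintf(buf, sizeof buf, "%s", "truncate me");
  printf("%s\n", buf);             /* prints "truncat" */
  return 0;
}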
*/ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char realpath(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_realpath) || defined (__stub___realpath) choke me #else realpath(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol realpath referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol realpath referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char realpath(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char realpath(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub_realpath) || defined (__stub___realpath) choke me #else realpath(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function dlsym sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char dlsym(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char dlsym(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_dlsym) || defined (__stub___dlsym) choke me #else dlsym(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol dlsym referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol dlsym referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char dlsym(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char dlsym(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. 
Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_dlsym) || defined (__stub___dlsym) choke me #else dlsym(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function bzero sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char bzero(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char bzero(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_bzero) || defined (__stub___bzero) choke me #else bzero(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol bzero referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol bzero referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char bzero(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
*/ char bzero(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_bzero) || defined (__stub___bzero) choke me #else bzero(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function _getcwd sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char _getcwd(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char _getcwd(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub__getcwd) || defined (__stub____getcwd) choke me #else _getcwd(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: Defined "HAVE__GETCWD" to "1" ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function getwd sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char getwd(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char getwd(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
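(_getcwd linking while getwd - and, further down, several other POSIX spellings - does not is the usual MSVC pattern: the CRT exports underscore-prefixed conforming names and hides the POSIX ones, which is why configure probes both. A typical shim, offered only as an illustration of that pattern; portable_getcwd is a hypothetical name.)

#include <stdio.h>

/* Select the CRT's underscored name on MSVC, the POSIX name elsewhere,
 * matching the HAVE__GETCWD result above. */
#ifdef _MSC_VER
#include <direct.h>
#define portable_getcwd _getcwd
#else
#include <unistd.h>
#define portable_getcwd getcwd
#endif

int main(void)
{
  char buf[1024];
  if (portable_getcwd(buf, sizeof buf))
    printf("cwd: %s\n", buf);
  return 0;
}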
*/ #if defined (__stub_getwd) || defined (__stub___getwd) choke me #else getwd(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol getwd referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol getwd referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char getwd(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char getwd(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_getwd) || defined (__stub___getwd) choke me #else getwd(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function uname sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char uname(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. 
*/ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char uname(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_uname) || defined (__stub___uname) choke me #else uname(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol uname referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol uname referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char uname(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char uname(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_uname) || defined (__stub___uname) choke me #else uname(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function _lseek sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char _lseek(); below. 
*/ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char _lseek(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub__lseek) || defined (__stub____lseek) choke me #else _lseek(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: Defined "HAVE__LSEEK" to "1" ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function sleep sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char sleep(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char sleep(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub_sleep) || defined (__stub___sleep) choke me #else sleep(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol sleep referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol sleep referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char sleep(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char sleep(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_sleep) || defined (__stub___sleep) choke me #else sleep(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function _access sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char _access(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. 
*/ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char _access(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub__access) || defined (__stub____access) choke me #else _access(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: Defined "HAVE__ACCESS" to "1" ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function lseek sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char lseek(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char lseek(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub_lseek) || defined (__stub___lseek) choke me #else lseek(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link Defined "HAVE_LSEEK" to "1" ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function usleep sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char usleep(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char usleep(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub_usleep) || defined (__stub___usleep) choke me #else usleep(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol usleep referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol usleep referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char usleep(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char usleep(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_usleep) || defined (__stub___usleep) choke me #else usleep(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function _intel_fast_memset sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char _intel_fast_memset(); below. 
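(Taken together with the failed nanosleep probe at the top of this excerpt, the missing sleep and usleep, and the successful _sleep earlier, the only delay primitives this toolchain offers are the CRT's _sleep and Win32 Sleep. A sketch of the portability helper such results force; msleep is a hypothetical name, not from the log.)

/* Millisecond sleep: Win32 has Sleep(ms); POSIX systems have nanosleep(). */
#ifdef _WIN32
#include <windows.h>
static void msleep(unsigned ms) { Sleep(ms); }
#else
#include <time.h>
static void msleep(unsigned ms)
{
  struct timespec tp;
  tp.tv_sec  = ms / 1000;
  tp.tv_nsec = (long)(ms % 1000) * 1000000L;
  nanosleep(&tp, 0);
}
#endif

int main(void)
{
  msleep(100);   /* pause roughly 100 ms */
  return 0;
}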
*/ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char _intel_fast_memset(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub__intel_fast_memset) || defined (__stub____intel_fast_memset) choke me #else _intel_fast_memset(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol _intel_fast_memset referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol _intel_fast_memset referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char _intel_fast_memset(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char _intel_fast_memset(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub__intel_fast_memset) || defined (__stub____intel_fast_memset) choke me #else _intel_fast_memset(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function dlclose sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char dlclose(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char dlclose(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_dlclose) || defined (__stub___dlclose) choke me #else dlclose(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol dlclose referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol dlclose referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char dlclose(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function gethostname
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char gethostname(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char gethostname();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_gethostname) || defined (__stub___gethostname)
choke me
#else
gethostname();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol gethostname referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol gethostname referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char gethostname(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char gethostname();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_gethostname) || defined (__stub___gethostname)
choke me
#else
gethostname();
#endif
;
return 0;
}
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function clock
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char clock(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char clock();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_clock) || defined (__stub___clock)
choke me
#else
clock();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh:
Defined "HAVE_CLOCK" to "1"
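clock is found because it is part of the MSVC C runtime that cl links by default. The macro is recorded in confdefs.h during the run and ends up in the generated petscconf.h, where PETSc spells it PETSC_HAVE_CLOCK. A minimal sketch of how such a guard is typically consumed downstream; GetCPUSeconds is a hypothetical helper, not a PETSc routine:

  #include <petscconf.h>            /* configure's answers, e.g. PETSC_HAVE_CLOCK */
  #if defined(PETSC_HAVE_CLOCK)
  #include <time.h>
  static double GetCPUSeconds(void) /* hypothetical helper */
  {
    return ((double)clock())/CLOCKS_PER_SEC;
  }
  #else
  static double GetCPUSeconds(void)
  {
    return -1.0;                    /* configure found no clock(); caller must cope */
  }
  #endif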
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function get_nprocs
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char get_nprocs(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char get_nprocs();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_get_nprocs) || defined (__stub___get_nprocs)
choke me
#else
get_nprocs();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol get_nprocs referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol get_nprocs referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char get_nprocs(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char get_nprocs();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_get_nprocs) || defined (__stub___get_nprocs)
choke me
#else
get_nprocs();
#endif
;
return 0;
}
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function access
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char access(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char access();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_access) || defined (__stub___access)
choke me
#else
access();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh:
Defined "HAVE_ACCESS" to "1"
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function _snprintf
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char _snprintf(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char _snprintf();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub__snprintf) || defined (__stub____snprintf)
choke me
#else
_snprintf();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
Defined "HAVE__SNPRINTF" to "1"
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function dlerror
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char dlerror(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char dlerror();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_dlerror) || defined (__stub___dlerror)
choke me
#else
dlerror();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol dlerror referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol dlerror referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char dlerror(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char dlerror();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_dlerror) || defined (__stub___dlerror)
choke me
#else
dlerror();
#endif
;
return 0;
}
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function mkstemp
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char mkstemp(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char mkstemp();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_mkstemp) || defined (__stub___mkstemp)
choke me
#else
mkstemp();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol mkstemp referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol mkstemp referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char mkstemp(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char mkstemp();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_mkstemp) || defined (__stub___mkstemp)
choke me
#else
mkstemp();
#endif
;
return 0;
}
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function fork
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char fork(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char fork();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_fork) || defined (__stub___fork)
choke me
#else
fork();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol fork referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol fork referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char fork(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char fork();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_fork) || defined (__stub___fork)
choke me
#else
fork();
#endif
;
return 0;
}
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function getpagesize
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char getpagesize(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char getpagesize();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_getpagesize) || defined (__stub___getpagesize)
choke me
#else
getpagesize();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol getpagesize referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol getpagesize referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char getpagesize(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char getpagesize();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_getpagesize) || defined (__stub___getpagesize)
choke me
#else
getpagesize();
#endif
;
return 0;
}
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function sbreak
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char sbreak(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char sbreak();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_sbreak) || defined (__stub___sbreak)
choke me
#else
sbreak();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol sbreak referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol sbreak referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char sbreak(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char sbreak();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_sbreak) || defined (__stub___sbreak)
choke me
#else
sbreak();
#endif
;
return 0;
}
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function memalign
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char memalign(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char memalign();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_memalign) || defined (__stub___memalign)
choke me
#else
memalign();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol memalign referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol memalign referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char memalign(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char memalign();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_memalign) || defined (__stub___memalign)
choke me
#else
memalign();
#endif
;
return 0;
}
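memalign is a Unix/glibc interface, so its absence under the Microsoft compiler is expected. If aligned allocation were needed on this platform, the MSVC counterpart is _aligned_malloc/_aligned_free from malloc.h; a minimal sketch, purely illustrative and not something this configure run tests:

  #include <malloc.h>   /* MSVC: _aligned_malloc / _aligned_free */
  #include <stdio.h>

  int main(void)
  {
    /* 64-byte-aligned 1024-byte block; glibc equivalent: memalign(64, 1024) */
    void *buf = _aligned_malloc(1024, 64);
    if (!buf) return 1;
    printf("aligned block at %p\n", buf);
    _aligned_free(buf);
    return 0;
  }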
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function sigset
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char sigset(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char sigset();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_sigset) || defined (__stub___sigset)
choke me
#else
sigset();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol sigset referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol sigset referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char sigset(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char sigset();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_sigset) || defined (__stub___sigset)
choke me
#else
sigset();
#endif
;
return 0;
}
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function _fullpath
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char _fullpath(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char _fullpath();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub__fullpath) || defined (__stub____fullpath)
choke me
#else
_fullpath();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh:
Defined "HAVE__FULLPATH" to "1"
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function getcwd
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char getcwd(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char getcwd();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_getcwd) || defined (__stub___getcwd)
choke me
#else
getcwd();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
Defined "HAVE_GETCWD" to "1"
================================================================================
TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29)
TESTING: check from config.functions(config/BuildSystem/config/functions.py:29)
  Checks for the function "funcName", and if found defines HAVE_"funcName"
Checking for function gethostbyname
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char gethostbyname(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char gethostbyname();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_gethostbyname) || defined (__stub___gethostbyname)
choke me
#else
gethostbyname();
#endif
;
return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol gethostbyname referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol gethostbyname referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully few prototypes,
   which can conflict with char gethostbyname(); below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
/* We use char because int might match the return type of a gcc2 builtin
   and then its argument prototype would still apply. */
char gethostbyname();

int main() {
/* The GNU C library defines this for functions which it implements
   to always fail with ENOSYS. Some functions are actually named
   something starting with __ and the normal name is an alias. */
#if defined (__stub_gethostbyname) || defined (__stub___gethostbyname)
choke me
#else
gethostbyname();
#endif
;
return 0;
}
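gethostname and gethostbyname fail here for a more specific reason than plain absence: on Windows they are Winsock calls exported from Ws2_32.lib, and that import library is never on these probe link lines, so the probes report them missing. A minimal standalone sketch of what actually calling one of them takes on this platform; illustrative only, not part of the configure run:

  #include <winsock2.h>               /* gethostname/gethostbyname live here on Windows */
  #include <stdio.h>
  #pragma comment(lib, "Ws2_32.lib")  /* MSVC pragma: pull in the import library the probe lacked */

  int main(void)
  {
    WSADATA wsa;
    char name[256];
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;  /* Winsock must be initialized first */
    if (gethostname(name, sizeof(name)) == 0)
      printf("host: %s\n", name);
    WSACleanup();
    return 0;
  }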
*/ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char gettimeofday(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_gettimeofday) || defined (__stub___gettimeofday) choke me #else gettimeofday(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol gettimeofday referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol gettimeofday referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char gettimeofday(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char gettimeofday(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. 
*/ #if defined (__stub_gettimeofday) || defined (__stub___gettimeofday) choke me #else gettimeofday(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function readlink sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char readlink(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char readlink(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_readlink) || defined (__stub___readlink) choke me #else readlink(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol readlink referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol readlink referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char readlink(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
*/ char readlink(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_readlink) || defined (__stub___readlink) choke me #else readlink(); #endif ; return 0; } ================================================================================ TEST check from config.functions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/functions.py:29) TESTING: check from config.functions(config/BuildSystem/config/functions.py:29) Checks for the function "funcName", and if found defines HAVE_"funcName" Checking for function PXFGETARG sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char PXFGETARG(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char PXFGETARG(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias. */ #if defined (__stub_PXFGETARG) || defined (__stub___PXFGETARG) choke me #else PXFGETARG(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o sh: conftest.obj : error LNK2019: unresolved external symbol PXFGETARG referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol PXFGETARG referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char PXFGETARG(); below. */ #include /* Override any gcc2 internal prototype to avoid an error. 
================================================================================
Checking for function sigaction
Compile: successful (stub-check source with sigaction)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol sigaction referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
================================================================================
Checking for function strcasecmp
Compile: successful (stub-check source with strcasecmp)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol strcasecmp referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
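The strcasecmp failure is expected with cl: the MSVC CRT spells the case-insensitive comparison stricmp/_stricmp, which is why this same log later probes stricmp and succeeds. An illustrative shim (not PETSc's actual code) that papers over the spelling difference:

    /* Illustrative shim: map the POSIX name onto MSVC's CRT spelling. */
    #ifdef _MSC_VER
    #include <string.h>                      /* declares _stricmp       */
    #define strcasecmp(a, b) _stricmp((a), (b))
    #else
    #include <strings.h>                     /* POSIX home of strcasecmp */
    #endif

    /* Case-insensitive key comparison that builds on both toolchains. */
    int same_key(const char *a, const char *b) {
      return strcasecmp(a, b) == 0;
    }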
================================================================================
Checking for function dlopen
Compile: successful (stub-check source with dlopen)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol dlopen referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
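dlopen is genuinely absent on Windows; dynamic loading goes through the Win32 API instead. An illustrative sketch (not PETSc code) of the correspondence:

    /* Illustrative: Win32 counterparts of the missing dlopen()/dlsym(). */
    #ifdef _WIN32
    #include <windows.h>

    typedef void (*generic_fn)(void);

    /* Hypothetical helper: fetch a symbol from a DLL, NULL on failure. */
    generic_fn load_symbol(const char *dll, const char *name) {
      HMODULE h = LoadLibraryA(dll);               /* ~ dlopen(dll, ...) */
      if (!h) return NULL;
      return (generic_fn)GetProcAddress(h, name);  /* ~ dlsym(h, name)   */
    }
    #endif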
================================================================================
Checking for function drand48
Compile: successful (stub-check source with drand48)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol drand48 referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
================================================================================
Checking for function socket
Compile: successful (stub-check source with socket)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol socket referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
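A caveat on the socket result: socket() does exist on Windows, but it lives in Winsock's Ws2_32.lib, which the bare conftest link line never pulls in, so the probe reports it missing rather than detecting a genuine absence. Illustrative sketch (assumptions: MSVC-style #pragma comment, a hypothetical open_tcp_socket helper):

    /* Illustrative: resolving socket() on Windows requires Ws2_32.lib. */
    #ifdef _WIN32
    #include <winsock2.h>
    #pragma comment(lib, "Ws2_32.lib")  /* MSVC: request the import library */

    SOCKET open_tcp_socket(void) {
      WSADATA wsa;
      if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return INVALID_SOCKET;
      return socket(AF_INET, SOCK_STREAM, 0); /* links once Ws2_32 is in */
    }
    #endif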
================================================================================
Checking for function memmove
Compile: successful (stub-check source with memmove)
Link: successful
Defined "HAVE_MEMMOVE" to "1"
================================================================================
Checking for function signal
Compile: successful (stub-check source with signal)
Link: successful
  LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
Defined "HAVE_SIGNAL" to "1"
================================================================================
Checking for function popen
Compile: successful (stub-check source with popen)
Link: failed
  LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
  conftest.obj : error LNK2019: unresolved external symbol popen referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
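Like strcasecmp, popen is present in the MSVC CRT under an underscore-prefixed name, _popen (with _pclose as its companion). An illustrative shim, not PETSc's code:

    /* Illustrative shim: the MSVC CRT provides popen as _popen. */
    #include <stdio.h>
    #ifdef _MSC_VER
    #define popen  _popen
    #define pclose _pclose
    #endif

    /* Run a command and echo its output; returns the exit status. */
    int run_and_print(const char *cmd) {
      char line[256];
      FILE *p = popen(cmd, "r");
      if (!p) return -1;
      while (fgets(line, sizeof line, p)) fputs(line, stdout);
      return pclose(p);
    }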
================================================================================
Checking for function getrusage
Compile: successful (stub-check source with getrusage)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol getrusage referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
================================================================================
Checking for function times
Compile: successful (stub-check source with times)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol times referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
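With both POSIX resource-usage interfaces (getrusage, times) absent, CPU time on Windows typically comes from GetProcessTimes(). A minimal sketch under that assumption (the helper name is hypothetical; FILETIME counts 100-nanosecond ticks):

    /* Illustrative Windows replacement for getrusage()/times(). */
    #ifdef _WIN32
    #include <windows.h>

    double process_cpu_seconds(void) {
      FILETIME create, exit_, kernel, user;
      ULARGE_INTEGER u;
      if (!GetProcessTimes(GetCurrentProcess(), &create, &exit_, &kernel, &user))
        return -1.0;
      u.LowPart  = user.dwLowDateTime;
      u.HighPart = user.dwHighDateTime;
      return u.QuadPart * 1e-7; /* 100 ns units -> seconds */
    }
    #endif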
================================================================================
Checking for function _intel_fast_memcpy
Compile: successful (stub-check source with _intel_fast_memcpy)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol _intel_fast_memcpy referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
================================================================================
Checking for function time
Compile: successful (stub-check source with time)
Link: successful
Defined "HAVE_TIME" to "1"
================================================================================
Checking for function sysctlbyname
Compile: successful (stub-check source with sysctlbyname)
Link: failed
  LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link
  conftest.obj : error LNK2019: unresolved external symbol sysctlbyname referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
================================================================================
Checking for function getpwuid
Compile: successful (stub-check source with getpwuid)
Link: failed
  conftest.obj : error LNK2019: unresolved external symbol getpwuid referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
================================================================================
Checking for function stricmp
Compile: successful (stub-check source with stricmp)
Link: successful
Defined "HAVE_STRICMP" to "1"
================================================================================
TEST configureMemorySize from PETSc.utilities.getResidentSetSize(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/getResidentSetSize.py:31)
TESTING: configureMemorySize from PETSc.utilities.getResidentSetSize(config/PETSc/utilities/getResidentSetSize.py:31)
  Try to determine how to measure the memory usage
Defined "USE_PROC_FOR_SIZE" to "1"
Using /proc for PetscMemoryGetCurrentUsage()
================================================================================
TEST configureFPTrap from PETSc.utilities.FPTrap(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/FPTrap.py:27)
TESTING: configureFPTrap from PETSc.utilities.FPTrap(config/PETSc/utilities/FPTrap.py:27)
  Checking the handling of floating point traps
Checking for header: sigfpe.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
  [preprocessor output: runs of #line markers for confdefs.h and conffix.h, omitted]
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'sigfpe.h': No such file or directory
Possible ERROR while running preprocessor: ret = 512
Source:
  #include "confdefs.h"
  #include "conffix.h"
  #include <sigfpe.h>
Checking for header: fpxcp.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
  [preprocessor output: #line markers, omitted]
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'fpxcp.h': No such file or directory
Possible ERROR while running preprocessor: ret = 512
Source:
  #include "confdefs.h"
  #include "conffix.h"
  #include <fpxcp.h>
Checking for header: floatingpoint.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
  [preprocessor output: #line markers; the log is cut off here]
#line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 143 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 147 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 151 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 155 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 159 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 163 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 167 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 171 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 175 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 179 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 183 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 187 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 191 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 195 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 199 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 203 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 207 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 211 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 215 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 219 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 223 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 227 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 231 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 235 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 237 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 8 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 9 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 10 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'floatingpoint.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include ================================================================================ TEST check_siginfo_t from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:46) TESTING: check_siginfo_t from config.types(config/BuildSystem/config/types.py:46) Checks if siginfo_t exists in signal.h. This check is for windows, and C89 check. 
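(Aside, not part of the log: each "Checking for header" probe above amounts to preprocessing a three-line file and testing the exit status; the C1083 errors are the "header absent" signal. A minimal sketch of the pattern, with sigfpe.h standing in for whichever header is probed:)

    /* conftest.c: header-existence probe, run with "cl -E" (preprocess
       only), so nothing is compiled or linked.  A nonzero exit from the
       preprocessor means the header could not be found. */
    #include <sigfpe.h>
    int main(void) { return 0; }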
================================================================================
TEST check_siginfo_t from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:46)
TESTING: check_siginfo_t from config.types(config/BuildSystem/config/types.py:46)
  Checks if siginfo_t exists in signal.h. This check is for Windows, and is a C89 check.
Checking for type: siginfo_t
All intermediate test results are stored in /tmp/petsc-1nzsmm/config.types
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.functions -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'siginfo_t' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2146: syntax error : missing ';' before identifier 'a'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'a' : undeclared identifier
Possible ERROR while running compiler: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <signal.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stddef.h>
#include <sys/types.h>
#endif
int main() {
siginfo_t a;;
 return 0; }
siginfo_t found
================================================================================
TEST check__int64 from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:52)
TESTING: check__int64 from config.types(config/BuildSystem/config/types.py:52)
  Checks if __int64 exists. This is primarily for windows.
Checking for type: __int64
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stddef.h>
#endif
int main() {
__int64 a;;
 return 0; }
__int64 found
Defined "HAVE___INT64" to "1"
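(Aside, not part of the log: the type probes above all compile a single declaration of the candidate type. A sketch of the pattern; MSVC's C2065 is the "missing" signal, and a clean compile, as for __int64, is the "found" signal:)

    /* conftest.c: type-existence probe.  On a POSIX system this
       compiles; under this MSVC it fails exactly as logged above. */
    #include <signal.h>
    int main(void) { siginfo_t a; (void)a; return 0; }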
================================================================================
TEST checkSizeTypes from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:58)
TESTING: checkSizeTypes from config.types(config/BuildSystem/config/types.py:58)
  Checks for types associated with sizes, such as size_t.
Checking for type: size_t
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stddef.h>
#endif
int main() {
size_t a;;
 return 0; }
size_t found
================================================================================
TEST checkFileTypes from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:68)
TESTING: checkFileTypes from config.types(config/BuildSystem/config/types.py:68)
  Checks for types associated with files, such as mode_t, off_t, etc.
Checking for type: mode_t
Executing: (same compile command as above)
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'mode_t' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2146: syntax error : missing ';' before identifier 'a'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'a' : undeclared identifier
Possible ERROR while running compiler: ret = 512
Source: (as for size_t, with mode_t a;)
Typedefed "int" to "mode_t"
Checking for type: off_t
Executing: (same compile command as above)
sh: conftest.c
Successful compile:
Source: (as for size_t, with off_t a;)
off_t found
================================================================================
TEST checkIntegerTypes from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:63)
TESTING: checkIntegerTypes from config.types(config/BuildSystem/config/types.py:63)
  Checks for types associated with integers, such as int32_t.
Checking for type: int32_t
Executing: (same compile command as above)
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'int32_t' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2146: syntax error : missing ';' before identifier 'a'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'a' : undeclared identifier
Possible ERROR while running compiler: ret = 512
Source: (as for size_t, with int32_t a;)
Typedefed "int" to "int32_t"
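(Aside, not part of the log: each Typedefed line above records a fallback. When a POSIX type is missing, configure emits a plain-int substitute into the generated conffix.h; the same typedefs reappear verbatim in the checkUID preprocessor dump below:)

    /* conffix.h fragment produced by the failed probes above */
    typedef int mode_t;
    typedef int int32_t;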
================================================================================
TEST checkPID from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:74)
TESTING: checkPID from config.types(config/BuildSystem/config/types.py:74)
  Checks for pid_t, and defines it if necessary
Checking for type: pid_t
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'pid_t' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2146: syntax error : missing ';' before identifier 'a'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'a' : undeclared identifier
Possible ERROR while running compiler: ret = 512
Source: (as for size_t, with pid_t a;)
Typedefed "int" to "pid_t"
================================================================================
TEST checkUID from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:78)
TESTING: checkUID from config.types(config/BuildSystem/config/types.py:78)
  Checks for uid_t and gid_t, and defines them if necessary
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
(preprocessed confdefs.h, all #line directives, omitted)
#line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.types\\conffix.h"
typedef int int32_t;
typedef int mode_t;
typedef int pid_t;
(excerpt of the preprocessed MSVC sys/types.h from C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\INCLUDE, #line directives omitted)
typedef long __time32_t;
typedef __int64 __time64_t;
typedef __time64_t time_t;
typedef unsigned short _ino_t;
typedef unsigned short ino_t;
typedef unsigned int _dev_t;
typedef unsigned int dev_t;
typedef long _off_t;
typedef long off_t;
Defined "uid_t" to "int"
Defined "gid_t" to "int"
================================================================================
TEST checkSignal from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:85)
TESTING: checkSignal from config.types(config/BuildSystem/config/types.py:85)
  Checks the return type of signal() and defines RETSIGTYPE to that type name
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#include <signal.h>
#ifdef signal
#undef signal
#endif
#ifdef __cplusplus
extern "C" void (*signal (int, void(*)(int)))(int);
#else
void (*signal())();
#endif
int main() {
; return 0; }
Defined "RETSIGTYPE" to "void"
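(Aside, not part of the log: a hedged sketch of how a RETSIGTYPE result is typically consumed; the macro value here is the "void" just probed, and the handler name is made up:)

    #include <signal.h>
    #define RETSIGTYPE void            /* value determined by the probe above */
    static RETSIGTYPE on_int(int sig) { (void)sig; }
    int main(void) { signal(SIGINT, on_int); return 0; }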
================================================================================
TEST checkC99Complex from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:106)
TESTING: checkC99Complex from config.types(config/BuildSystem/config/types.py:106)
  Check for complex numbers in <complex.h> in C99 std
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(3) : fatal error C1083: Cannot open include file: 'complex.h': No such file or directory
Possible ERROR while running compiler: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <complex.h>
int main() {
double complex x;
 x = I;
; return 0; }
Compile failed inside link
================================================================================
TEST checkCxxComplex from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:115)
TESTING: checkCxxComplex from config.types(config/BuildSystem/config/types.py:115)
  Check for complex numbers in namespace std
Pushing language Cxx
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.functions -MT -GR -EHsc -Z7 -Zm200 -TP /tmp/petsc-1nzsmm/config.types/conftest.cc
sh: conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <complex>
int main() {
std::complex<double> x;
; return 0; }
Pushing language CXX
Popping language CXX
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.types/conftest.o
Defined "HAVE_CXX_COMPLEX" to "1"
Popping language Cxx
================================================================================
TEST checkFortranKind from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:136)
TESTING: checkFortranKind from config.types(config/BuildSystem/config/types.py:136)
  Checks whether selected_int_kind etc work, and if so defines USE_FORTRANKIND
Pushing language FC
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.setCompilers -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.types/conftest.F
Successful compile:
Source:
      program main
      integer(kind=selected_int_kind(10)) i
      real(kind=selected_real_kind(10)) d
      end
Defined "USE_FORTRANKIND" to "1"
Popping language FC
================================================================================
TEST checkFortranDReal from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:148)
TESTING: checkFortranDReal from config.types(config/BuildSystem/config/types.py:148)
  Checks whether dreal is provided in Fortran, and if not defines MISSING_DREAL
Pushing language FC
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.types/conftest.F
sh: C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.F(3): warning #7319: This argument's data type is incompatible with this intrinsic procedure; procedure assumed EXTERNAL. [DREAL]
Successful compile:
Source:
      program main
      double precision d
      d = dreal(3.0)
      end
Pushing language FC
Popping language FC
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.types/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol DREAL referenced in function MAIN__
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: ret = 24576
Defined "MISSING_DREAL" to "1"
Popping language FC
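(Aside, not part of the log: the two complex probes above are the same one-liner in two dialects. The C99 form, which this MSVC rejects for lack of complex.h, is:)

    /* C99 complex probe: compiles on a C99 toolchain; fails here
       exactly as the C1083 error above shows. */
    #include <complex.h>
    int main(void) { double complex x = I; (void)x; return 0; }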
================================================================================
TEST checkConst from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:156)
TESTING: checkConst from config.types(config/BuildSystem/config/types.py:156)
  Checks for working const, and if not found defines it to empty string
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
c:\cygwin\tmp\petsc-1nzsmm\config.types\conftest.c(25) : warning C4700: uninitialized local variable 'x' used
c:\cygwin\tmp\petsc-1nzsmm\config.types\conftest.c(30) : warning C4700: uninitialized local variable 't' used
c:\cygwin\tmp\petsc-1nzsmm\config.types\conftest.c(46) : warning C4700: uninitialized local variable 'b' used
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
int main() {
/* Ultrix mips cc rejects this. */
typedef int charset[2]; const charset x;
/* SunOS 4.1.1 cc rejects this. */
char const *const *ccp;
char **p;
/* NEC SVR4.0.2 mips cc rejects this. */
struct point {int x, y;};
static struct point const zero = {0,0};
/* AIX XL C 1.02.0.0 rejects this.
   It does not let you subtract one const X* pointer from another in an arm
   of an if-expression whose if-part is not a constant expression */
const char *g = "string";
ccp = &g + (g ? g-g : 0);
/* HPUX 7.0 cc rejects these. */
++ccp;
p = (char**) ccp;
ccp = (char const *const *) p;
/* This section avoids unused variable warnings */
if (zero.x);
if (x[0]);
{ /* SCO 3.2v4 cc rejects this. */
  char *t;
  char const *s = 0 ? (char *) 0 : (char const *) 0;
  *t++ = 0;
  if (*s);
}
{ /* Someone thinks the Sun supposedly-ANSI compiler will reject this. */
  int x[] = {25, 17};
  const int *foo = &x[0];
  ++foo;
}
{ /* Sun SC1.0 ANSI compiler rejects this -- but not the above. */
  typedef const int *iptr;
  iptr p = 0;
  ++p;
}
{ /* AIX XL C 1.02.0.0 rejects this saying
     "k.c", line 2.27: 1506-025 (S) Operand must be a modifiable lvalue. */
  struct s { int j; const int *ap[3]; };
  struct s *b;
  b->j = 5;
}
{ /* ULTRIX-32 V3.1 (Rev 9) vcc rejects this */
  const int foo = 10;
  /* Get rid of unused variable warning */
  if (foo);
}
; return 0; }
================================================================================
TEST checkEndian from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:212)
TESTING: checkEndian from config.types(config/BuildSystem/config/types.py:212)
  If the machine is big endian, defines WORDS_BIGENDIAN
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(11) : error C2065: 'bogus' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(11) : error C2146: syntax error : missing ';' before identifier 'endian'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(11) : error C2065: 'endian' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(11) : error C2146: syntax error : missing ';' before identifier 'macros'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.c(13) : error C2065: 'macros' : undeclared identifier
Possible ERROR while running compiler: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#ifdef HAVE_SYS_PARAM_H
#include <sys/param.h>
#endif
int main() {
#if !BYTE_ORDER || !BIG_ENDIAN || !LITTLE_ENDIAN
 bogus endian macros
#endif
; return 0; }
Pushing language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdlib.h>
int main() {
/* Are we little or big endian?  From Harbison&Steele. */
union
{
  long l;
  char c[sizeof(long)];
} u;
u.l = 1;
exit(u.c[sizeof(long) - 1] == 1);
; return 0; }
Pushing language C
Popping language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
Popping language C
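(Aside, not part of the log: the runtime fallback above encodes the answer in the exit status. The same union trick as a standalone program that prints the result:)

    #include <stdio.h>
    int main(void) {
      /* Store 1 in a long and look at where the byte landed. */
      union { long l; char c[sizeof(long)]; } u;
      u.l = 1;
      /* On a big-endian machine the least significant byte is stored
         last, so c[sizeof(long)-1] holds the 1. */
      printf(u.c[sizeof(long) - 1] == 1 ? "big endian\n" : "little endian\n");
      return 0;
    }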
/tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(void *)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: /tmp/petsc-1nzsmm/config.types/conftest.exe Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: Popping language C Defined "SIZEOF_VOID_P" to "8" ================================================================================ TEST checkSizeof from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:265) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:265) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: short Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(short)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o 
/tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: /tmp/petsc-1nzsmm/config.types/conftest.exe Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: Popping language C Defined "SIZEOF_SHORT" to "2" ================================================================================ TEST checkSizeof from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:265) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:265) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: int Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(int)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: /tmp/petsc-1nzsmm/config.types/conftest.exe Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: Popping language C Defined "SIZEOF_INT" to "4" ================================================================================ TEST checkSizeof from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:265) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:265) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: long Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c Executing: 
/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(long)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: /tmp/petsc-1nzsmm/config.types/conftest.exe Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: Popping language C Defined "SIZEOF_LONG" to "4" ================================================================================ TEST checkSizeof from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:265) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:265) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: long long Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(long long)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: /tmp/petsc-1nzsmm/config.types/conftest.exe 
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh:
Popping language C
Defined "SIZEOF_LONG_LONG" to "8"
================================================================================
TEST checkSizeof from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:265)
TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:265)
  Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size
Checking for size of type: float
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stdio.h>
#include <stddef.h>
#endif

int main() {
FILE *f = fopen("conftestval", "w");
if (!f) exit(1);
fprintf(f, "%lu\n", (unsigned long)sizeof(float));
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh: /tmp/petsc-1nzsmm/config.types/conftest.exe
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh:
Popping language C
Defined "SIZEOF_FLOAT" to "4"
================================================================================
TEST checkSizeof from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:265)
TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:265)
  Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size
Checking for size of type: double
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stdio.h>
#include <stddef.h>
#endif

int main() {
FILE *f = fopen("conftestval", "w");
if (!f) exit(1);
fprintf(f, "%lu\n", (unsigned long)sizeof(double));
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh: /tmp/petsc-1nzsmm/config.types/conftest.exe
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh:
Popping language C
Defined "SIZEOF_DOUBLE" to "8"
================================================================================
TEST checkSizeof from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:265)
TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:265)
  Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size
Checking for size of type: size_t
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stdio.h>
#include <stddef.h>
#endif

int main() {
FILE *f = fopen("conftestval", "w");
if (!f) exit(1);
fprintf(f, "%lu\n", (unsigned long)sizeof(size_t));
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh: /tmp/petsc-1nzsmm/config.types/conftest.exe
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh:
Popping language C
Defined "SIZEOF_SIZE_T" to "8"
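For readers skimming this log: each checkSizeof test above compiles, links, and runs a small probe of the following shape, then reads the answer back from the conftestval file to produce the SIZEOF_* define. A minimal standalone sketch of that probe, with the harness headers confdefs.h/conffix.h omitted:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
  /* Write sizeof(type) to a file; configure reads the file back and
     turns the value into a define such as SIZEOF_INT above. */
  FILE *f = fopen("conftestval", "w");
  if (!f) exit(1);
  fprintf(f, "%lu\n", (unsigned long)sizeof(int));
  fclose(f);
  return 0;
}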
================================================================================
TEST checkBitsPerByte from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:316)
TESTING: checkBitsPerByte from config.types(config/BuildSystem/config/types.py:316)
  Determine the number of bits per byte and define BITS_PER_BYTE
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#if STDC_HEADERS
#include <stdlib.h>
#include <stdio.h>
#endif

int main() {
FILE *f = fopen("conftestval", "w");
char val[2];
int i = 0;
if (!f) exit(1);
val[0]='\1';
val[1]='\0';
while(val[0]) {val[0] <<= 1; i++;}
fprintf(f, "%d\n", i);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh: /tmp/petsc-1nzsmm/config.types/conftest.exe
Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe
sh:
Defined "BITS_PER_BYTE" to "8"
================================================================================
TEST checkVisibility from config.types(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/types.py:362)
TESTING: checkVisibility from config.types(config/BuildSystem/config/types.py:362)
================================================================================
TEST configureMemAlign from PETSc.utilities.memAlign(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/memAlign.py:30)
TESTING: configureMemAlign from PETSc.utilities.memAlign(config/PETSc/utilities/memAlign.py:30)
  Choose alignment
Defined "MEMALIGN" to "16"
Memory alignment is 16
================================================================================
TEST configureCHUD from PETSc.utilities.CHUD(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/CHUD.py:25)
TESTING: configureCHUD from PETSc.utilities.CHUD(config/PETSc/utilities/CHUD.py:25)
  Determines if the Apple CHUD hardware monitoring utilities are available
sh: uname -s
Executing: uname -s
sh: CYGWIN_NT-6.1-WOW64
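The checkBitsPerByte probe above needs no platform headers at all: it shifts a set bit through a char until it falls off the end and counts the shifts. A self-contained sketch of the same idea:

#include <stdio.h>

int main(void)
{
  /* Shift a 1 bit left until the char becomes zero; the number of
     shifts is the number of bits per byte (8 here, hence the
     BITS_PER_BYTE "8" result above). */
  char val  = 1;
  int  bits = 0;
  while (val) { val <<= 1; bits++; }
  printf("%d\n", bits);
  return 0;
}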
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
       - libDir may be a list of directories
       - libName may be a list of library names
Checking for function socket in library ['socket', 'nsl'] []
Pushing language C
All intermediate test results are stored in /tmp/petsc-1nzsmm/config.libraries
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char socket();

int main() {
socket()
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lsocket -lnsl
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lsocket -lnsl
sh: LINK : fatal error LNK1104: cannot open file 'libsocket.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libsocket.lib'
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lsocket -lnsl
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char socket();

int main() {
socket()
;
  return 0;
}
Popping language C
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
       - libDir may be a list of directories
       - libName may be a list of library names
Checking for function handle_sigfpes in library ['fpe'] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char handle_sigfpes();

int main() {
handle_sigfpes()
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfpe
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfpe
sh: LINK : fatal error LNK1104: cannot open file 'libfpe.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libfpe.lib'
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfpe
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char handle_sigfpes();

int main() {
handle_sigfpes()
;
  return 0;
}
Popping language C
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
       - libDir may be a list of directories
       - libName may be a list of library names
Checking for function socket in library ['socket', 'nsl'] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char socket();

int main() {
socket()
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lsocket -lnsl
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lsocket -lnsl
sh: LINK : fatal error LNK1104: cannot open file 'libsocket.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libsocket.lib'
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lsocket -lnsl
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char socket();

int main() {
socket()
;
  return 0;
}
Popping language C
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
       - libDir may be a list of directories
       - libName may be a list of library names
Checking for function handle_sigfpes in library ['fpe'] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char handle_sigfpes();

int main() {
handle_sigfpes()
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfpe
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfpe
sh: LINK : fatal error LNK1104: cannot open file 'libfpe.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libfpe.lib'
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfpe
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char handle_sigfpes();

int main() {
handle_sigfpes()
;
  return 0;
}
Popping language C
================================================================================
TEST checkMath from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:224)
TESTING: checkMath from config.libraries(config/BuildSystem/config/libraries.py:224)
  Check for sin() in libm, the math library
Checking for function sin in library [''] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
double sin(double);

int main() {
double x = 0,y; y = sin(x);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
sh:
Popping language C
Checking for function floor in library [''] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
double floor(double);

int main() {
double x = 0,y; y = floor(x);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
Popping language C
Checking for function log10 in library [''] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
double log10(double);

int main() {
double x = 0,y; y = log10(x);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
Popping language C
Checking for function pow in library [''] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
double pow(double, double);

int main() {
double x = 0,y ; y = pow(x, x);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
Popping language C
Math functions are linked in by default
================================================================================
TEST checkMathErf from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:240)
TESTING: checkMathErf from config.libraries(config/BuildSystem/config/libraries.py:240)
  Check for erf() in libm, the math library
Checking for function erf in library [] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
double erf(double);

int main() {
double x = 0,y; y = erf(x);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol erf referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol erf referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
double erf(double);

int main() {
double x = 0,y; y = erf(x);
;
  return 0;
}
Popping language C
Warning: erf() not found
================================================================================
TEST checkMathTgamma from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:249)
TESTING: checkMathTgamma from config.libraries(config/BuildSystem/config/libraries.py:249)
  Check for tgamma() in libm, the math library
Checking for function tgamma in library [] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
double tgamma(double);

int main() {
double x = 0,y; y = tgamma(x);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol tgamma referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol tgamma referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
double tgamma(double);

int main() {
double x = 0,y; y = tgamma(x);
;
  return 0;
}
Popping language C
Warning: tgamma() not found
================================================================================
TEST checkCompression from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:258)
TESTING: checkCompression from config.libraries(config/BuildSystem/config/libraries.py:258)
  Check for libz, the compression library
Checking for function compress in library [''] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
int compress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen);

int main() {
char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0;
ret = compress(dest, &destLen, source, sourceLen);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol compress referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol compress referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
int compress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen);

int main() {
char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0;
ret = compress(dest, &destLen, source, sourceLen);
;
  return 0;
}
Popping language C
Checking for function compress in library ['z'] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
int compress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen);

int main() {
char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0;
ret = compress(dest, &destLen, source, sourceLen);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lz
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lz
sh: LINK : fatal error LNK1104: cannot open file 'libz.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libz.lib'
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lz
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
int compress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen);

int main() {
char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0;
ret = compress(dest, &destLen, source, sourceLen);
;
  return 0;
}
Popping language C
Checking for function compress in library ['zlib.lib'] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
int compress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen);

int main() {
char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0;
ret = compress(dest, &destLen, source, sourceLen);
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o zlib.lib
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o zlib.lib
sh: LINK : fatal error LNK1104: cannot open file 'zlib.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'zlib.lib'
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o zlib.lib
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
int compress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen);

int main() {
char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0;
ret = compress(dest, &destLen, source, sourceLen);
;
  return 0;
}
Popping language C
Warning: No compression library found
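The compression check tried no library, then -lz, then zlib.lib, and gave up. Downstream code then keys off the outcome, so anything compression-related is simply compiled out. A sketch of how such a result is typically consumed (the guard macro name is illustrative, not taken from this log):

#include <stdio.h>

/* HAVE_LIBZ is the shape of define the check above would emit on
   success (it defines HAVE_LIB"libName"); here it is absent, so the
   fallback branch is what gets compiled. */
#if defined(HAVE_LIBZ)
#include <zlib.h>
#endif

int main(void)
{
#if defined(HAVE_LIBZ)
  printf("zlib found: compress() available\n");
#else
  printf("no compression library: storing data uncompressed\n");
#endif
  return 0;
}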
================================================================================
TEST checkRealtime from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:279)
TESTING: checkRealtime from config.libraries(config/BuildSystem/config/libraries.py:279)
  Check for presence of clock_gettime() in realtime library (POSIX Realtime extensions)
Checking for function clock_gettime in library [''] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(7) : error C2079: 'tp' uses undefined struct 'timespec'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(7) : error C2065: 'CLOCK_REALTIME' : undeclared identifier
Possible ERROR while running compiler: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(7) : error C2079: 'tp' uses undefined struct 'timespec'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(7) : error C2065: 'CLOCK_REALTIME' : undeclared identifier
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
#include <time.h>

int main() {
struct timespec tp;
clock_gettime(CLOCK_REALTIME,&tp);;
  return 0;
}
Compile failed inside link
Popping language C
Checking for function clock_gettime in library ['rt'] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(7) : error C2079: 'tp' uses undefined struct 'timespec'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(7) : error C2065: 'CLOCK_REALTIME' : undeclared identifier
Possible ERROR while running compiler: conftest.c
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(7) : error C2079: 'tp' uses undefined struct 'timespec'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(7) : error C2065: 'CLOCK_REALTIME' : undeclared identifier
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
#include <time.h>

int main() {
struct timespec tp;
clock_gettime(CLOCK_REALTIME,&tp);;
  return 0;
}
Compile failed inside link
Popping language C
Warning: No realtime library found
================================================================================
TEST checkDynamic from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:295)
TESTING: checkDynamic from config.libraries(config/BuildSystem/config/libraries.py:295)
  Check for the header and libraries necessary for dynamic library manipulation
================================================================================
TEST configureTimers from PETSc.utilities.timer(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/timer.py:35)
TESTING: configureTimers from PETSc.utilities.timer(config/PETSc/utilities/timer.py:35)
  Sets PETSC_HAVE_FAST_MPI_WTIME PETSC_USE_READ_REAL_TIME PETSC_USE_MICROSOFT_TIME.
Checking for function MPI_CRAY_barrier in library [''] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPI_CRAY_barrier();

int main() {
MPI_CRAY_barrier()
;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
sh: conftest.obj : error LNK2019: unresolved external symbol MPI_CRAY_barrier referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol MPI_CRAY_barrier referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Pushing language C
Popping language C
in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPI_CRAY_barrier();

int main() {
MPI_CRAY_barrier()
;
  return 0;
}
Popping language C
Cray-MPI test failure
================================================================================
TEST configureMissingDefines from PETSc.utilities.missing(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/missing.py:35)
TESTING: configureMissingDefines from PETSc.utilities.missing(config/PETSc/utilities/missing.py:35)
  Checks for limits
All intermediate test results are stored in /tmp/petsc-1nzsmm/PETSc.utilities.missing
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#ifdef PETSC_HAVE_LIMITS_H
#include <limits.h>
#endif

int main() {
int i=INT_MAX;
if (i);
;
  return 0;
}
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#ifdef PETSC_HAVE_FLOAT_H
#include <float.h>
#endif

int main() {
double d=DBL_MAX;
if (d);
;
  return 0;
}
================================================================================
TEST configureMissingUtypeTypedefs from PETSc.utilities.missing(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/missing.py:45)
TESTING: configureMissingUtypeTypedefs from PETSc.utilities.missing(config/PETSc/utilities/missing.py:45)
  Checks if u_short is undefined
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
sh: conftest.c
C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.c(6) : error C2065: 'u_short' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.c(6) : error C2146: syntax error : missing ';' before identifier 'foo'
C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.c(6) : error C2065: 'foo' : undeclared identifier
Possible ERROR while running compiler: conftest.c
C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.c(6) : error C2065: 'u_short' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.c(6) : error C2146: syntax error : missing ';' before identifier 'foo'
C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.c(6) : error C2065: 'foo' : undeclared identifier
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>

int main() {
u_short foo;
;
  return 0;
}
Defined "NEEDS_UTYPE_TYPEDEFS" to "1"
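NEEDS_UTYPE_TYPEDEFS records that the MSVC headers lack the BSD-style u_* integer types, so the project headers have to supply them. A self-contained sketch of the shim such a define enables (the typedef list follows the usual u_char/u_short/u_int/u_long convention and is assumed here, not read out of this log):

#define NEEDS_UTYPE_TYPEDEFS 1   /* normally set by configure in confdefs.h */

#if defined(NEEDS_UTYPE_TYPEDEFS)
typedef unsigned char  u_char;
typedef unsigned short u_short;
typedef unsigned int   u_int;
typedef unsigned long  u_long;
#endif

int main(void)
{
  u_short foo = 0;   /* the declaration the failed conftest above attempted */
  return (int)foo;
}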
================================================================================
TEST configureMissingFunctions from PETSc.utilities.missing(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/missing.py:51)
TESTING: configureMissingFunctions from PETSc.utilities.missing(config/PETSc/utilities/missing.py:51)
  Checks for SOCKETS
Checking for function socket in library ['Ws2_32.lib'] []
Pushing language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
#include <Winsock2.h>

int main() {
socket(0,0,0);;
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib
sh:
Defined "HAVE_LIBWS2_32" to "1"
Popping language C
Adding ['Ws2_32.lib'] to LIBS
Defined "HAVE_WINSOCK2_H" to "1"
Defined "HAVE_SOCKET" to "1"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <Winsock2.h>

int main() {
closesocket(0);
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o Ws2_32.lib
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o Ws2_32.lib
sh:
Defined "HAVE_CLOSESOCKET" to "1"
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.c
sh: conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <Winsock2.h>

int main() {
WSAGetLastError();
  return 0;
}
Pushing language C
Popping language C
sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/PETSc.utilities.missing/conftest.o Ws2_32.lib
================================================================================
TEST configureMissingSignals from PETSc.utilities.missing(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/missing.py:71)
TESTING: configureMissingSignals from PETSc.utilities.missing(config/PETSc/utilities/missing.py:71)
  Check for missing signals, and define MISSING_<signal name> if necessary
Each signal is probed with the same cl command as above and a one-line source of the form:
  #include "confdefs.h"
  #include "conffix.h"
  #include <signal.h>
  int main() { int i=SIGABRT; if (i); return 0; }
Probes that compile (signal present): SIGABRT, SIGFPE, SIGILL, SIGINT, SIGSEGV, SIGTERM
Probes that fail, e.g.
  C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.c(6) : error C2065: 'SIGALRM' : undeclared identifier
  ret = 512
mark the signal as missing:
Defined "MISSING_SIGALRM" to "1"
Defined "MISSING_SIGBUS" to "1"
Defined "MISSING_SIGCHLD" to "1"
Defined "MISSING_SIGCONT" to "1"
Defined "MISSING_SIGHUP" to "1"
Defined "MISSING_SIGKILL" to "1"
Defined "MISSING_SIGPIPE" to "1"
Defined "MISSING_SIGQUIT" to "1"
Defined "MISSING_SIGSTOP" to "1"
Defined "MISSING_SIGSYS" to "1"
Defined "MISSING_SIGTRAP" to "1"
Defined "MISSING_SIGTSTP" to "1"
Defined "MISSING_SIGURG" to "1"
Defined "MISSING_SIGUSR1" to "1"
Defined "MISSING_SIGUSR2" to "1"
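[The MISSING_SIG* flags exist so that portable code only registers handlers for signals the platform actually defines; MSVC's <signal.h> lacks most POSIX signals, as the failures above show. A hedged sketch of such guarded registration, again assuming the PETSC_ prefix from petscconf.h:]

  /* Sketch only -- install handlers, skipping signals this platform lacks. */
  #include "petscconf.h"
  #include <signal.h>

  static void install_handlers(void (*handler)(int))
  {
    signal(SIGSEGV, handler);          /* present even under MSVC (see log) */
  #if !defined(PETSC_MISSING_SIGPIPE)
    signal(SIGPIPE, SIG_IGN);          /* absent under MSVC, so guarded */
  #endif
  #if !defined(PETSC_MISSING_SIGHUP)
    signal(SIGHUP, handler);           /* likewise guarded */
  #endif
  }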
================================================================================
TEST configureMissingGetdomainnamePrototype from PETSc.utilities.missing(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/missing.py:88)
TESTING: configureMissingGetdomainnamePrototype from PETSc.utilities.missing(config/PETSc/utilities/missing.py:88)
Source:
  #include "confdefs.h"
  #include "conffix.h"
  #ifdef PETSC_HAVE_UNISTD_H
  #include <unistd.h>
  #endif
  #ifdef PETSC_HAVE_NETDB_H
  #include <netdb.h>
  #endif
  int main() { char test[10]; int err = getdomainname(test,10); return 0; }
This compiles as C, but the same source compiled as C++ (win32fe cl -TP, pushing language Cxx) fails:
  C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.cc(14) : error C3861: 'getdomainname': identifier not found
  ret = 512
Compile failed inside link
Added prototype int getdomainname(char *, int); to language extern C
================================================================================
TEST configureMissingSrandPrototype from PETSc.utilities.missing(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/missing.py:110)
TESTING: configureMissingSrandPrototype from PETSc.utilities.missing(config/PETSc/utilities/missing.py:110)
Source:
  #include "confdefs.h"
  #include "conffix.h"
  #ifdef PETSC_HAVE_STDLIB_H
  #include <stdlib.h>
  #endif
  int main() { double a; long b=10; srand(b); a=drand48(); return 0; }
This compiles as C, but as C++ fails:
  C:\cygwin\tmp\PE3DC2~1\PETSCU~1.MIS\conftest.cc(13) : error C3861: 'drand48': identifier not found
  ret = 512
Compile failed inside link
Added prototype double drand48(); to language extern C
Added prototype void srand48(long); to language extern C
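[Why compile the identical source twice, once as C and once as C++? The two tests above exploit a language asymmetry: C89 accepts a call to an undeclared function via an implicit "int f()" declaration, while C++ refuses to compile without a prototype in scope. So a C success plus a C++ (-TP) failure means the system headers declare nothing, and configure must inject its own extern "C" prototype. A standalone illustration, not PETSc code:]

  /* Compiles as C89 (implicit declaration, at most a warning) but fails as
     C++ with "identifier not found". Note the implicit declaration also
     wrongly assumes an int return -- exactly why configure adds the real
     prototype "double drand48();" rather than relying on the C behavior. */
  int main(void)
  {
    double a = drand48();   /* no declaration anywhere in MSVC headers */
    return (int)a;
  }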
================================================================================
TEST configureMissingIntelFastPrototypes from PETSc.utilities.missing(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/missing.py:134)
TESTING: configureMissingIntelFastPrototypes from PETSc.utilities.missing(config/PETSc/utilities/missing.py:134)
================================================================================
TEST configureScalarType from PETSc.utilities.scalarTypes(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/scalarTypes.py:37)
TESTING: configureScalarType from PETSc.utilities.scalarTypes(config/PETSc/utilities/scalarTypes.py:37)
  Choose between real and complex numbers
Defined "USE_SCALAR_REAL" to "1"
Scalar type is real
All intermediate test results are stored in /tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes
The probe for isnan() compiles but fails to link:
  #include "confdefs.h"
  #include "conffix.h"
  #include <math.h>
  int main() { double b = 2.0; int a = isnan(b); return 0; }
  conftest.obj : error LNK2019: unresolved external symbol isnan referenced in function main
  C:\cygwin\tmp\PE3DC2~1\PETSCU~1.SCA\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
The analogous probe for isinf() fails the same way (LNK2019: unresolved external symbol isinf).
The Microsoft variants succeed:
  #include "confdefs.h"
  #include "conffix.h"
  #include <float.h>
  int main() { double b = 2.0; int a = _isnan(b); return 0; }
Defined "HAVE__ISNAN" to "1"
  int main() { double b = 2.0; int a = _finite(b); return 0; }
Defined "HAVE__FINITE" to "1"
================================================================================
TEST configurePrecision from PETSc.utilities.scalarTypes(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/scalarTypes.py:73)
TESTING: configurePrecision from PETSc.utilities.scalarTypes(config/PETSc/utilities/scalarTypes.py:73)
  Set the default real number precision for PETSc objects
Defined "USE_REAL_DOUBLE" to "1"
Precision is double
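[These link failures are why the NaN checks matter: MSVC of this era ships _isnan()/_finite() in <float.h> instead of C99's isnan()/isinf(). PETSc hides the difference behind a wrapper (something along the lines of PetscIsInfOrNanReal); the sketch below is illustrative, not the actual PETSc source:]

  /* Sketch of an isnan() shim selected from the flags detected above. */
  #include "petscconf.h"
  #if defined(PETSC_HAVE__ISNAN)
  #include <float.h>
  #endif

  static int is_nan_double(double v)
  {
  #if defined(PETSC_HAVE__ISNAN)
    return _isnan(v);   /* MSVC spelling, confirmed by the probe above */
  #else
    return v != v;      /* portable fallback: NaN compares unequal to itself */
  #endif
  }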
================================================================================
TEST configureLibraryOptions from PETSc.utilities.libraryOptions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/libraryOptions.py:48)
TESTING: configureLibraryOptions from PETSc.utilities.libraryOptions(config/PETSc/utilities/libraryOptions.py:48)
  Sets PETSC_USE_DEBUG, PETSC_USE_INFO, PETSC_USE_LOG, PETSC_USE_CTABLE and PETSC_USE_FORTRAN_KERNELS
Defined "USE_LOG" to "1"
Defined "USE_DEBUG" to "1"
Defined "USE_INFO" to "1"
Defined "USE_CTABLE" to "1"
**********Checking if running on BGL/IBM detected
Checking for function bgl_perfctr_void in library [''] []
The probe (same cl flags as above) compiles but fails to link:
  #include "confdefs.h"
  #include "conffix.h"
  /* Override any gcc2 internal prototype to avoid an error. */
  char bgl_perfctr_void();
  int main() { bgl_perfctr_void(); return 0; }
  conftest.obj : error LNK2019: unresolved external symbol bgl_perfctr_void referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
Checking for function ADIOI_BGL_Open in library [''] []
The analogous probe for ADIOI_BGL_Open() fails the same way (LNK2019: unresolved external symbol ADIOI_BGL_Open).
*********BGL/IBM test failure
Defined "USE_BACKWARD_LOOP" to "1"
Defined "Alignx(a,b)" to " "
Defined "USE_64BIT_INDICES" to "1"
Checking for function __floatdidf in library ['-lgcc_s.1'] []
The probe for __floatdidf() compiles, but linking against -lgcc_s.1 fails:
  LINK : fatal error LNK1104: cannot open file 'libgcc_s.1.lib'
  ret = 512
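[Of the defines above, USE_64BIT_INDICES is the one most visible to user code: it widens PETSc's index type (PetscInt) to 64 bits, which changes printf formats and the valid range of index arithmetic. A minimal sketch of what that means downstream, with a stand-in typedef rather than the real PetscInt machinery:]

  /* Sketch: index type and format selected by the 64-bit-indices flag. */
  #include "petscconf.h"
  #include <stdio.h>
  #if defined(PETSC_USE_64BIT_INDICES)
  typedef long long MyInt;            /* stand-in for PetscInt */
  #define MYINT_FMT "%lld"
  #else
  typedef int MyInt;
  #define MYINT_FMT "%d"
  #endif

  int main(void)
  {
    MyInt n = (MyInt)1 << 30;         /* with 64-bit indices, counts past
                                         2^31-1 also become representable */
    printf("n = " MYINT_FMT "\n", n);
    return 0;
  }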
================================================================================
TEST configureISColorValueType from PETSc.utilities.libraryOptions(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/libraryOptions.py:92)
TESTING: configureISColorValueType from PETSc.utilities.libraryOptions(config/PETSc/utilities/libraryOptions.py:92)
  Sets PETSC_IS_COLOR_VALUE_TYPE, MPIU_COLORING_VALUE, IS_COLORING_MAX required by ISColor
Defined "MPIU_COLORING_VALUE" to "MPI_UNSIGNED_SHORT"
Defined "IS_COLORING_MAX" to "65535"
Defined "IS_COLOR_VALUE_TYPE" to "short"
================================================================================
TEST configureFortranCommandLine from PETSc.utilities.fortranCommandLine(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/utilities/fortranCommandLine.py:27)
TESTING: configureFortranCommandLine from PETSc.utilities.fortranCommandLine(config/PETSc/utilities/fortranCommandLine.py:27)
  Check for the mechanism to retrieve command line arguments in Fortran
Checking for function in library [''] []
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.libraries/conftest.F
Successful compile. Source:
      program main
      integer i
      character*(80) arg
      call get_command_argument(i,arg)
      end
The ifort link also succeeds.
Defined "HAVE_FORTRAN_GET_COMMAND_ARGUMENT" to "1"
Checking for function GET_COMMAND_ARGUMENT in library [''] ['-L/cygdrive/c/cygwin/packages/petsc-3.4.2/\\PROGRA~2\\Intel\\COMPOS~1\\bin\\intel64']
The C probe (char GET_COMMAND_ARGUMENT(); called from main) compiles, but the link fails:
  Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64
  conftest.obj : error LNK2019: unresolved external symbol GET_COMMAND_ARGUMENT referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
  ret = 512
*/ char GET_COMMAND_ARGUMENT(); int main() { GET_COMMAND_ARGUMENT() ; return 0; } Popping language C Checking for function GETARG in library [''] ['-L/cygdrive/c/cygwin/packages/petsc-3.4.2/\\PROGRA~2\\Intel\\COMPOS~1\\bin\\intel64'] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char GETARG(); int main() { GETARG() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Ws2_32.lib sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol GETARG referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol GETARG referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char GETARG(); int main() { GETARG() ; return 0; } Popping language C Checking for function ipxfargc_ sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char ipxfargc_(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char ipxfargc_(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_ipxfargc_) || defined (__stub___ipxfargc_) choke me #else ipxfargc_(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol ipxfargc_ referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol ipxfargc_ referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char ipxfargc_(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char ipxfargc_(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_ipxfargc_) || defined (__stub___ipxfargc_) choke me #else ipxfargc_(); #endif ; return 0; } Checking for function f90_unix_MP_iargc sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char f90_unix_MP_iargc(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char f90_unix_MP_iargc(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_f90_unix_MP_iargc) || defined (__stub___f90_unix_MP_iargc) choke me #else f90_unix_MP_iargc(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol f90_unix_MP_iargc referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol f90_unix_MP_iargc referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char f90_unix_MP_iargc(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char f90_unix_MP_iargc(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_f90_unix_MP_iargc) || defined (__stub___f90_unix_MP_iargc) choke me #else f90_unix_MP_iargc(); #endif ; return 0; } Checking for function PXFGETARG sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char PXFGETARG(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char PXFGETARG(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_PXFGETARG) || defined (__stub___PXFGETARG) choke me #else PXFGETARG(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol PXFGETARG referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol PXFGETARG referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char PXFGETARG(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char PXFGETARG(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_PXFGETARG) || defined (__stub___PXFGETARG) choke me #else PXFGETARG(); #endif ; return 0; } Checking for function iargc_ sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char iargc_(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char iargc_(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_iargc_) || defined (__stub___iargc_) choke me #else iargc_(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol iargc_ referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol iargc_ referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char iargc_(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char iargc_(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_iargc_) || defined (__stub___iargc_) choke me #else iargc_(); #endif ; return 0; } Checking for function GETARG@16 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(10) : error C2018: unknown character '0x40' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(10) : error C2143: syntax error : missing '{' before 'constant' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(10) : error C2059: syntax error : '' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(17) : error C2004: expected 'defined(id)' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(17) : fatal error C1012: unmatched parenthesis : missing ')' Possible ERROR while running compiler: conftest.c C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(10) : error C2018: unknown character '0x40' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(10) : error C2143: syntax error : missing '{' before 'constant' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(10) : error C2059: syntax error : '' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(17) : error C2004: expected 'defined(id)' C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.c(17) : fatal error C1012: unmatched parenthesis : missing ')' ret = 512 Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char GETARG@16(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char GETARG@16(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub_GETARG@16) || defined (__stub___GETARG@16) choke me #else GETARG@16(); #endif ; return 0; } Compile failed inside link Checking for function _gfortran_iargc sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.functions/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char _gfortran_iargc(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char _gfortran_iargc(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub__gfortran_iargc) || defined (__stub____gfortran_iargc) choke me #else _gfortran_iargc(); #endif ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 sh: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol _gfortran_iargc referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: Library Path Not Found: /cygdrive/c/cygwin/packages/petsc-3.4.2/PROGRA~2IntelCOMPOS~1binintel64 conftest.obj : error LNK2019: unresolved external symbol _gfortran_iargc referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.FUN\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.functions/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.functions/conftest.o Ws2_32.lib -L/cygdrive/c/cygwin/packages/petsc-3.4.2/\PROGRA~2\Intel\COMPOS~1\bin\intel64 Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully few prototypes, which can conflict with char _gfortran_iargc(); below. */ #include <assert.h> /* Override any gcc2 internal prototype to avoid an error. */ /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char _gfortran_iargc(); int main() { /* The GNU C library defines this for functions which it implements to always fail with ENOSYS. Some functions are actually named something starting with __ and the normal name is an alias.
*/ #if defined (__stub__gfortran_iargc) || defined (__stub____gfortran_iargc) choke me #else _gfortran_iargc(); #endif ; return 0; } Popping language C Pushing language C ================================================================================ TEST configureLibrary from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:744) TESTING: configureLibrary from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:744) Calls the regular package configureLibrary and then does an additional test needed by MPI ================================================================================== Checking for a functional MPI Checking for library in Package specific search directory MPI: [] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library [] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib sh: conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['fmpich2.lib', 'fmpich2g.lib', 'fmpich2s.lib', 'mpi.lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['fmpich2.lib', 'fmpich2g.lib', 'fmpich2s.lib', 'mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib fmpich2g.lib fmpich2s.lib mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib fmpich2g.lib fmpich2s.lib mpi.lib Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'fmpich2.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'fmpich2.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib fmpich2g.lib fmpich2s.lib mpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['fmpich2.lib', 'fmpich2g.lib', 'mpi.lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['fmpich2.lib', 'fmpich2g.lib', 'mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib fmpich2g.lib mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib fmpich2g.lib mpi.lib Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'fmpich2.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'fmpich2.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib fmpich2g.lib mpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['fmpich2.lib', 'mpich2.lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['fmpich2.lib', 'mpich2.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib mpich2.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib mpich2.lib Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'fmpich2.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'fmpich2.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o fmpich2.lib mpich2.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['libfmpich2g.a', 'libmpi.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['libfmpich2g.a', 'libmpi.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich2g -lmpi Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich2g -lmpi Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libfmpich2g.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libfmpich2g.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich2g -lmpi Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['libfmpich.a', 'libmpich.a', 'libpmpich.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['libfmpich.a', 'libmpich.a', 'libpmpich.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libfmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libfmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['libmpich.a', 'libpmpich.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['libmpich.a', 'libpmpich.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/
char MPI_Init();
int main() {
  MPI_Init();
  return 0;
}
Pushing language C
Popping language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich -lmpich -lpmpich -lpmpich Ws2_32.lib
sh: LINK : fatal error LNK1104: cannot open file 'libfmpich.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libfmpich.lib'
ret = 512
Popping language C

The identical test cycle then repeats for every remaining candidate library list. Each iteration logs the same banner:

================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
  - libDir may be a list of directories
  - libName may be a list of library names

compiles the same conftest.c successfully (with the identical cl command and -I/tmp/petsc-1nzsmm/... include flags each time), and then fails at the link step with LNK1104. Note that win32fe translates each Unix-style candidate (e.g. 'libmpich.a', passed to cl as -lmpich) into the MSVC import-library name libmpich.lib, which is the file the linker then cannot open. The candidate lists tried, in order, and the file each link attempt failed on:

  ['libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']            -> libmpich.lib
  ['libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']                    -> libmpich.lib
  ['libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']        -> libmpich.lib
  ['libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']      -> libmpich.lib
  ['libmpich.a', 'libgm.a', 'libpthread.a']                                            -> libmpich.lib
  ['liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']                             -> liblamf77mpi.lib
  ['liblammpi++.a', 'libmpi.a', 'liblam.a']                                            -> liblammpi++.lib
  ['liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']              -> liblammpio.lib
  ['liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']              -> liblammpio.lib
  ['liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']                                -> liblammpio.lib
  ['liblammpi++.a', 'libmpi.a', 'liblam.a']   (listed twice in the log)                -> liblammpi++.lib
  ['libmpi.a', 'liblam.a']                                                             -> libmpi.lib
  ['amd64/msmpifec.lib', 'amd64/msmpi.lib']   (win32fe: File Not Found warnings)       -> amd64/msmpifec.lib
  ['i386/msmpifec.lib', 'i386/msmpi.lib']     (win32fe: File Not Found warnings)       -> i386/msmpifec.lib
  ['libmpich.a', 'libpthread.a']                                                       -> libmpich.lib
  ['libmpi++.a', 'libmpi.a']                                                           -> libmpi++.lib
  ['libmpi.a']                                                                         -> libmpi.lib
  ['libmpich.a']                                                                       -> libmpich.lib
  ['mpi.lib']                                                                          -> mpi.lib
  ['mpich2.lib']                                                                       -> mpich2.lib
  ['mpich.lib']                                                                        -> mpich.lib
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o amd64/msmpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o amd64/msmpi.lib Ws2_32.lib sh: Warning: win32fe: File Not Found: amd64/msmpi.lib LINK : fatal error LNK1104: cannot open file 'amd64/msmpi.lib' Possible ERROR while running linker: output: Warning: win32fe: File Not Found: amd64/msmpi.lib LINK : fatal error LNK1104: cannot open file 'amd64/msmpi.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o amd64/msmpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['i386/msmpi.lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['i386/msmpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o i386/msmpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o i386/msmpi.lib Ws2_32.lib sh: Warning: win32fe: File Not Found: i386/msmpi.lib LINK : fatal error LNK1104: cannot open file 'i386/msmpi.lib' Possible ERROR while running linker: output: Warning: win32fe: File Not Found: i386/msmpi.lib LINK : fatal error LNK1104: cannot open file 'i386/msmpi.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o i386/msmpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: [] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library [] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib sh: conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/fmpich2.lib', 'lib64/fmpich2g.lib', 'lib64/fmpich2s.lib', 'lib64/mpi.lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/fmpich2.lib', 'lib64/fmpich2g.lib', 'lib64/fmpich2s.lib', 'lib64/mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/fmpich2g.lib lib64/fmpich2s.lib lib64/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/fmpich2g.lib lib64/fmpich2s.lib lib64/mpi.lib Ws2_32.lib sh: Warning: win32fe: File Not Found: lib64/fmpich2.lib Warning: win32fe: File Not Found: lib64/fmpich2g.lib Warning: win32fe: File Not Found: lib64/fmpich2s.lib Warning: win32fe: File Not Found: lib64/mpi.lib LINK : fatal error LNK1104: cannot open file 'lib64/fmpich2.lib' Possible ERROR while running linker: output: Warning: win32fe: File Not Found: lib64/fmpich2.lib Warning: win32fe: File Not Found: lib64/fmpich2g.lib Warning: win32fe: File Not Found: lib64/fmpich2s.lib Warning: win32fe: File Not Found: lib64/mpi.lib LINK : fatal error LNK1104: cannot open file 'lib64/fmpich2.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/fmpich2g.lib lib64/fmpich2s.lib lib64/mpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/fmpich2.lib', 'lib64/fmpich2g.lib', 'lib64/mpi.lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/fmpich2.lib', 'lib64/fmpich2g.lib', 'lib64/mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include 
"confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/fmpich2g.lib lib64/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/fmpich2g.lib lib64/mpi.lib Ws2_32.lib sh: Warning: win32fe: File Not Found: lib64/fmpich2.lib Warning: win32fe: File Not Found: lib64/fmpich2g.lib Warning: win32fe: File Not Found: lib64/mpi.lib LINK : fatal error LNK1104: cannot open file 'lib64/fmpich2.lib' Possible ERROR while running linker: output: Warning: win32fe: File Not Found: lib64/fmpich2.lib Warning: win32fe: File Not Found: lib64/fmpich2g.lib Warning: win32fe: File Not Found: lib64/mpi.lib LINK : fatal error LNK1104: cannot open file 'lib64/fmpich2.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/fmpich2g.lib lib64/mpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/fmpich2.lib', 'lib64/mpich2.lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/fmpich2.lib', 'lib64/mpich2.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/mpich2.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/mpich2.lib Ws2_32.lib sh: Warning: win32fe: File Not Found: lib64/fmpich2.lib Warning: win32fe: File Not Found: lib64/mpich2.lib LINK : fatal error LNK1104: cannot open file 'lib64/fmpich2.lib' Possible ERROR while running linker: output: Warning: win32fe: File Not Found: lib64/fmpich2.lib Warning: win32fe: File Not Found: lib64/mpich2.lib LINK : fatal error LNK1104: cannot open file 'lib64/fmpich2.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/fmpich2.lib lib64/mpich2.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libfmpich2g.a', 'libmpi.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libfmpich2g.a', 'libmpi.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich2g -lmpi Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich2g -lmpi Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libfmpich2g.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libfmpich2g.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich2g -lmpi Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libfmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libfmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libmpich.a', 'libpmpich.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libmpich.a', 'libpmpich.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich -lmpich -lpmpich -lpmpich Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich -lmpich -lpmpich -lpmpich Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libfmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libfmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lfmpich -lmpich -lpmpich -lmpich -lpmpich -lpmpich Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich -lmpich -lpmpich -lpmpich Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich -lmpich -lpmpich -lpmpich Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lpmpich -lmpich -lpmpich -lpmpich Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lrt -laio -lsnl -lpthread Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lrt -laio -lsnl -lpthread Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lrt -laio -lsnl -lpthread Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lssl -luuid -lpthread -lrt -ldl Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lssl -luuid -lpthread -lrt -ldl Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lssl -luuid -lpthread -lrt -ldl Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lnsl -lsocket -lrt -lnsl -lsocket Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lnsl -lsocket -lrt -lnsl -lsocket Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lnsl -lsocket -lrt -lnsl -lsocket Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/libmpich.a', 'libgm.a', 'libpthread.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/libmpich.a', 'libgm.a', 'libpthread.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lgm -lpthread Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lgm -lpthread Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lmpich -lgm -lpthread Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/
char MPI_Init();

int main() {
  MPI_Init();
  return 0;
}

Pushing language C
Popping language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -llamf77mpi -lmpi++ -lmpi -llam Ws2_32.lib
LINK : fatal error LNK1104: cannot open file 'liblamf77mpi.lib'
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'liblamf77mpi.lib'
ret = 512

[The log then repeats one identical test cycle per candidate MPI library set. Each cycle announces the set, recompiles the same conftest.c probe, and fails at the link step; each command appears once below, though the raw log echoes every command twice. The shape of each cycle:]

================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
  - libDir may be a list of directories
  - libName may be a list of library names
Checking for function MPI_Init in library [<candidate set>] []
Pushing language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPI_Init();

int main() {
  MPI_Init();
  return 0;
}
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o <link arguments for the set> Ws2_32.lib
Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file '<first missing library>'
ret = 512
Popping language C

[Candidate sets tried in the package-specific search directory, with the file the linker could not open. win32fe maps each Unix-style -lfoo onto libfoo.lib, and bare .lib paths are passed through; "File Not Found" marks the cases where win32fe itself warned before the linker failed:]

  ['lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']                  -> liblamf77mpi.lib
  ['lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']                                 -> liblammpi++.lib
  ['lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']   -> liblammpio.lib
  ['lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']   -> liblammpio.lib
  ['lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']                     -> liblammpio.lib
  ['lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']                                 -> liblammpi++.lib
  ['lib64/libmpi.a', 'liblam.a']                                                  -> libmpi.lib
  ['lib64/amd64/msmpifec.lib', 'lib64/amd64/msmpi.lib']                           -> lib64/amd64/msmpifec.lib (File Not Found)
  ['lib64/i386/msmpifec.lib', 'lib64/i386/msmpi.lib']                             -> lib64/i386/msmpifec.lib (File Not Found)
  ['lib64/libmpich.a', 'libpthread.a']                                            -> libmpich.lib
  ['lib64/libmpi++.a', 'libmpi.a']                                                -> libmpi++.lib
  ['lib64/libmpi.a']                                                              -> libmpi.lib
  ['lib64/libmpich.a']                                                            -> libmpich.lib
  ['lib64/mpi.lib']                                                               -> lib64/mpi.lib (File Not Found)
  ['lib64/mpich2.lib']                                                            -> lib64/mpich2.lib (File Not Found)
  ['lib64/mpich.lib']                                                             -> lib64/mpich.lib (File Not Found)
  ['lib64/amd64/msmpi.lib']                                                       -> lib64/amd64/msmpi.lib (File Not Found)
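[For reference, the probe being rebuilt each time is the stock autoconf-style link test, shown verbatim in the cycles above. A standalone sketch, with the generated confdefs.h/conffix.h includes dropped so it builds anywhere, follows; the trailing cl command is a hypothetical illustration, not a command taken from this log:]

    /* conftest.c: link-time probe for MPI_Init.
       The deliberately bogus "char MPI_Init();" prototype gives the
       compiler *a* declaration without pulling in mpi.h; the point is
       only that the linker must resolve the real MPI_Init symbol from
       whatever libraries appear on the link line. */
    char MPI_Init();

    int main() {
        MPI_Init();   /* leaves an undefined reference for the linker */
        return 0;
    }

    /* Hypothetical manual check against MS-MPI with cl, assuming
       msmpi.lib sits in the current directory:

           cl conftest.c msmpi.lib Ws2_32.lib

       The link succeeds only if msmpi.lib really provides MPI_Init.
       LNK1104, as throughout this log, means the .lib file itself could
       not be opened (wrong or missing path), not a missing symbol. */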
[The final relative-path candidate, ['lib64/i386/msmpi.lib'], fails the same way:]

Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o lib64/i386/msmpi.lib Ws2_32.lib
Warning: win32fe: File Not Found: lib64/i386/msmpi.lib
LINK : fatal error LNK1104: cannot open file 'lib64/i386/msmpi.lib'
ret = 512
Popping language C

[With the relative-path candidates exhausted, configure moves on to the standard install locations. /opt/mpich does not exist, so every candidate set rooted there is skipped, first under /opt/mpich/lib and then under /opt/mpich/lib64, with one message per set across the MPICH2, MPICH, LAM, and MS-MPI layouts:]

Directory does not exist: /opt/mpich (while checking "Package specific search directory MPI" for "[]")
Directory does not exist: /opt/mpich (while checking "Package specific search directory MPI" for "['/opt/mpich/lib/fmpich2.lib', '/opt/mpich/lib/fmpich2g.lib', '/opt/mpich/lib/fmpich2s.lib', '/opt/mpich/lib/mpi.lib']")

[... one such line per candidate set ...]

[The same happens for /usr/lpp/ppe.poe (IBM POE), again for both lib and lib64:]

Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "[]")

[... one such line per candidate set, through:]

Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']")
Directory does not exist:
/usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/amd64/msmpifec.lib', '/usr/lpp/ppe.poe/lib64/amd64/msmpi.lib']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/i386/msmpifec.lib', '/usr/lpp/ppe.poe/lib64/i386/msmpi.lib']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/libmpi.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for 
"['/usr/lpp/ppe.poe/lib64/libmpich.a']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/mpi.lib']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/mpich2.lib']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/mpich.lib']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/amd64/msmpi.lib']") Directory does not exist: /usr/lpp/ppe.poe (while checking "Package specific search directory MPI" for "['/usr/lpp/ppe.poe/lib64/i386/msmpi.lib']") Checking for library in Package specific search directory MPI: [] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library [] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
The same TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) is then repeated for each candidate set under /usr/local/lib (Contents: ['bin', 'etc', 'include', 'lib'] in every case). Each iteration recompiles the identical conftest.c shown above, successfully, and then attempts the link. The first three sets pass the .lib paths to the linker directly; the later ones use -L/usr/local/lib -l<name>, which win32fe/cl turns into a search for lib<name>.lib. Only the candidate set and the linker result change from test to test, so only those are listed; every link fails with LNK1104 (ret = 512), i.e. the named library file does not exist under C:\cygwin\usr\local\lib:

Checking for function MPI_Init in library ['/usr/local/lib/fmpich2.lib', '/usr/local/lib/fmpich2g.lib', '/usr/local/lib/fmpich2s.lib', '/usr/local/lib/mpi.lib'] []
    LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\fmpich2.lib'
Checking for function MPI_Init in library ['/usr/local/lib/fmpich2.lib', '/usr/local/lib/fmpich2g.lib', '/usr/local/lib/mpi.lib'] []
    LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\fmpich2.lib'
Checking for function MPI_Init in library ['/usr/local/lib/fmpich2.lib', '/usr/local/lib/mpich2.lib'] []
    LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\fmpich2.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libfmpich2g.a', 'libmpi.a'] []
    LINK : fatal error LNK1104: cannot open file 'libfmpich2g.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a'] []
    LINK : fatal error LNK1104: cannot open file 'libfmpich.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libmpich.a', 'libpmpich.a'] []
    LINK : fatal error LNK1104: cannot open file 'libmpich.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] []
    LINK : fatal error LNK1104: cannot open file 'libfmpich.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] []
    LINK : fatal error LNK1104: cannot open file 'libmpich.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a'] []
    LINK : fatal error LNK1104: cannot open file 'libmpich.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a'] []
    LINK : fatal error LNK1104: cannot open file 'libmpich.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a'] []
    LINK : fatal error LNK1104: cannot open file 'libmpich.lib'
Checking for function MPI_Init in library ['/usr/local/lib/libmpich.a', 'libgm.a', 'libpthread.a'] []
    LINK : fatal error LNK1104: cannot open file 'libmpich.lib'
Checking for function MPI_Init in library ['/usr/local/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a'] []
    LINK : fatal error LNK1104: cannot open file 'liblamf77mpi.lib'
Checking for function MPI_Init in library ['/usr/local/lib/liblammpi++.a', 'libmpi.a', 'liblam.a'] []
    LINK : fatal error LNK1104: cannot open file 'liblammpi++.lib'
Checking for function MPI_Init in library ['/usr/local/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a'] []
    LINK : fatal error LNK1104: cannot open file 'liblammpio.lib'
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -llamf77mpi -lmpi -llam Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -llamf77mpi -lmpi -llam Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'liblammpio.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'liblammpio.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -llamf77mpi -lmpi -llam Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -llamf90mpi -lmpi -llam Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -llamf90mpi -lmpi -llam Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'liblammpio.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'liblammpio.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -llamf90mpi -lmpi -llam Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -lmpi -llam Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -lmpi -llam Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'liblammpio.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'liblammpio.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpio -lpmpi -lmpi -llam Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/liblammpi++.a', 'libmpi.a', 'liblam.a'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/liblammpi++.a', 'libmpi.a', 'liblam.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpi++ -lmpi -llam Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpi++ -lmpi -llam Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'liblammpi++.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'liblammpi++.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -llammpi++ -lmpi -llam Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/libmpi.a', 'liblam.a'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/libmpi.a', 'liblam.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi -llam Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi -llam Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpi.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpi.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi -llam Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/amd64/msmpifec.lib', '/usr/local/lib/amd64/msmpi.lib'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/amd64/msmpifec.lib', '/usr/local/lib/amd64/msmpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/amd64/msmpifec.lib /usr/local/lib/amd64/msmpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/amd64/msmpifec.lib /usr/local/lib/amd64/msmpi.lib Ws2_32.lib sh: Warning: win32fe: File Not Found: /usr/local/lib/amd64/msmpifec.lib Warning: win32fe: File Not Found: /usr/local/lib/amd64/msmpi.lib cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/amd64/msmpifec.lib' cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/amd64/msmpi.lib' conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: File Not Found: /usr/local/lib/amd64/msmpifec.lib Warning: win32fe: File Not Found: /usr/local/lib/amd64/msmpi.lib cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/amd64/msmpifec.lib' cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/amd64/msmpi.lib' conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/amd64/msmpifec.lib /usr/local/lib/amd64/msmpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/i386/msmpifec.lib', '/usr/local/lib/i386/msmpi.lib'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/i386/msmpifec.lib', '/usr/local/lib/i386/msmpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/i386/msmpifec.lib /usr/local/lib/i386/msmpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/i386/msmpifec.lib /usr/local/lib/i386/msmpi.lib Ws2_32.lib sh: Warning: win32fe: File Not Found: /usr/local/lib/i386/msmpifec.lib Warning: win32fe: File Not Found: /usr/local/lib/i386/msmpi.lib cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/i386/msmpifec.lib' cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/i386/msmpi.lib' conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: File Not Found: /usr/local/lib/i386/msmpifec.lib Warning: win32fe: File Not Found: /usr/local/lib/i386/msmpi.lib cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/i386/msmpifec.lib' cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/i386/msmpi.lib' conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/i386/msmpifec.lib /usr/local/lib/i386/msmpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/libmpich.a', 'libpthread.a'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/libmpich.a', 'libpthread.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpich -lpthread Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpich -lpthread Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpich -lpthread Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/libmpi++.a', 'libmpi.a'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/libmpi++.a', 'libmpi.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi++ -lmpi Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi++ -lmpi Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpi++.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpi++.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi++ -lmpi Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/libmpi.a'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/libmpi.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpi.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpi.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpi Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/libmpich.a'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/libmpich.a'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpich Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpich Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'libmpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib -L/usr/local/lib -lmpich Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/mpi.lib'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpi.lib Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\mpi.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\mpi.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/mpich2.lib'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/mpich2.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpich2.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpich2.lib Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\mpich2.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\mpich2.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpich2.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/mpich.lib'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/mpich.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpich.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpich.lib Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\mpich.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'C:\cygwin\usr\local\lib\mpich.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/mpich.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/amd64/msmpi.lib'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/amd64/msmpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/amd64/msmpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/amd64/msmpi.lib Ws2_32.lib sh: Warning: win32fe: File Not Found: /usr/local/lib/amd64/msmpi.lib cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/amd64/msmpi.lib' conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: Warning: win32fe: File Not Found: /usr/local/lib/amd64/msmpi.lib cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/amd64/msmpi.lib' conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/amd64/msmpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib/i386/msmpi.lib'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib/i386/msmpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: 
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPI_Init();
int main() {
  MPI_Init();
  return 0;
}
Pushing language C
Popping language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /usr/local/lib/i386/msmpi.lib Ws2_32.lib
Possible ERROR while running linker: output:
Warning: win32fe: File Not Found: /usr/local/lib/i386/msmpi.lib
cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib/i386/msmpi.lib'
conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Popping language C

Checking for library in Package specific search directory MPI (Contents: ['bin', 'etc', 'include', 'lib']). Configure then repeats the identical test for each MPI installation it knows how to guess:

================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
  - libDir may be a list of directories
  - libName may be a list of library names
================================================================================

Every probe first compiles the same conftest.c successfully:

Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c

and then fails at the link step (win32fe cl -o conftest.exe -MT -wd4996 -Z7 conftest.o <candidate MPI libraries> Ws2_32.lib), each attempt ending with ret = 512. Probes that pass a -L/usr/local/lib64 search path additionally warn twice: Warning: win32fe: Library Path Not Found: /usr/local/lib64. "Checking for function MPI_Init in library", in order:

1. [] -> conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main; fatal error LNK1120: 1 unresolved externals
2. ['/usr/local/lib64/fmpich2.lib', '/usr/local/lib64/fmpich2g.lib', '/usr/local/lib64/fmpich2s.lib', '/usr/local/lib64/mpi.lib'] -> Warning: win32fe: File Not Found (all four); cl warning D9002 (options ignored); LNK2019 unresolved MPI_Init
3. ['/usr/local/lib64/fmpich2.lib', '/usr/local/lib64/fmpich2g.lib', '/usr/local/lib64/mpi.lib'] -> File Not Found (all three); LNK2019 unresolved MPI_Init
4. ['/usr/local/lib64/fmpich2.lib', '/usr/local/lib64/mpich2.lib'] -> File Not Found (both); LNK2019 unresolved MPI_Init
5. ['/usr/local/lib64/libfmpich2g.a', 'libmpi.a'] via -lfmpich2g -lmpi -> LINK : fatal error LNK1104: cannot open file 'libfmpich2g.lib'
6. ['/usr/local/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a'] -> LNK1104: cannot open file 'libfmpich.lib'
7. ['/usr/local/lib64/libmpich.a', 'libpmpich.a'] -> LNK1104: cannot open file 'libmpich.lib'
8. ['/usr/local/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] -> LNK1104: cannot open file 'libfmpich.lib'
9. ['/usr/local/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a'] -> LNK1104: cannot open file 'libmpich.lib'
10. ['/usr/local/lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a'] -> LNK1104: cannot open file 'libmpich.lib'
11. ['/usr/local/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a'] -> LNK1104: cannot open file 'libmpich.lib'
12. ['/usr/local/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a'] -> LNK1104: cannot open file 'libmpich.lib'
13. ['/usr/local/lib64/libmpich.a', 'libgm.a', 'libpthread.a'] -> LNK1104: cannot open file 'libmpich.lib'
14. ['/usr/local/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a'] -> LNK1104: cannot open file 'liblamf77mpi.lib'
15. ['/usr/local/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a'] -> LNK1104: cannot open file 'liblammpi++.lib'
16. ['/usr/local/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a'] -> LNK1104: cannot open file 'liblammpio.lib'
17. ['/usr/local/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a'] -> LNK1104: cannot open file 'liblammpio.lib'
18. ['/usr/local/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a'] -> LNK1104: cannot open file 'liblammpio.lib'
19. ['/usr/local/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a'] (retried) -> LNK1104: cannot open file 'liblammpi++.lib'
20. ['/usr/local/lib64/libmpi.a', 'liblam.a'] -> conftest.c compiles successfully; the link attempt follows.
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib64 -L/usr/local/lib64 -lmpi -llam Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib64 -L/usr/local/lib64 -lmpi -llam Ws2_32.lib sh: Warning: win32fe: Library Path Not Found: /usr/local/lib64 Warning: win32fe: Library Path Not Found: /usr/local/lib64 LINK : fatal error LNK1104: cannot open file 'libmpi.lib' Possible ERROR while running linker: output: Warning: win32fe: Library Path Not Found: /usr/local/lib64 Warning: win32fe: Library Path Not Found: /usr/local/lib64 LINK : fatal error LNK1104: cannot open file 'libmpi.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -L/usr/local/lib64 -L/usr/local/lib64 -lmpi -llam Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/usr/local/lib64/amd64/msmpifec.lib', '/usr/local/lib64/amd64/msmpi.lib'] Contents: ['bin', 'etc', 'include', 'lib'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/usr/local/lib64/amd64/msmpifec.lib', '/usr/local/lib64/amd64/msmpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
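Why the linker complains about .lib files when the candidate lists name .a archives: win32fe evidently translates Unix-style -l flags into the MSVC naming convention, so -llammpio is looked up as liblammpio.lib (this is inferred from the log output above, not from win32fe's source). A hypothetical C sketch of that name mapping:

  #include <stdio.h>
  #include <stddef.h>

  /* Illustration only: "-lfoo" becomes "libfoo.lib", since cl/link consume
     .lib files, so a Unix-style library request must be rewritten to the
     MSVC naming scheme before linking. */
  static void unix_flag_to_msvc_lib(const char *lflag, char *out, size_t n) {
      snprintf(out, n, "lib%s.lib", lflag + 2); /* skip the "-l" prefix */
  }

  int main(void) {
      char name[64];
      unix_flag_to_msvc_lib("-llammpio", name, sizeof name);
      printf("%s\n", name); /* liblammpio.lib, the file LNK1104 cannot open */
      return 0;
  }

The practical consequence in this log: the LAM/MPICH .a archives under /usr/local/lib64 are never consulted, because cl's linker only looks for the rewritten .lib names, and win32fe cannot see the directory at all (Library Path Not Found), so every -L/-l style candidate fails with LNK1104.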
Checking for function MPI_Init in library ['/usr/local/lib64/amd64/msmpifec.lib', '/usr/local/lib64/amd64/msmpi.lib']
  Warning: win32fe: File Not Found: /usr/local/lib64/amd64/msmpifec.lib
  Warning: win32fe: File Not Found: /usr/local/lib64/amd64/msmpi.lib
  cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib64/amd64/msmpifec.lib'
  cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib64/amd64/msmpi.lib'
  conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals

Checking for function MPI_Init in library ['/usr/local/lib64/i386/msmpifec.lib', '/usr/local/lib64/i386/msmpi.lib']
  The identical failure with the i386 paths: File Not Found for both files, D9002 for both options, LNK2019 unresolved MPI_Init, LNK1120.

Checking for function MPI_Init in library ['/usr/local/lib64/libmpich.a', 'libpthread.a']
  Warning: win32fe: Library Path Not Found: /usr/local/lib64
  LINK : fatal error LNK1104: cannot open file 'libmpich.lib'

Checking for function MPI_Init in library ['/usr/local/lib64/libmpi++.a', 'libmpi.a']
  Warning: win32fe: Library Path Not Found: /usr/local/lib64
  LINK : fatal error LNK1104: cannot open file 'libmpi++.lib'

Checking for function MPI_Init in library ['/usr/local/lib64/libmpi.a']
  Warning: win32fe: Library Path Not Found: /usr/local/lib64
  LINK : fatal error LNK1104: cannot open file 'libmpi.lib'

Checking for function MPI_Init in library ['/usr/local/lib64/libmpich.a']
  Warning: win32fe: Library Path Not Found: /usr/local/lib64
  LINK : fatal error LNK1104: cannot open file 'libmpich.lib'

Checking for function MPI_Init in library ['/usr/local/lib64/mpi.lib']
  Warning: win32fe: File Not Found: /usr/local/lib64/mpi.lib
  cl : Command line warning D9002 : ignoring unknown option '/usr/local/lib64/mpi.lib'
  conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main
  C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals

The identical File Not Found / D9002 / LNK2019 / LNK1120 sequence then repeats for ['/usr/local/lib64/mpich2.lib'], ['/usr/local/lib64/mpich.lib'], ['/usr/local/lib64/amd64/msmpi.lib'], and ['/usr/local/lib64/i386/msmpi.lib'].
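The LNK2019 failures make explicit what each TEST actually verifies: nothing MPI-specific is compiled, only whether the symbol MPI_Init can be resolved at link time. Below is the probe exactly as configure generates it (the code is verbatim from the log; the comments are added here for explanation; confdefs.h and conffix.h are configure-generated headers that exist only in the conftest directory):

  #include "confdefs.h"   /* configure-generated defines for this run */
  #include "conffix.h"
  /* Override any gcc2 internal prototype to avoid an error. */
  char MPI_Init();        /* dummy prototype: no mpi.h is needed, because only
                             link-time resolution of the symbol is tested */
  int main() {
      MPI_Init();         /* emits an unresolved reference that the candidate
                             libraries must satisfy for the check to pass */
      return 0;
  }

When the candidate .lib files do not exist, cl drops them with warning D9002, the reference to MPI_Init is never satisfied, and the test fails with LNK2019/LNK1120 instead of LNK1104.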
Popping language C
Directory does not exist: /c/Program Files/Microsoft HPC Pack 2008 SDK
The same "Directory does not exist" line is reported once per candidate set while checking "Package specific search directory MPI", first for every combination under the SDK's lib/ directory and then again under lib64/ (each pass starting from an initial empty candidate list "[]"): the fmpich2.lib / fmpich2g.lib / fmpich2s.lib / mpi.lib sets, fmpich2.lib with mpich2.lib, libfmpich2g.a with libmpi.a, the libfmpich.a / libmpich.a / libpmpich.a combinations, libmpich.a paired with librt.a, libaio.a, libsnl.a, libpthread.a, libssl.a, libuuid.a, libdl.a, libnsl.a, libsocket.a, or libgm.a in the usual groupings, the LAM sets (liblamf77mpi.a, liblammpi++.a, liblammpio.a with libpmpi.a and the lamf77mpi/lamf90mpi variants, libmpi.a with liblam.a), amd64/ and i386/ msmpifec.lib with msmpi.lib, libmpich.a with libpthread.a, libmpi++.a with libmpi.a, libmpi.a alone, libmpich.a alone, mpi.lib, mpich2.lib, and mpich.lib. For the final lib64/ candidate the log continues:
Directory does not exist: /c/Program Files/Microsoft HPC Pack 2008
SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/mpich.lib']") Directory does not exist: /c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2g.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2s.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib/mpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2g.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib/mpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib/mpich2.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory 
MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/amd64/msmpifec.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/i386/msmpifec.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpi.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while 
checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/mpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/mpich2.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/mpich.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2g.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2s.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2g.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpich2.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'librt.a', 
'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/amd64/msmpifec.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/i386/msmpifec.lib', '/c/Program Files/Microsoft Compute Cluster Pack/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: 
/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpi.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpich2.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpich.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/c/Program Files/Microsoft Compute Cluster Pack/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/fmpich2.lib', '/c/Program Files/MPICH2/lib/fmpich2g.lib', '/c/Program Files/MPICH2/lib/fmpich2s.lib', '/c/Program Files/MPICH2/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/fmpich2.lib', '/c/Program Files/MPICH2/lib/fmpich2g.lib', '/c/Program Files/MPICH2/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/fmpich2.lib', '/c/Program Files/MPICH2/lib/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for 
"['/c/Program Files/MPICH2/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/amd64/msmpifec.lib', '/c/Program Files/MPICH2/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/i386/msmpifec.lib', '/c/Program Files/MPICH2/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpi.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/libmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/mpich.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program 
Files/MPICH2/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/fmpich2.lib', '/c/Program Files/MPICH2/lib64/fmpich2g.lib', '/c/Program Files/MPICH2/lib64/fmpich2s.lib', '/c/Program Files/MPICH2/lib64/mpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/fmpich2.lib', '/c/Program Files/MPICH2/lib64/fmpich2g.lib', '/c/Program Files/MPICH2/lib64/mpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/fmpich2.lib', '/c/Program Files/MPICH2/lib64/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package 
specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/amd64/msmpifec.lib', '/c/Program Files/MPICH2/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/i386/msmpifec.lib', '/c/Program Files/MPICH2/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpi.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/libmpich.a']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/mpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/mpich.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH2 (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH2/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/fmpich2.lib', '/c/Program Files/MPICH/lib/fmpich2g.lib', '/c/Program Files/MPICH/lib/fmpich2s.lib', '/c/Program Files/MPICH/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/fmpich2.lib', '/c/Program Files/MPICH/lib/fmpich2g.lib', '/c/Program Files/MPICH/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/fmpich2.lib', '/c/Program Files/MPICH/lib/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libfmpich2g.a', 
'libmpi.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/amd64/msmpifec.lib', '/c/Program Files/MPICH/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/i386/msmpifec.lib', '/c/Program Files/MPICH/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpich.a', 'libpthread.a']") 
Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpi.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/libmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/mpich.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/fmpich2.lib', '/c/Program Files/MPICH/lib64/fmpich2g.lib', '/c/Program Files/MPICH/lib64/fmpich2s.lib', '/c/Program Files/MPICH/lib64/mpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/fmpich2.lib', '/c/Program Files/MPICH/lib64/fmpich2g.lib', '/c/Program Files/MPICH/lib64/mpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/fmpich2.lib', '/c/Program Files/MPICH/lib64/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files/MPICH (while checking 
"Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/amd64/msmpifec.lib', '/c/Program Files/MPICH/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/i386/msmpifec.lib', '/c/Program Files/MPICH/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpi.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/libmpich.a']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/mpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/mpich.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/lib64/i386/msmpi.lib']") 
Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/fmpich2.lib', '/c/Program Files/MPICH/SDK.gcc/lib/fmpich2g.lib', '/c/Program Files/MPICH/SDK.gcc/lib/fmpich2s.lib', '/c/Program Files/MPICH/SDK.gcc/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/fmpich2.lib', '/c/Program Files/MPICH/SDK.gcc/lib/fmpich2g.lib', '/c/Program Files/MPICH/SDK.gcc/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/fmpich2.lib', '/c/Program Files/MPICH/SDK.gcc/lib/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc 
(while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/amd64/msmpifec.lib', '/c/Program Files/MPICH/SDK.gcc/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/i386/msmpifec.lib', '/c/Program Files/MPICH/SDK.gcc/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpi.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/libmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/mpich.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/fmpich2.lib', '/c/Program Files/MPICH/SDK.gcc/lib64/fmpich2g.lib', '/c/Program Files/MPICH/SDK.gcc/lib64/fmpich2s.lib', '/c/Program Files/MPICH/SDK.gcc/lib64/mpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/fmpich2.lib', '/c/Program Files/MPICH/SDK.gcc/lib64/fmpich2g.lib', '/c/Program Files/MPICH/SDK.gcc/lib64/mpi.lib']") Directory does not exist: /c/Program 
Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/fmpich2.lib', '/c/Program Files/MPICH/SDK.gcc/lib64/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program 
Files/MPICH/SDK.gcc/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/amd64/msmpifec.lib', '/c/Program Files/MPICH/SDK.gcc/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/i386/msmpifec.lib', '/c/Program Files/MPICH/SDK.gcc/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpi.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/libmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/mpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/mpich.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK.gcc/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK/lib/fmpich2.lib', '/c/Program Files/MPICH/SDK/lib/fmpich2g.lib', '/c/Program Files/MPICH/SDK/lib/fmpich2s.lib', '/c/Program Files/MPICH/SDK/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK/lib/fmpich2.lib', '/c/Program Files/MPICH/SDK/lib/fmpich2g.lib', '/c/Program Files/MPICH/SDK/lib/mpi.lib']") Directory does not exist: /c/Program Files/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK/lib/fmpich2.lib', '/c/Program Files/MPICH/SDK/lib/mpich2.lib']") Directory does not exist: /c/Program Files/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK/lib/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files/MPICH/SDK/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: 
[... several hundred near-identical configure.log lines condensed: each guessed MPI installation prefix below was probed under both lib/ and lib64/ for every library combination configure knows about, including MPICH (libmpich.a, libpmpich.a, libfmpich.a, fmpich2.lib, fmpich2g.lib, fmpich2s.lib, mpich2.lib, mpich.lib, mpi.lib), MS-MPI (msmpi.lib and msmpifec.lib under amd64/ and i386/), LAM/MPI (libmpi.a, liblam.a, liblammpio.a, liblammpi++.a, liblamf77mpi.a, liblamf90mpi.a, libpmpi.a, libfmpich2g.a) and auxiliary system libraries (libpthread.a, librt.a, libdl.a, libnsl.a, libsocket.a, libssl.a, libuuid.a, libaio.a, libsnl.a, libgm.a). Every check failed with the same message: ...]

Directory does not exist: /c/Program Files/MPICH/SDK (while checking "Package specific search directory MPI")
Directory does not exist: /c/Program Files (x86)/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI")
Directory does not exist: /c/Program Files (x86)/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI")
Directory does not exist: /c/Program Files (x86)/MPICH2 (while checking "Package specific search directory MPI")
Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI")
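For anyone puzzled by the sheer volume of this output: it does not by itself indicate an error. PETSc's Python-based configure walks a list of guessed installation prefixes and, for each prefix, tries every library combination it knows about; a prefix that does not exist is reported once per combination, which is what produces the wall of repeated messages. Below is a minimal sketch of that idea (illustrative only, not PETSc's actual BuildSystem code; candidate_prefixes and library_sets are made-up names holding a small sample of the values seen in the log):

import os
import itertools

# Guessed Windows installation prefixes (a small, hypothetical sample
# of the directories named in the log above).
candidate_prefixes = [
    '/c/Program Files/MPICH/SDK',
    '/c/Program Files (x86)/Microsoft HPC Pack 2008 SDK',
    '/c/Program Files (x86)/MPICH2',
]

# Candidate MPI library filename combinations (a small sample; the
# real table has dozens of entries).
library_sets = [
    ['libmpich.a', 'libpmpich.a'],
    ['liblammpi++.a', 'libmpi.a', 'liblam.a'],
    ['amd64/msmpi.lib'],
]

def probe(prefix):
    """Try every (libdir, combination) pair under one prefix."""
    for libdir, libs in itertools.product(['lib', 'lib64'], library_sets):
        if not os.path.isdir(prefix):
            # Reported once per combination, hence the repetition.
            print('Directory does not exist: %s (while checking '
                  '"Package specific search directory MPI" for "%s")'
                  % (prefix, libs))
            continue
        paths = [os.path.join(prefix, libdir, lib) for lib in libs]
        if all(os.path.isfile(p) for p in paths):
            return paths
    return None

for prefix in candidate_prefixes:
    found = probe(prefix)
    if found:
        print('Found MPI libraries:', found)
        break

In the real run none of the prefixes existed, so every combination produced a "Directory does not exist" line and the search simply moved on. The log excerpt resumes below.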
search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/amd64/msmpifec.lib', '/c/Program Files (x86)/MPICH/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/i386/msmpifec.lib', '/c/Program Files (x86)/MPICH/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/libmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/mpich.lib']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking 
"Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib/fmpich2g.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib/fmpich2s.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib/fmpich2g.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files 
(x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/amd64/msmpifec.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/i386/msmpifec.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/libmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/mpich.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib64/fmpich2g.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib64/fmpich2s.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib64/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking 
"Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib64/fmpich2g.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib64/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib64/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" 
for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/amd64/msmpifec.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/i386/msmpifec.lib', '/c/Program Files (x86)/MPICH/SDK.gcc/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/libmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/mpich.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK.gcc (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK.gcc/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK/lib/fmpich2g.lib', '/c/Program Files (x86)/MPICH/SDK/lib/fmpich2s.lib', '/c/Program Files (x86)/MPICH/SDK/lib/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK/lib/fmpich2g.lib', '/c/Program Files (x86)/MPICH/SDK/lib/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for 
"['/c/Program Files (x86)/MPICH/SDK/lib/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK/lib/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpi.a', 'liblam.a']") 
Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/amd64/msmpifec.lib', '/c/Program Files (x86)/MPICH/SDK/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/i386/msmpifec.lib', '/c/Program Files (x86)/MPICH/SDK/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/libmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/mpich.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK/lib64/fmpich2g.lib', '/c/Program Files (x86)/MPICH/SDK/lib64/fmpich2s.lib', '/c/Program Files (x86)/MPICH/SDK/lib64/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK/lib64/fmpich2g.lib', '/c/Program Files (x86)/MPICH/SDK/lib64/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/fmpich2.lib', '/c/Program Files (x86)/MPICH/SDK/lib64/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory 
MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/amd64/msmpifec.lib', '/c/Program Files (x86)/MPICH/SDK/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/i386/msmpifec.lib', '/c/Program Files (x86)/MPICH/SDK/lib64/i386/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search 
directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpi.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/libmpich.a']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/mpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/mpich2.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/mpich.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/amd64/msmpi.lib']") Directory does not exist: /c/Program Files (x86)/MPICH/SDK (while checking "Package specific search directory MPI" for "['/c/Program Files (x86)/MPICH/SDK/lib64/i386/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/fmpich2g.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/fmpich2s.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/fmpich2g.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/mpich2.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking 
"Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/amd64/msmpifec.lib', '/cygdrive/c/Program 
Files/Microsoft HPC Pack 2008 SDK/lib/amd64/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/i386/msmpifec.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/i386/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpich.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpi++.a', 'libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/libmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/mpich2.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/mpich.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/amd64/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib/i386/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/fmpich2g.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/fmpich2s.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/fmpich2g.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 
SDK/lib64/mpich2.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific 
search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/amd64/msmpifec.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/amd64/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/i386/msmpifec.lib', '/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/i386/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpich.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/libmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/mpich2.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/mpich.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/amd64/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft HPC Pack 2008 SDK/lib64/i386/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft 
Compute Cluster Pack/lib/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2g.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2s.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2g.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/mpich2.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'librt.a', 'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster 
Pack/lib/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/amd64/msmpifec.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/amd64/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/i386/msmpifec.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/i386/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpi++.a', 'libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/libmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft 
Compute Cluster Pack/lib/mpich2.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/mpich.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/amd64/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib/i386/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "[]") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2g.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2s.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2g.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/fmpich2.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpich2.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libfmpich2g.a', 'libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libfmpich.a', 'libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libpmpich.a', 'libmpich.a', 'libpmpich.a', 'libpmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'librt.a', 
'libaio.a', 'libsnl.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libssl.a', 'libuuid.a', 'libpthread.a', 'librt.a', 'libdl.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libnsl.a', 'libsocket.a', 'librt.a', 'libnsl.a', 'libsocket.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 'libgm.a', 'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblamf77mpi.a', 'libmpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpio.a', 'libpmpi.a', 'liblamf77mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpio.a', 'libpmpi.a', 'liblamf90mpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpio.a', 'libpmpi.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/liblammpi++.a', 'libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpi.a', 'liblam.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/amd64/msmpifec.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/amd64/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/i386/msmpifec.lib', '/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/i386/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a', 
'libpthread.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpi++.a', 'libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpi.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/libmpich.a']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpich2.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/mpich.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/amd64/msmpi.lib']") Directory does not exist: /cygdrive/c/Program Files/Microsoft Compute Cluster Pack (while checking "Package specific search directory MPI" for "['/cygdrive/c/Program Files/Microsoft Compute Cluster Pack/lib64/i386/msmpi.lib']") Checking for library in Package specific search directory MPI: [] Contents: ['bin', 'COPYRIGHT.rtf', 'examples', 'include', 'lib', 'README.winbin.rtf', 'setup.jpg'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library [] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 
/tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib sh: conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol MPI_Init referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2s.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] Contents: ['bin', 'COPYRIGHT.rtf', 'examples', 'include', 'lib', 'README.winbin.rtf', 'setup.jpg'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2s.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing 
-I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2s.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2s.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : fatal error LNK1104: cannot open file 'C:\PROGRA~1\MPICH2\lib\fmpich2s.lib' Possible ERROR while running linker: output: LINK : fatal error LNK1104: cannot open file 'C:\PROGRA~1\MPICH2\lib\fmpich2s.lib' ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2s.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
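The two failures above tell the story of this phase: linking against no library at all leaves MPI_Init unresolved (the LNK2019), and the first MPICH2 candidate list dies with LNK1104 because fmpich2s.lib is simply not shipped by this installation, so configure drops it and retries a shorter list. The probe itself is the classic BuildSystem/autoconf link test; the following is a stand-alone sketch reconstructed from the conftest source in the log, never meant to execute MPI correctly:

    /* Stand-alone version of configure's link probe, as shown in the log.
     * The deliberately bogus prototype exists only to make the linker
     * resolve the symbol; with no MPI library on the link line this
     * fails with exactly the LNK2019 recorded above. */
    char MPI_Init();

    int main() {
      MPI_Init();   /* never actually run; only the link step matters */
      return 0;
    }

Built the way the log does it (win32fe cl conftest.c ... Ws2_32.lib plus the candidate .lib files), a successful link means only that some library on the line exports the symbol, nothing more.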
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Popping language C Checking for library in Package specific search directory MPI: ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] Contents: ['bin', 'COPYRIGHT.rtf', 'examples', 'include', 'lib', 'README.winbin.rtf', 'setup.jpg'] ================================================================================ TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for function MPI_Init in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Init(); int main() { MPI_Init() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: Defined "HAVE_LIBFMPICH2" to "1" Defined "HAVE_LIBFMPICH2G" to "1" Defined "HAVE_LIBMPI" to "1" Popping language C Checking for function MPI_Comm_create in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
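Once the trimmed list (fmpich2.lib, fmpich2g.lib, mpi.lib) links cleanly, configure records the result as preprocessor macros. Assuming the usual PETSc convention that these land in the generated configuration header (petscconf.h; the destination header is not named in this log), the three "Defined ..." lines above amount to:

    /* Reconstruction of the macros recorded above; the names are taken
     * verbatim from the log, the destination header is assumed. */
    #define HAVE_LIBFMPICH2  1
    #define HAVE_LIBFMPICH2G 1
    #define HAVE_LIBMPI      1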
*/ char MPI_Comm_create(); int main() { MPI_Comm_create() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link Defined "HAVE_LIBFMPICH2" to "1" Defined "HAVE_LIBFMPICH2G" to "1" Defined "HAVE_LIBMPI" to "1" Popping language C Checking for headers Package specific search directory MPI: ['/cygdrive/c/Program Files/MPICH2/include'] Pushing language C ================================================================================ TEST checkInclude from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:86) TESTING: checkInclude from config.headers(config/BuildSystem/config/headers.py:86) Checks if a particular include file can be found along particular include paths Checking for header files ['mpi.h'] in ['/cygdrive/c/Program Files/MPICH2/include'] Checking include with compiler flags var CPPFLAGS ['/cygdrive/c/Program Files/MPICH2/include'] sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.headers -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.headers/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.headers -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.headers/conftest.c sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 143 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 147 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 151 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 155 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 159 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 163 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 167 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 171 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 175 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 179 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 183 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 187 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 191 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 195 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 199 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 203 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 207 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 211 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 215 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 219 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 223 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 227 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 231 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 235 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 239 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 243 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 247 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 251 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 255 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 259 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 263 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 267 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 271 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" 
#line 275 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 279 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 283 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 287 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 291 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 295 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 299 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 303 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 307 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 311 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 315 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 319 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 323 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 327 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 331 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 335 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 339 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 343 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 347 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 351 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 355 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 359 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 363 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 367 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 371 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 375 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 379 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 383 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 387 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 391 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 395 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 399 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 403 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 407 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 411 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 415 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 419 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 423 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 427 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 431 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 435 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 439 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 443 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 447 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 451 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 455 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 459 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 463 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 465 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" typedef int int32_t; typedef int mode_t; typedef int pid_t; #line 14 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 16 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "C:\\PROGRA~1\\MPICH2\\include\\mpi.h" #line 16 "C:\\PROGRA~1\\MPICH2\\include\\mpi.h" typedef int MPI_Datatype; typedef int MPI_Comm; typedef int MPI_Group; typedef int MPI_Win; typedef struct ADIOI_FileD *MPI_File; typedef int MPI_Op; extern int MPICH_ATTR_FAILED_PROCESSES; typedef enum MPIR_Topo_type { MPI_GRAPH=1, MPI_CART=2, MPI_DIST_GRAPH=3 } MPIR_Topo_type; typedef void (MPI_Handler_function) ( MPI_Comm *, int *, ... ); typedef int (MPI_Comm_copy_attr_function)(MPI_Comm, int, void *, void *, void *, int *); typedef int (MPI_Comm_delete_attr_function)(MPI_Comm, int, void *, void *); typedef int (MPI_Type_copy_attr_function)(MPI_Datatype, int, void *, void *, void *, int *); typedef int (MPI_Type_delete_attr_function)(MPI_Datatype, int, void *, void *); typedef int (MPI_Win_copy_attr_function)(MPI_Win, int, void *, void *, void *, int *); typedef int (MPI_Win_delete_attr_function)(MPI_Win, int, void *, void *); typedef void (MPI_Comm_errhandler_function)(MPI_Comm *, int *, ...); typedef void (MPI_File_errhandler_function)(MPI_File *, int *, ...); typedef void (MPI_Win_errhandler_function)(MPI_Win *, int *, ...); typedef MPI_Comm_errhandler_function MPI_Comm_errhandler_fn; typedef MPI_File_errhandler_function MPI_File_errhandler_fn; typedef MPI_Win_errhandler_function MPI_Win_errhandler_fn; typedef int MPI_Errhandler; typedef int MPI_Request; typedef void (MPI_User_function) ( void *, void *, int *, MPI_Datatype * ); typedef int (MPI_Copy_function) ( MPI_Comm, int, void *, void *, void *, int * ); typedef int (MPI_Delete_function) ( MPI_Comm, int, void *, void * ); enum MPIR_Combiner_enum { MPI_COMBINER_NAMED = 1, MPI_COMBINER_DUP = 2, MPI_COMBINER_CONTIGUOUS = 3, MPI_COMBINER_VECTOR = 4, MPI_COMBINER_HVECTOR_INTEGER = 5, MPI_COMBINER_HVECTOR = 6, MPI_COMBINER_INDEXED = 7, MPI_COMBINER_HINDEXED_INTEGER = 8, MPI_COMBINER_HINDEXED = 9, MPI_COMBINER_INDEXED_BLOCK = 10, MPI_COMBINER_STRUCT_INTEGER = 11, MPI_COMBINER_STRUCT = 12, MPI_COMBINER_SUBARRAY = 13, MPI_COMBINER_DARRAY = 14, MPI_COMBINER_F90_REAL = 15, MPI_COMBINER_F90_COMPLEX = 16, MPI_COMBINER_F90_INTEGER = 17, MPI_COMBINER_RESIZED = 18 }; typedef int MPI_Info; #line 378 "C:\\PROGRA~1\\MPICH2\\include\\mpi.h" #line 380 "C:\\PROGRA~1\\MPICH2\\include\\mpi.h" typedef __int64 MPI_Aint; typedef int MPI_Fint; #line 398 "C:\\PROGRA~1\\MPICH2\\include\\mpi.h" typedef __int64 MPI_Offset; #line 400 "C:\\PROGRA~1\\MPICH2\\include\\mpi.h" typedef struct MPI_Status { int count; int cancelled; int MPI_SOURCE; int MPI_TAG; int MPI_ERROR; } MPI_Status; #line 462 "C:\\PROGRA~1\\MPICH2\\include\\mpi.h" #line 463 "C:\\PROGRA~1\\MPICH2\\include\\mpi.h" extern __declspec(dllimport) MPI_Fint * MPI_F_STATUS_IGNORE; extern __declspec(dllimport) MPI_Fint * MPI_F_STATUSES_IGNORE; typedef int (MPI_Grequest_cancel_function)(void *, int); typedef int (MPI_Grequest_free_function)(void *); typedef int (MPI_Grequest_query_function)(void *, MPI_Status *); Found header files ['mpi.h'] in ['/cygdrive/c/Program Files/MPICH2/include'] Popping language C ================================================================================ TEST configureConversion from 
config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:214) TESTING: configureConversion from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:214) Check for the functions which convert communicators between C and Fortran - Define HAVE_MPI_COMM_F2C and HAVE_MPI_COMM_C2F if they are present - Some older MPI 1 implementations are missing these All intermediate test results are stored in /tmp/petsc-1nzsmm/config.packages.MPI sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.packages.MPI -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.packages.MPI -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { if (MPI_Comm_f2c((MPI_Fint)0)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: Defined "HAVE_MPI_COMM_F2C" to "1" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers 
-I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { if (MPI_Comm_c2f(MPI_COMM_WORLD)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.MPI\conftest.exe not found or not built by the last incremental link; performing full link Defined "HAVE_MPI_COMM_C2F" to "1" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_Fint a; ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib 
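configureConversion is probing the MPI-2 handle-conversion routines. Note that the angle-bracket include lines (e.g. #include <mpi.h>) have been eaten throughout this archived log, apparently treated as HTML tags, which is why every conftest source above shows a bare "#include" token. A round-trip sketch of the two calls just detected, using only standard MPI-2 API:

    #include <mpi.h>

    /* Sketch of the conversion pair behind HAVE_MPI_COMM_F2C/C2F:
     * MPI_Comm_c2f yields the integer handle Fortran uses,
     * MPI_Comm_f2c maps it back to a C handle. */
    int main(int argc, char **argv)
    {
      MPI_Fint fcomm;
      MPI_Comm ccomm;
      int      result;
      MPI_Init(&argc, &argv);
      fcomm = MPI_Comm_c2f(MPI_COMM_WORLD);
      ccomm = MPI_Comm_f2c(fcomm);
      MPI_Comm_compare(MPI_COMM_WORLD, ccomm, &result);  /* expect MPI_IDENT */
      MPI_Finalize();
      return 0;
    }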
/cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.MPI\conftest.exe not found or not built by the last incremental link; performing full link Defined "HAVE_MPI_FINT" to "1" ================================================================================ TEST configureMPI2 from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:182) TESTING: configureMPI2 from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:182) Check for functions added to the interface in MPI-2 sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int flag;if (MPI_Finalized(&flag)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.MPI\conftest.exe not found or not built by the last incremental link; performing full link Defined "HAVE_MPI_FINALIZED" to "1" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c Executing: 
/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { if (MPI_Allreduce(MPI_IN_PLACE,0, 1, MPI_INT, MPI_SUM, MPI_COMM_SELF)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.MPI\conftest.exe not found or not built by the last incremental link; performing full link Defined "HAVE_MPI_IN_PLACE" to "1" sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int count=2; int blocklens[2]={0,1}; MPI_Aint indices[2]={0,1}; MPI_Datatype old_types[2]={0,1}; MPI_Datatype *newtype = 0; if (MPI_Type_create_struct(count, blocklens, indices, old_types, newtype)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o 
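HAVE_MPI_IN_PLACE, just defined above, is worth having: it lets a reduction reuse the caller's buffer instead of allocating a shadow copy. A minimal sketch of the MPI-2 feature the conftest verified:

    #include <stdio.h>
    #include <mpi.h>

    /* The send and receive buffer are the same location;
     * MPI_IN_PLACE tells the reduction not to need a second one. */
    int main(int argc, char **argv)
    {
      int rank, sum;
      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      sum = rank;
      MPI_Allreduce(MPI_IN_PLACE, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
      if (rank == 0) printf("sum of ranks = %d\n", sum);
      MPI_Finalize();
      return 0;
    }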
/cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.MPI\conftest.exe not found or not built by the last incremental link; performing full link sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_Comm_errhandler_fn * p_err_fun = 0; MPI_Errhandler * p_errhandler = 0; if (MPI_Comm_create_errhandler(p_err_fun,p_errhandler)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.MPI\conftest.exe not found or not built by the last incremental link; performing full link sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI 
-MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { if (MPI_Comm_set_errhandler(MPI_COMM_WORLD,MPI_ERRORS_RETURN)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.MPI\conftest.exe not found or not built by the last incremental link; performing full link ================================================================================ TEST configureTypes from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:234) TESTING: configureTypes from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:234) Checking for MPI types Checking for size of type: MPI_Comm Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.types/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.types/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include 
#include #include #endif #define MPICH_IGNORE_CXX_SEEK #define MPICH_SKIP_MPICXX 1 #define OMPI_SKIP_MPICXX 1 #include int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(MPI_Comm)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: /tmp/petsc-1nzsmm/config.types/conftest.exe Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: Popping language C Defined "SIZEOF_MPI_COMM" to "4" Checking for size of type: MPI_Fint Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.types/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.types/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.types/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif #define MPICH_IGNORE_CXX_SEEK #define MPICH_SKIP_MPICXX 1 #define OMPI_SKIP_MPICXX 1 #include int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(MPI_Fint)); ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.types/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.types/conftest.o Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.TYP\conftest.exe not found or not built by the last incremental link; performing full link Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: /tmp/petsc-1nzsmm/config.types/conftest.exe Executing: /tmp/petsc-1nzsmm/config.types/conftest.exe sh: Popping language C Defined "SIZEOF_MPI_FINT" to "4" ================================================================================ TEST configureMPITypes from 
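Both handle sizes come out as 4 bytes, consistent with the "typedef int MPI_Comm" and "typedef int MPI_Fint" visible in the preprocessed mpi.h excerpt earlier. Since configure cannot read sizes out of headers directly, it compiles, links, and runs a throwaway program that writes the value to the file conftestval and reads it back; a stand-alone equivalent of that probe (a sketch, with the stripped includes restored):

    #include <stdio.h>
    #include <mpi.h>

    /* Equivalent of the "conftestval" probe: report the handle sizes
     * configure recorded as SIZEOF_MPI_COMM and SIZEOF_MPI_FINT. */
    int main(void)
    {
      printf("sizeof(MPI_Comm) = %lu\n", (unsigned long)sizeof(MPI_Comm));
      printf("sizeof(MPI_Fint) = %lu\n", (unsigned long)sizeof(MPI_Fint));
      return 0;
    }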
config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:246)
TESTING: configureMPITypes from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:246)
  Checking for MPI Datatype handles
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.libraries -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c
Successful compile: Source:

    #include "confdefs.h"
    #include "conffix.h"
    #ifdef PETSC_HAVE_STDLIB_H
    #include <stdlib.h>
    #endif
    #include <mpi.h>
    int main() {
      MPI_Aint size;
      int ierr;
      MPI_Init(0,0);
      ierr = MPI_Type_extent(MPI_LONG_DOUBLE, &size);
      if (ierr || (size == 0)) exit(1);
      MPI_Finalize();
      return 0;
    }

Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib
LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.MPI\conftest.exe not found or not built by the last incremental link; performing full link
Executing: /tmp/petsc-1nzsmm/config.packages.MPI/conftest.exe
Defined "HAVE_MPI_LONG_DOUBLE" to "1"
The identical probe with MPI_C_DOUBLE_COMPLEX in place of MPI_LONG_DOUBLE also compiles, links, and runs:
Defined "HAVE_MPI_C_DOUBLE_COMPLEX" to "1"
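For readers skimming the log: each datatype-handle probe is a complete program that configure compiles, links against the MPICH2 import libraries, and actually runs; a nonzero exit status means the optional datatype is unusable. A standalone equivalent, buildable by hand, would look roughly like this (a sketch; the real conftest also pulls in configure's generated confdefs.h/conffix.h, and MPI_Type_extent is the deprecated MPI-1 spelling that PETSc 3.4's configure still calls here):

    /* Standalone sketch of the datatype-handle probe. Exit status 0 means
       the optional datatype exists and reports a nonzero extent. */
    #include <stdlib.h>
    #include <mpi.h>
    int main(int argc, char **argv)
    {
      MPI_Aint size;
      int      ierr;
      MPI_Init(&argc, &argv);
      ierr = MPI_Type_extent(MPI_LONG_DOUBLE, &size); /* deprecated MPI-1 call, as in the log */
      if (ierr || size == 0) exit(1);
      MPI_Finalize();
      return 0;
    }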
================================================================================
TEST configureMissingPrototypes from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:319)
TESTING: configureMissingPrototypes from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:319)
  Checks for missing prototypes, which it adds to petscfix.h
================================================================================
TEST SGIMPICheck from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:630)
TESTING: SGIMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:630)
  Returns true if SGI MPI is used
Checking for function MPI_SGI_barrier in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] []
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Successful compile: Source:

    #include "confdefs.h"
    #include "conffix.h"
    /* Override any gcc2 internal prototype to avoid an error. */
    char MPI_SGI_barrier();
    int main() {
      MPI_SGI_barrier();
      return 0;
    }

Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib
LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol MPI_SGI_barrier referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
Possible ERROR while running linker: ret = 512
SGI MPI test failure
================================================================================
TEST CxxMPICheck from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:640)
TESTING: CxxMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:640)
  Make sure C++ can compile and link
Checking for header mpi.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.libraries -MT -GR -EHsc -Z7 -Zm200 -TP -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.libraries/conftest.cc
Successful compile: Source:

    #include "confdefs.h"
    #include "conffix.h"
    #include <mpi.h>
    int main() {
      return 0;
    }

Checking for C++ MPI_Finalize()
Checking for function MPI_Finalize in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] []
Successful compile: Source:

    #include "confdefs.h"
    #include "conffix.h"
    #include <mpi.h>
    int main() {
      int ierr;
      ierr = MPI_Finalize();
      return 0;
    }

Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -GR -EHsc -Z7 -Zm200 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib
Defined "HAVE_LIBFMPICH2" to "1"
Defined "HAVE_LIBFMPICH2G" to "1"
Defined "HAVE_LIBMPI" to "1"
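The LNK2019 in the SGI check is not a build problem: configure deliberately links a program referencing MPI_SGI_barrier, a symbol only SGI's MPI (MPT) is expected to export, and takes the unresolved external as "this is not SGI MPI" before moving on. The same pattern drives every "Checking for function ..." probe in this log; a sketch of it:

    /* Symbol-existence probe: declare a dummy prototype so the object
       file references the name, then let the link step decide. The
       declaration is intentionally wrong as an MPI prototype; only the
       symbol name matters, since no header is included. */
    char MPI_SGI_barrier();
    int main() {
      MPI_SGI_barrier();
      return 0;
    }

If the MPI libraries resolve the symbol, the link succeeds and configure defines the corresponding HAVE_ macro; if not, the LNK2019/LNK1120 pair appears exactly as it does here and configure records the feature as absent.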
================================================================================
TEST FortranMPICheck from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:658)
TESTING: FortranMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:658)
  Make sure fortran include [mpif.h] and library symbols are found
Checking for header mpif.h
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.libraries -MT -Z7 -fpp -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.libraries/conftest.F
Successful compile: Source:

    program main
    include 'mpif.h'
    end

Checking for fortran mpi_init()
Checking for function in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] []
Successful compile: Source:

    program main
    include 'mpif.h'
    integer ierr
    call mpi_init(ierr)
    end

Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -Z7 -fpp /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib
LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
Defined "HAVE_LIBFMPICH2" to "1"
Defined "HAVE_LIBFMPICH2G" to "1"
Defined "HAVE_LIBMPI" to "1"
Checking for mpi.mod
Successful compile: Source:

    program main
    use mpi
    integer ierr,rank
    call mpi_init(ierr)
    call mpi_comm_rank(MPI_COMM_WORLD,rank,ierr)
    end

The same ifort link line succeeds, so the Fortran 90 module is usable:
Defined "HAVE_MPI_F90MODULE" to "1"
================================================================================
TEST configureIO from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:683)
TESTING: configureIO from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:683)
  Check for the functions in MPI/IO - Define HAVE_MPIIO if they are present - Some older MPI 1 implementations are missing these
Each MPI/IO probe is compiled with the same win32fe cl command line used for the datatype probes (plus -I/tmp/petsc-1nzsmm/config.libraries) and linked against the same MPICH2 libraries; all of them succeed. The C4700 warnings about uninitialized locals ('fh', 'buf', 'disp', 'info') are expected, since these programs only need to compile and link, never to run correctly. Each probe source includes <mpi.h> and exercises one call:

    if (MPI_Type_get_extent(MPI_INT, &lb, &extent));                /* MPI_Aint lb, extent;              */
    if (MPI_File_write_all(fh, buf, 1, MPI_INT, &status));          /* MPI_File fh; void *buf; MPI_Status status; */
    if (MPI_File_read_all(fh, buf, 1, MPI_INT, &status));
    if (MPI_File_set_view(fh, disp, MPI_INT, MPI_INT, "", info));   /* MPI_Offset disp; MPI_Info info;   */
    if (MPI_File_open(MPI_COMM_SELF, "", 0, info, &fh));
    if (MPI_File_close(&fh));

Defined "HAVE_MPIIO" to "1"
================================================================================
TEST findMPIInc from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:720)
TESTING: findMPIInc from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:720)
  Find MPI include paths from "mpicc -show"
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -show
cl : Command line warning D9002 : ignoring unknown option '-show'
cl : Command line error D8003 : missing source filename
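The '-show' failure just above is harmless: win32fe cl is a plain compiler front end, not an mpicc wrapper, so it cannot echo MPI include paths, and configure is already passing the MPICH2 include directory explicitly (-I/cygdrive/c/Program\ Files/MPICH2/include). With HAVE_MPIIO now defined, the functions found above support parallel file I/O; a minimal round trip using them looks like this (a sketch with error handling omitted; 'io.dat' is an arbitrary file name, not anything from the log):

    /* Each rank writes its own int at offset rank*sizeof(int) using the
       MPI-IO calls configure just detected. */
    #include <mpi.h>
    int main(int argc, char **argv)
    {
      MPI_File   fh;
      MPI_Status status;
      int        rank, val;
      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      val = rank;
      MPI_File_open(MPI_COMM_WORLD, "io.dat", MPI_MODE_CREATE | MPI_MODE_RDWR, MPI_INFO_NULL, &fh);
      MPI_File_set_view(fh, (MPI_Offset)(rank * sizeof(int)), MPI_INT, MPI_INT, "native", MPI_INFO_NULL);
      MPI_File_write_all(fh, &val, 1, MPI_INT, &status);  /* collective write */
      MPI_File_close(&fh);
      MPI_Finalize();
      return 0;
    }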
The remaining probes in this section all use the bare-symbol pattern shown in the MPI_SGI_barrier check (char MPI_Xxx(); called from main()), compiled and linked with the same commands against ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib']; each successful link also re-defines "HAVE_LIBFMPICH2", "HAVE_LIBFMPICH2G" and "HAVE_LIBMPI" to "1".
Checking for function MPI_Alltoallw: link succeeds
Checking for function MPI_Type_create_indexed_block: link succeeds
Defined "HAVE_MPI_ALLTOALLW" to "1"
Checking for function MPI_Win_create: link succeeds
Defined "HAVE_MPI_WIN_CREATE" to "1"
Defined "HAVE_MPI_REPLACE" to "1"
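HAVE_MPI_WIN_CREATE together with HAVE_MPI_REPLACE indicates the MPI-2 one-sided (RMA) API is present, which PETSc can optionally use for one-sided communication variants. A minimal sketch of the calls behind these two defines (generic MPI-2 usage, not PETSc code):

    /* Expose a buffer in a window, then replace rank 0's copy with a
       one-sided accumulate; fences delimit the RMA epoch. */
    #include <mpi.h>
    int main(int argc, char **argv)
    {
      int     buf = 0, one = 1;
      MPI_Win win;
      MPI_Init(&argc, &argv);
      MPI_Win_create(&buf, (MPI_Aint)sizeof(int), sizeof(int), MPI_INFO_NULL, MPI_COMM_WORLD, &win);
      MPI_Win_fence(0, win);
      MPI_Accumulate(&one, 1, MPI_INT, 0, 0, 1, MPI_INT, MPI_REPLACE, win); /* the op configure tested for */
      MPI_Win_fence(0, win);
      MPI_Win_free(&win);
      MPI_Finalize();
      return 0;
    }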
Checking for function MPI_Comm_spawn: link succeeds
Defined "HAVE_MPI_COMM_SPAWN" to "1"
Checking for function MPI_Type_get_envelope: link succeeds
Defined "HAVE_MPI_TYPE_GET_ENVELOPE" to "1"
Checking for function MPI_Type_get_extent: link succeeds
Defined "HAVE_MPI_TYPE_GET_EXTENT" to "1"
Checking for function MPI_Type_dup: link succeeds
Defined "HAVE_MPI_TYPE_DUP" to "1"
Checking for function MPI_Init_thread: link succeeds
Defined "HAVE_MPI_INIT_THREAD" to "1"
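HAVE_MPI_INIT_THREAD only guarantees that the symbol links; it says nothing about which threading level the library will actually grant at runtime. A minimal real use of the call looks like this (a sketch; MPI_THREAD_FUNNELED is just an example request level):

    #include <stdio.h>
    #include <mpi.h>
    int main(int argc, char **argv)
    {
      int provided;
      /* Request a threading level instead of plain MPI_Init, then check
         what the library actually provides. */
      MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
      if (provided < MPI_THREAD_FUNNELED)
        printf("library granted a lower thread level: %d\n", provided);
      MPI_Finalize();
      return 0;
    }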
*/ char MPIX_Iallreduce(); int main() { MPIX_Iallreduce() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol MPIX_Iallreduce referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link conftest.obj : error LNK2019: unresolved external symbol MPIX_Iallreduce referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPIX_Iallreduce(); int main() { MPIX_Iallreduce() ; return 0; } Popping language C Checking for function MPI_Iallreduce in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] [] Pushing language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c sh: conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Iallreduce(); int main() { MPI_Iallreduce() ; return 0; } Pushing language C Popping language C sh: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib sh: conftest.obj : error LNK2019: unresolved external symbol MPI_Iallreduce referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals Possible ERROR while running linker: output: conftest.obj : error LNK2019: unresolved external symbol MPI_Iallreduce referenced in function main C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals ret = 512 Pushing language C Popping language C in /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Ws2_32.lib Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
Checking for function MPI_Ibarrier in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] []
Pushing language C
Executing: [same cl -c compile command as above]
sh: conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPI_Ibarrier();
int main() { MPI_Ibarrier(); return 0; }
Pushing language C
Popping language C
Executing: [same link command as above]
Possible ERROR while running linker: output:
conftest.obj : error LNK2019: unresolved external symbol MPI_Ibarrier referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Popping language C

Checking for function MPI_Finalized in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] []
Pushing language C
Executing: [same cl -c compile command as above]
sh: conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPI_Finalized();
int main() { MPI_Finalized(); return 0; }
Pushing language C
Popping language C
Executing: [same link command as above]
Defined "HAVE_LIBFMPICH2" to "1"
Defined "HAVE_LIBFMPICH2G" to "1"
Defined "HAVE_LIBMPI" to "1"
Popping language C
Defined "HAVE_MPI_FINALIZED" to "1"
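MPI_Finalized, by contrast, links fine, so configure defines HAVE_MPI_FINALIZED. A minimal sketch of why that capability matters (illustrative code, not PETSc source): cleanup paths that may run after MPI_Finalize() can ask MPI whether it has already been shut down before making any further MPI calls, since MPI_Finalized() is one of the few routines that stays callable after finalization.

    #include <mpi.h>

    /* Illustrative guard around late cleanup. */
    static void cleanup(void)
    {
      int finalized = 0;
      MPI_Finalized(&finalized);                    /* safe even after MPI_Finalize */
      if (!finalized) MPI_Barrier(MPI_COMM_WORLD);  /* any real MPI call must be guarded */
    }

    int main(int argc, char **argv)
    {
      MPI_Init(&argc, &argv);
      cleanup();        /* MPI still live: the barrier runs */
      MPI_Finalize();
      cleanup();        /* after finalize: the guard skips the MPI call */
      return 0;
    }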
Checking for function MPI_Exscan in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] []
Pushing language C
Executing: [same cl -c compile command as above]
sh: conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPI_Exscan();
int main() { MPI_Exscan(); return 0; }
Pushing language C
Popping language C
Executing: [same link command as above]
sh: LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
Defined "HAVE_LIBFMPICH2" to "1"
Defined "HAVE_LIBFMPICH2G" to "1"
Defined "HAVE_LIBMPI" to "1"
Popping language C
Defined "HAVE_MPI_EXSCAN" to "1"

Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o [same include flags as above] -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c
sh: conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
#include <mpi.h>
int main() { int combiner = MPI_COMBINER_DUP;; return 0; }
Defined "HAVE_MPI_COMBINER_DUP" to "1"

Checking for function MPIDI_CH3I_sock_set in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] []
Pushing language C
Executing: [same cl -c compile command as above]
sh: conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPIDI_CH3I_sock_set();
int main() { MPIDI_CH3I_sock_set(); return 0; }
Pushing language C
Popping language C
Executing: [same link command as above]
Possible ERROR while running linker: output:
LINK : C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe not found or not built by the last incremental link; performing full link
conftest.obj : error LNK2019: unresolved external symbol MPIDI_CH3I_sock_set referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Popping language C

Checking for function MPIDI_CH3I_sock_fixed_nbc_progress in library ['/cygdrive/c/Program Files/MPICH2/lib/fmpich2.lib', '/cygdrive/c/Program Files/MPICH2/lib/fmpich2g.lib', '/cygdrive/c/Program Files/MPICH2/lib/mpi.lib'] []
Pushing language C
Executing: [same cl -c compile command as above]
sh: conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char MPIDI_CH3I_sock_fixed_nbc_progress();
int main() { MPIDI_CH3I_sock_fixed_nbc_progress(); return 0; }
Pushing language C
Popping language C
Executing: [same link command as above]
Possible ERROR while running linker: output:
conftest.obj : error LNK2019: unresolved external symbol MPIDI_CH3I_sock_fixed_nbc_progress referenced in function main
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1120: 1 unresolved externals
ret = 512
Popping language C
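HAVE_MPI_EXSCAN is now defined as well. MPI_Exscan is the exclusive prefix reduction; a typical use (illustrative, not taken from this log) is computing each rank's starting offset in a global numbering:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
      int rank, nlocal, offset = 0;
      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      nlocal = 100;                     /* stand-in for a local problem size */
      MPI_Exscan(&nlocal, &offset, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
      if (rank == 0) offset = 0;        /* the standard leaves rank 0's result undefined */
      printf("rank %d owns [%d, %d)\n", rank, offset, offset + nlocal);
      MPI_Finalize();
      return 0;
    }

The two MPIDI_CH3I_* failures above involve MPICH-internal symbols (apparently from the ch3:sock channel); their absence only means this MPICH2 build does not expose them, and configure treats these as optional feature probes.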
================================================================================
TEST checkSharedLibrary from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:132)
TESTING: checkSharedLibrary from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:132)
  Sets flag indicating if MPI libraries are shared or not and determines if MPI libraries CANNOT be used by shared libraries
================================================================================
TEST configureMPIEXEC from config.packages.MPI(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/packages/MPI.py:145)
TESTING: configureMPIEXEC from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:145)
  Checking for mpiexec
Pushing language C
Popping language C
Checking for program /cygdrive/c/Program Files/MPICH2/bin/mpiexec...found
Defined make macro "MPIEXEC" to "/cygdrive/c/Program\ Files/MPICH2/bin/mpiexec"
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o [same include flags as above] -MT -wd4996 -Z7 -I/cygdrive/c/Program\ Files/MPICH2/include /tmp/petsc-1nzsmm/config.packages.MPI/conftest.c
sh: conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
#include <mpi.h>
#ifdef __cplusplus
extern "C"
#endif
int init(int argc, char *argv[]) {
  int isInitialized;
  MPI_Init(&argc, &argv);
  MPI_Initialized(&isInitialized);
  return (int) isInitialized;
}
Pushing language C
Popping language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe lib -a /tmp/petsc-1nzsmm/config.packages.MPI/libconftest.lib /tmp/petsc-1nzsmm/config.packages.MPI/conftest.o ; /usr/bin/true /tmp/petsc-1nzsmm/config.packages.MPI/libconftest.lib
Executing: [same cl -c compile command as above]
sh: conftest.c
Successful compile: Source:
#include "confdefs.h"
#include "conffix.h"
#include <mpi.h>
#ifdef __cplusplus
extern "C"
#endif
int checkInit(void) {
  int isInitialized;
  MPI_Initialized(&isInitialized);
  if (isInitialized) MPI_Finalize();
  return (int) isInitialized;
}
Pushing language C
Popping language C
Executing: [same lib -a archive command as above]
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o [same include flags as above] -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
sh: conftest.c
Possible ERROR while running compiler:
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(19) : error C2065: 'RTLD_LAZY' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(19) : warning C4047: '=' : 'void *' differs in levels of indirection from 'int'
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(33) : error C2065: 'RTLD_LAZY' : undeclared identifier
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.c(33) : warning C4047: '=' : 'void *' differs in levels of indirection from 'int'
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdio.h>
#include <stdlib.h>
#ifdef PETSC_HAVE_DLFCN_H
#include <dlfcn.h>
#endif
int main() {
  int argc = 1;
  char *argv[2] = {(char *) "conftest", NULL};
  void *lib;
  int (*init)(int, char **);
  int (*checkInit)(void);

  lib = dlopen("/tmp/petsc-1nzsmm/config.libraries/lib1.lib", RTLD_LAZY);
  if (!lib) { fprintf(stderr, "Could not open lib1.so: %s\n", dlerror()); exit(1); }
  init = (int (*)(int, char **)) dlsym(lib, "init");
  if (!init) { fprintf(stderr, "Could not find initialization function\n"); exit(1); }
  if (!(*init)(argc, argv)) { fprintf(stderr, "Could not initialize library\n"); exit(1); }
  lib = dlopen("/tmp/petsc-1nzsmm/config.libraries/lib2.lib", RTLD_LAZY);
  if (!lib) { fprintf(stderr, "Could not open lib2.so: %s\n", dlerror()); exit(1); }
  checkInit = (int (*)(void)) dlsym(lib, "checkInit");
  if (!checkInit) { fprintf(stderr, "Could not find initialization check function\n"); exit(1); }
  if (!(*checkInit)()) { fprintf(stderr, "Did not link with shared library\n"); exit(2); }
  return 0;
}
Compile failed inside link
Library was not shared
Popping language C
================================================================================
TEST alternateConfigureLibrary from config.packages.hdf5(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py:522)
TESTING: alternateConfigureLibrary from config.packages.hdf5(config/BuildSystem/config/package.py:522)
  Called if --with-packagename=0; does nothing by default
[the same alternateConfigureLibrary test repeats, with identical output, for config.packages.netcdf, netcdf-cxx, MOAB, and exodusii]
Pushing language C
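The shared-library test builds the two conftest libraries around init()/checkInit() shown above and then dlopen()s them from a driver: if both handles observe the same MPI_Initialized state, the MPI libraries must be shared. The driver fails to compile here because RTLD_LAZY lives in the POSIX header dlfcn.h, which MSVC does not provide (PETSC_HAVE_DLFCN_H is unset, so the #include is skipped), and configure conservatively records "Library was not shared". A POSIX-only sketch of the same idea, with hypothetical library paths:

    #include <stdio.h>
    #include <dlfcn.h>

    int main(void)
    {
      /* Load two libraries that were each linked against MPI. */
      void *lib1 = dlopen("./lib1.so", RTLD_LAZY);   /* hypothetical paths */
      void *lib2 = dlopen("./lib2.so", RTLD_LAZY);
      if (!lib1 || !lib2) { fprintf(stderr, "dlopen: %s\n", dlerror()); return 1; }

      int (*init)(int, char **) = (int (*)(int, char **)) dlsym(lib1, "init");
      int (*checkInit)(void)    = (int (*)(void)) dlsym(lib2, "checkInit");
      if (!init || !checkInit) { fprintf(stderr, "dlsym failed\n"); return 1; }

      char *argv[] = { (char *) "conftest", NULL };
      if (!init(1, argv)) { fprintf(stderr, "could not initialize MPI\n"); return 1; }
      /* Only if MPI state is shared between the two handles does checkInit see it. */
      if (!checkInit())   { fprintf(stderr, "did not link with shared library\n"); return 2; }
      return 0;
    }

On Windows the analogous probe would need LoadLibrary/GetProcAddress instead of dlopen/dlsym; the net effect here is simply that MPI is treated as non-shared, which is harmless for a static PETSc build.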
================================================================================
TEST configureLibrary from PETSc.packages.valgrind(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py:464)
TESTING: configureLibrary from PETSc.packages.valgrind(config/BuildSystem/config/package.py:464)
  Find an installation and check if it can work with PETSc
================================================================================
Checking for a functional valgrind
Not checking for library in Package specific search directory VALGRIND: [] because no functions given to check for
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
  - libDir may be a list of directories
  - libName may be a list of library names
Checking for headers Package specific search directory VALGRIND: []
Pushing language C
================================================================================
TEST checkInclude from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:86)
TESTING: checkInclude from config.headers(config/BuildSystem/config/headers.py:86)
  Checks if a particular include file can be found along particular include paths
Checking for header files ['valgrind/valgrind.h'] in []
Checking include with compiler flags var CPPFLAGS []
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.headers /tmp/petsc-1nzsmm/config.headers/conftest.c
sh: conftest.c
[several hundred #line directives echoed by the preprocessor for confdefs.h and conffix.h omitted; conffix.h contributes: typedef int int32_t; typedef int mode_t; typedef int pid_t;]
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'valgrind/valgrind.h': No such file or directory
Possible ERROR while running preprocessor: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <valgrind/valgrind.h>
Popping language C
Not checking for library in Package specific search directory VALGRIND: [] because no functions given to check for
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  [same docstring as above]
Checking for headers Package specific search directory VALGRIND: []
Pushing language C
================================================================================
TEST checkInclude from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:86) [as above]
Checking for header files ['valgrind/valgrind.h'] in []
Checking include with compiler flags var CPPFLAGS []
Executing: [same cl -E command as above]
sh: conftest.c
[#line directive dump omitted, identical to the previous run]
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'valgrind/valgrind.h': No such file or directory
Possible ERROR while running preprocessor: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <valgrind/valgrind.h>
Popping language C
Not checking for library in Package specific search directory VALGRIND: [] because no functions given to check for
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  [same docstring as above]
Checking for headers Package specific search directory VALGRIND: ['/usr/local/include']
Pushing language C
================================================================================
TEST checkInclude from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:86) [as above]
Checking for header files ['valgrind/valgrind.h'] in ['/usr/local/include']
Checking include with compiler flags var CPPFLAGS ['/usr/local/include']
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.headers -I/usr/local/include /tmp/petsc-1nzsmm/config.headers/conftest.c
sh: conftest.c
[#line directive dump omitted, identical to the previous runs]
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'valgrind/valgrind.h': No such file or directory
Possible ERROR while running preprocessor: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <valgrind/valgrind.h>
Popping language C
Not checking for library in Package specific search directory VALGRIND: [] because no functions given to check for
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  [same docstring as above]
Checking for headers Package specific search directory VALGRIND: ['/usr/local/include']
Pushing language C
================================================================================
TEST checkInclude from config.headers(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/headers.py:86) [as above]
Checking for header files ['valgrind/valgrind.h'] in ['/usr/local/include']
Checking include with compiler flags var CPPFLAGS ['/usr/local/include']
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.headers -I/usr/local/include /tmp/petsc-1nzsmm/config.headers/conftest.c
sh: conftest.c #line 1 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 7 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 11 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 19 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 23 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 27
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 31 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 35 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 39 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 43 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 47 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 51 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 55 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 59 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 63 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 67 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 71 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 75 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 79 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 83 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 87 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 91 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 95 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 99 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 103 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 107 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 111 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 115 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 119 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 123 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 127 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 131 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 135 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 139 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 143 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 147 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 151 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 155 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 159 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 163 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 167 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 171 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 175 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 179 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 183 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 187 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 191 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 195 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 199 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 203 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 207 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 211 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 215 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 219 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 223 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 227 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 231 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 235 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 239 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 243 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 247 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 251 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 255 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 259 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 263 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 267 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 271 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 275 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 279 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 283 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 287 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 291 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 295 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 299 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 303 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 307 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 311 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 315 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 319 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 323 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 327 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 331 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 335 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 339 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 343 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 347 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 351 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 355 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 359 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 363 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 367 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 371 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 375 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 379 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 383 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 387 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 391 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 395 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 399 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 403 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 407 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 411 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 415 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 419 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 423 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 427 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 431 
"c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 435 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 439 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 443 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 447 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 451 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 455 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 459 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 463 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 467 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 471 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 475 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 479 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 483 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 487 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 491 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 495 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 499 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 503 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 507 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 511 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 515 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 519 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 523 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 527 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 531 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 535 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 539 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 543 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 547 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 549 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\confdefs.h" #line 2 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" #line 1 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" typedef int int32_t; typedef int mode_t; typedef int pid_t; #line 14 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 15 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 16 "c:\\cygwin\\tmp\\petsc-1nzsmm\\config.headers\\conffix.h" #line 3 "C:\\cygwin\\tmp\\PE3DC2~1\\CONFIG~1.HEA\\conftest.c" C:\cygwin\tmp\PE3DC2~1\CONFIG~1.HEA\conftest.c(3) : fatal error C1083: Cannot open include file: 'valgrind/valgrind.h': No such file or directory Possible ERROR while running preprocessor: ret = 512 Source: #include "confdefs.h" #include "conffix.h" #include Popping language C Directory does not exist: /opt/local (while checking "Package specific search directory VALGRIND" for "[]") Directory does not exist: /opt/local (while checking "Package specific search directory VALGRIND" for "[]") sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64 sh: uname -s Executing: uname -s sh: CYGWIN_NT-6.1-WOW64 Popping language C ================================================================================ TEST alternateConfigureLibrary from PETSc.packages.threadcomm(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py:522) TESTING: alternateConfigureLibrary from 
PETSc.packages.threadcomm(config/BuildSystem/config/package.py:522)
  Called if --with-packagename=0; does nothing by default
[the identical alternateConfigureLibrary test (package.py:522, "Called if --with-packagename=0; does nothing by default") is then run for tetgen, sprng, boost, Sieve, yaml, PVODE, pcbddc, PARTY, papi, pami and P3Dlib; the repeated blocks are condensed here]
Not a clone of PETSc, don't need Sowing
Pushing language C
================================================================================
TEST configureLibrary from PETSc.packages.pthread(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/packages/pthread.py:27)
TESTING: configureLibrary from PETSc.packages.pthread(config/PETSc/packages/pthread.py:27)
  Checks for pthread_barrier_t, cpu_set_t, and sys/sysctl.h
==================================================================================
Checking for a functional pthread
Checking for library in Package specific search directory PTHREAD: ['libpthread.a']
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
  - libDir may be a list of directories
  - libName may be a list of library names
Checking for function pthread_create in library ['libpthread.a'] []
Pushing language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char pthread_create();
int main() {
  pthread_create();
  return 0;
}
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o -lpthread Ws2_32.lib
LINK : fatal error LNK1104: cannot open file 'libpthread.lib'
Possible ERROR while running linker: ret = 512
Popping language C
Checking for library in Package specific search directory PTHREAD: ['lib64/libpthread.a']
[the same pthread_create probe is compiled and linked again for 'lib64/libpthread.a' and fails with the same LNK1104 error]
All intermediate test results are stored in /tmp/petsc-1nzsmm/PETSc.packages.pthread
Executing: [the same cl -c invocation, now against /tmp/petsc-1nzsmm/PETSc.packages.pthread/conftest.c]
C:\cygwin\tmp\PE3DC2~1\PETSCP~1.PTH\conftest.c(3) : fatal error C1083: Cannot open include file: 'pthread.h': No such file or directory
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <pthread.h>
int main() { pthread_barrier_t *a; return 0; }
Executing: [the same cl -c invocation]
C:\cygwin\tmp\PE3DC2~1\PETSCP~1.PTH\conftest.c(3) : fatal error C1083: Cannot open include file: 'sched.h': No such file or directory
ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sched.h>
int main() { cpu_set_t *a; return 0; }
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -E -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.packages.pthread /tmp/petsc-1nzsmm/PETSc.packages.pthread/conftest.c
[several hundred "#line N ...petsc.packages.pthread\confdefs.h" preprocessor directives elided]
C:\cygwin\tmp\PE3DC2~1\PETSCP~1.PTH\conftest.c(3) : fatal error C1083: Cannot open include file: 'sys/sysctl.h': No such file or directory
Possible ERROR while running preprocessor: ret = 512
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/sysctl.h>
================================================================================
TEST checkSharedLibrary from PETSc.packages.pthread(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py:518)
TESTING: checkSharedLibrary from PETSc.packages.pthread(config/BuildSystem/config/package.py:518)
  By default we don't care about checking if the library is shared
Popping language C
================================================================================
TEST alternateConfigureLibrary from PETSc.packages.pthreadclasses(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py:522)
TESTING: alternateConfigureLibrary from PETSc.packages.pthreadclasses(config/BuildSystem/config/package.py:522)
  Called if --with-packagename=0; does nothing by default
================================================================================
[the identical alternateConfigureLibrary test (package.py:522, "Called if --with-packagename=0; does nothing by default") is run in turn for openmp, PTScotch, Numpy, mpe, MatlabEngine, Mathematica, hwloc, opengl, glut, Generator, fftw, scientificpython, fiat, FFC, expat, thrust, cusp, txpetscgpu, cuda, ctetgen and Suggar; the repeated blocks are condensed here]
================================================================================
TEST alternateConfigureLibrary from PETSc.packages.Matlab(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/PETSc/packages/Matlab.py:36)
TESTING: alternateConfigureLibrary from PETSc.packages.Matlab(config/PETSc/packages/Matlab.py:36)
Not a clone of PETSc, don't need Lgrind
Checking for program /usr/local/bin/cmake...not found
Checking for program /usr/bin/cmake...found
Defined make macro "CMAKE" to "/usr/bin/cmake"
Pushing language C
================================================================================
TEST configureLibrary from PETSc.packages.metis(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py:464)
TESTING: configureLibrary from PETSc.packages.metis(config/BuildSystem/config/package.py:464)
  Find an installation and check if it can work with PETSc
==================================================================================
Checking for a functional metis
Checking for library in User specified METIS libraries: ['/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib']
Contents: ['build', 'BUILD-Windows.txt', 'BUILD.txt', 'Changelog', 'CMakeLists.txt', 'GKlib', 'include', 'Install.txt', 'libmetis', 'LICENSE.txt', 'Makefile', 'programs', 'vsgen.bat']
================================================================================
TEST check from config.libraries(/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/libraries.py:145)
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
  Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName"
  - libDir may be a list of directories
  - libName may be a list of library names
Checking for function METIS_PartGraphKway in library ['/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib'] []
Pushing language C
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -c -o /tmp/petsc-1nzsmm/config.libraries/conftest.o -I/tmp/petsc-1nzsmm/config.setCompilers -I/tmp/petsc-1nzsmm/config.compilers -I/tmp/petsc-1nzsmm/config.headers -I/tmp/petsc-1nzsmm/PETSc.utilities.cacheDetails -I/tmp/petsc-1nzsmm/PETSc.utilities.missing -I/tmp/petsc-1nzsmm/config.functions -I/tmp/petsc-1nzsmm/PETSc.utilities.scalarTypes -I/tmp/petsc-1nzsmm/config.types -I/tmp/petsc-1nzsmm/config.packages.MPI -I/tmp/petsc-1nzsmm/PETSc.packages.pthread -I/tmp/petsc-1nzsmm/config.libraries -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char METIS_PartGraphKway();
int main() {
  METIS_PartGraphKway();
  return 0;
}
Executing: /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -o /tmp/petsc-1nzsmm/config.libraries/conftest.exe -MT -wd4996 -Z7 /tmp/petsc-1nzsmm/config.libraries/conftest.o /cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib Ws2_32.lib
MSVCRT.lib(MSVCR100.dll) : error LNK2005: memmove already defined in LIBCMT.lib(memcpy.obj)
MSVCRT.lib(MSVCR100.dll) : error LNK2005: free already defined in LIBCMT.lib(free.obj)
MSVCRT.lib(MSVCR100.dll) : error LNK2005: malloc already defined in LIBCMT.lib(malloc.obj)
MSVCRT.lib(MSVCR100.dll) : error LNK2005: realloc already defined in LIBCMT.lib(realloc.obj)
MSVCRT.lib(MSVCR100.dll) : error LNK2005: exit already defined in LIBCMT.lib(crt0dat.obj)
MSVCRT.lib(MSVCR100.dll) : error LNK2005: raise already defined in LIBCMT.lib(winsig.obj)
MSVCRT.lib(MSVCR100.dll) : error LNK2005: signal already defined in LIBCMT.lib(winsig.obj)
LINK : warning LNK4098: defaultlib 'MSVCRT' conflicts with use of other libs; use /NODEFAULTLIB:library
C:\cygwin\tmp\PE3DC2~1\CONFIG~1.LIB\conftest.exe : fatal error LNK1169: one or more multiply defined symbols found
Possible ERROR while running linker: ret = 512
[the failing link command, the LNK2005/LNK4098/LNK1169 messages and the conftest source are echoed once more; duplicates elided]
*/ char METIS_PartGraphKway(); int main() { METIS_PartGraphKway() ; return 0; } Popping language C **** Configure header conftest.h **** #if !defined(INCLUDED_UNKNOWN) #define INCLUDED_UNKNOWN #ifndef IS_COLORING_MAX #define IS_COLORING_MAX 65535 #endif #ifndef STDC_HEADERS #define STDC_HEADERS 1 #endif #ifndef MPIU_COLORING_VALUE #define MPIU_COLORING_VALUE MPI_UNSIGNED_SHORT #endif #ifndef _USE_MATH_DEFINES #define _USE_MATH_DEFINES 1 #endif #ifndef PETSC_HAVE_MALLOC_H #define PETSC_HAVE_MALLOC_H 1 #endif #ifndef PETSC_HAVE_IO_H #define PETSC_HAVE_IO_H 1 #endif #ifndef PETSC_HAVE_TIME_H #define PETSC_HAVE_TIME_H 1 #endif #ifndef PETSC_HAVE_MATH_H #define PETSC_HAVE_MATH_H 1 #endif #ifndef PETSC_HAVE_STRING_H #define PETSC_HAVE_STRING_H 1 #endif #ifndef PETSC_HAVE_FCNTL_H #define PETSC_HAVE_FCNTL_H 1 #endif #ifndef PETSC_HAVE_DIRECT_H #define PETSC_HAVE_DIRECT_H 1 #endif #ifndef PETSC_HAVE_WINDOWSX_H #define PETSC_HAVE_WINDOWSX_H 1 #endif #ifndef PETSC_HAVE_SYS_TYPES_H #define PETSC_HAVE_SYS_TYPES_H 1 #endif #ifndef PETSC_HAVE_FLOAT_H #define PETSC_HAVE_FLOAT_H 1 #endif #ifndef PETSC_HAVE_DOS_H #define PETSC_HAVE_DOS_H 1 #endif #ifndef PETSC_HAVE_MEMORY_H #define PETSC_HAVE_MEMORY_H 1 #endif #ifndef PETSC_HAVE_STDLIB_H #define PETSC_HAVE_STDLIB_H 1 #endif #ifndef PETSC_HAVE_SEARCH_H #define PETSC_HAVE_SEARCH_H 1 #endif #ifndef PETSC_HAVE_SETJMP_H #define PETSC_HAVE_SETJMP_H 1 #endif #ifndef PETSC_HAVE_STDINT_H #define PETSC_HAVE_STDINT_H 1 #endif #ifndef PETSC_HAVE_WS2TCPIP_H #define PETSC_HAVE_WS2TCPIP_H 1 #endif #ifndef PETSC_HAVE_LIMITS_H #define PETSC_HAVE_LIMITS_H 1 #endif #ifndef PETSC_USING_F2003 #define PETSC_USING_F2003 1 #endif #ifndef PETSC_HAVE_FORTRAN_CAPS #define PETSC_HAVE_FORTRAN_CAPS 1 #endif #ifndef PETSC_C_STATIC_INLINE #define PETSC_C_STATIC_INLINE static __inline #endif #ifndef PETSC_USING_F90 #define PETSC_USING_F90 1 #endif #ifndef PETSC_HAVE_CXX_NAMESPACE #define PETSC_HAVE_CXX_NAMESPACE 1 #endif #ifndef PETSC_C_RESTRICT #define PETSC_C_RESTRICT __restrict #endif #ifndef PETSC_CXX_RESTRICT #define PETSC_CXX_RESTRICT __restrict #endif #ifndef PETSC_CXX_STATIC_INLINE #define PETSC_CXX_STATIC_INLINE static inline #endif #ifndef PETSC_HAVE_LIBWS2_32 #define PETSC_HAVE_LIBWS2_32 1 #endif #ifndef PETSC_HAVE_LIBFMPICH2G #define PETSC_HAVE_LIBFMPICH2G 1 #endif #ifndef PETSC_HAVE_LIBMPI #define PETSC_HAVE_LIBMPI 1 #endif #ifndef PETSC_HAVE_LIBFMPICH2 #define PETSC_HAVE_LIBFMPICH2 1 #endif #ifndef PETSC_ARCH #define PETSC_ARCH "arch-mswin-c-debug" #endif #ifndef PETSC_DIR #define PETSC_DIR "/cygdrive/c/cygwin/packages/petsc-3.4.2" #endif #ifndef HAVE_GZIP #define HAVE_GZIP 1 #endif #ifndef PETSC_CLANGUAGE_C #define PETSC_CLANGUAGE_C 1 #endif #ifndef PETSC_USE_ERRORCHECKING #define PETSC_USE_ERRORCHECKING 1 #endif #ifndef PETSC_MISSING_DREAL #define PETSC_MISSING_DREAL 1 #endif #ifndef PETSC_SIZEOF_MPI_COMM #define PETSC_SIZEOF_MPI_COMM 4 #endif #ifndef PETSC_BITS_PER_BYTE #define PETSC_BITS_PER_BYTE 8 #endif #ifndef PETSC_SIZEOF_MPI_FINT #define PETSC_SIZEOF_MPI_FINT 4 #endif #ifndef PETSC_SIZEOF_VOID_P #define PETSC_SIZEOF_VOID_P 8 #endif #ifndef PETSC_RETSIGTYPE #define PETSC_RETSIGTYPE void #endif #ifndef PETSC_HAVE___INT64 #define PETSC_HAVE___INT64 1 #endif #ifndef PETSC_HAVE_CXX_COMPLEX #define PETSC_HAVE_CXX_COMPLEX 1 #endif #ifndef PETSC_SIZEOF_LONG #define PETSC_SIZEOF_LONG 4 #endif #ifndef PETSC_USE_FORTRANKIND #define PETSC_USE_FORTRANKIND 1 #endif #ifndef PETSC_SIZEOF_INT #define PETSC_SIZEOF_INT 4 #endif #ifndef PETSC_SIZEOF_SIZE_T #define PETSC_SIZEOF_SIZE_T 8 
#endif #ifndef PETSC_uid_t #define PETSC_uid_t int #endif #ifndef PETSC_SIZEOF_CHAR #define PETSC_SIZEOF_CHAR 1 #endif #ifndef PETSC_SIZEOF_DOUBLE #define PETSC_SIZEOF_DOUBLE 8 #endif #ifndef PETSC_SIZEOF_FLOAT #define PETSC_SIZEOF_FLOAT 4 #endif #ifndef PETSC_gid_t #define PETSC_gid_t int #endif #ifndef PETSC_SIZEOF_LONG_LONG #define PETSC_SIZEOF_LONG_LONG 8 #endif #ifndef PETSC_SIZEOF_SHORT #define PETSC_SIZEOF_SHORT 2 #endif #ifndef PETSC_HAVE_ACCESS #define PETSC_HAVE_ACCESS 1 #endif #ifndef PETSC_HAVE__FULLPATH #define PETSC_HAVE__FULLPATH 1 #endif #ifndef PETSC_HAVE_SIGNAL #define PETSC_HAVE_SIGNAL 1 #endif #ifndef PETSC_HAVE__LSEEK #define PETSC_HAVE__LSEEK 1 #endif #ifndef PETSC_HAVE_VFPRINTF #define PETSC_HAVE_VFPRINTF 1 #endif #ifndef PETSC_HAVE__GETCWD #define PETSC_HAVE__GETCWD 1 #endif #ifndef PETSC_HAVE_MEMMOVE #define PETSC_HAVE_MEMMOVE 1 #endif #ifndef PETSC_HAVE_RAND #define PETSC_HAVE_RAND 1 #endif #ifndef PETSC_HAVE__SLEEP #define PETSC_HAVE__SLEEP 1 #endif #ifndef PETSC_HAVE_TIME #define PETSC_HAVE_TIME 1 #endif #ifndef PETSC_HAVE_GETCWD #define PETSC_HAVE_GETCWD 1 #endif #ifndef PETSC_HAVE_LSEEK #define PETSC_HAVE_LSEEK 1 #endif #ifndef PETSC_HAVE__VSNPRINTF #define PETSC_HAVE__VSNPRINTF 1 #endif #ifndef PETSC_HAVE_VPRINTF #define PETSC_HAVE_VPRINTF 1 #endif #ifndef PETSC_HAVE_STRICMP #define PETSC_HAVE_STRICMP 1 #endif #ifndef PETSC_HAVE__SNPRINTF #define PETSC_HAVE__SNPRINTF 1 #endif #ifndef PETSC_SIGNAL_CAST #define PETSC_SIGNAL_CAST #endif #ifndef PETSC_HAVE__ACCESS #define PETSC_HAVE__ACCESS 1 #endif #ifndef PETSC_HAVE_CLOCK #define PETSC_HAVE_CLOCK 1 #endif #ifndef PETSC_HAVE_MPI_COMM_C2F #define PETSC_HAVE_MPI_COMM_C2F 1 #endif #ifndef PETSC_HAVE_MPI_INIT_THREAD #define PETSC_HAVE_MPI_INIT_THREAD 1 #endif #ifndef PETSC_HAVE_MPI_LONG_DOUBLE #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #endif #ifndef PETSC_HAVE_MPI_COMM_F2C #define PETSC_HAVE_MPI_COMM_F2C 1 #endif #ifndef PETSC_HAVE_MPI_FINT #define PETSC_HAVE_MPI_FINT 1 #endif #ifndef PETSC_HAVE_MPI_F90MODULE #define PETSC_HAVE_MPI_F90MODULE 1 #endif #ifndef PETSC_HAVE_MPI_TYPE_GET_ENVELOPE #define PETSC_HAVE_MPI_TYPE_GET_ENVELOPE 1 #endif #ifndef PETSC_HAVE_MPI_FINALIZED #define PETSC_HAVE_MPI_FINALIZED 1 #endif #ifndef PETSC_HAVE_MPI_COMM_SPAWN #define PETSC_HAVE_MPI_COMM_SPAWN 1 #endif #ifndef PETSC_HAVE_MPI_EXSCAN #define PETSC_HAVE_MPI_EXSCAN 1 #endif #ifndef PETSC_HAVE_MPI_TYPE_GET_EXTENT #define PETSC_HAVE_MPI_TYPE_GET_EXTENT 1 #endif #ifndef PETSC_HAVE_MPI_COMBINER_DUP #define PETSC_HAVE_MPI_COMBINER_DUP 1 #endif #ifndef PETSC_HAVE_MPI_WIN_CREATE #define PETSC_HAVE_MPI_WIN_CREATE 1 #endif #ifndef PETSC_HAVE_MPI_REPLACE #define PETSC_HAVE_MPI_REPLACE 1 #endif #ifndef PETSC_HAVE_MPI_TYPE_DUP #define PETSC_HAVE_MPI_TYPE_DUP 1 #endif #ifndef PETSC_HAVE_MPIIO #define PETSC_HAVE_MPIIO 1 #endif #ifndef PETSC_HAVE_MPI_C_DOUBLE_COMPLEX #define PETSC_HAVE_MPI_C_DOUBLE_COMPLEX 1 #endif #ifndef PETSC_HAVE_MPI_ALLTOALLW #define PETSC_HAVE_MPI_ALLTOALLW 1 #endif #ifndef PETSC_HAVE_MPI_IN_PLACE #define PETSC_HAVE_MPI_IN_PLACE 1 #endif #ifndef PETSC_LEVEL1_DCACHE_LINESIZE #define PETSC_LEVEL1_DCACHE_LINESIZE 32 #endif #ifndef PETSC_LEVEL1_DCACHE_SIZE #define PETSC_LEVEL1_DCACHE_SIZE 32768 #endif #ifndef PETSC_LEVEL1_DCACHE_ASSOC #define PETSC_LEVEL1_DCACHE_ASSOC 2 #endif #ifndef PETSC_USE_GDB_DEBUGGER #define PETSC_USE_GDB_DEBUGGER 1 #endif #ifndef PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT #define PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT 1 #endif #ifndef PETSC_USE_PROC_FOR_SIZE #define PETSC_USE_PROC_FOR_SIZE 1 #endif #ifndef 
PETSC_USE_INFO #define PETSC_USE_INFO 1 #endif #ifndef PETSC_Alignx #define PETSC_Alignx(a,b) #endif #ifndef PETSC_USE_BACKWARD_LOOP #define PETSC_USE_BACKWARD_LOOP 1 #endif #ifndef PETSC_USE_DEBUG #define PETSC_USE_DEBUG 1 #endif #ifndef PETSC_USE_64BIT_INDICES #define PETSC_USE_64BIT_INDICES 1 #endif #ifndef PETSC_USE_LOG #define PETSC_USE_LOG 1 #endif #ifndef PETSC_IS_COLOR_VALUE_TYPE #define PETSC_IS_COLOR_VALUE_TYPE short #endif #ifndef PETSC_USE_CTABLE #define PETSC_USE_CTABLE 1 #endif #ifndef PETSC_USE_SCALAR_REAL #define PETSC_USE_SCALAR_REAL 1 #endif #ifndef PETSC_HAVE__ISNAN #define PETSC_HAVE__ISNAN 1 #endif #ifndef PETSC_HAVE__FINITE #define PETSC_HAVE__FINITE 1 #endif #ifndef PETSC_USE_REAL_DOUBLE #define PETSC_USE_REAL_DOUBLE 1 #endif #ifndef PETSC_MEMALIGN #define PETSC_MEMALIGN 16 #endif #ifndef PETSC_MISSING_SIGUSR2 #define PETSC_MISSING_SIGUSR2 1 #endif #ifndef PETSC_MISSING_SIGURG #define PETSC_MISSING_SIGURG 1 #endif #ifndef PETSC_MISSING_SIGPIPE #define PETSC_MISSING_SIGPIPE 1 #endif #ifndef PETSC_MISSING_SIGHUP #define PETSC_MISSING_SIGHUP 1 #endif #ifndef PETSC_MISSING_SIGSTOP #define PETSC_MISSING_SIGSTOP 1 #endif #ifndef PETSC_MISSING_SIGSYS #define PETSC_MISSING_SIGSYS 1 #endif #ifndef PETSC_MISSING_SIGCONT #define PETSC_MISSING_SIGCONT 1 #endif #ifndef PETSC_HAVE_WSAGETLASTERROR #define PETSC_HAVE_WSAGETLASTERROR 1 #endif #ifndef PETSC_HAVE_CLOSESOCKET #define PETSC_HAVE_CLOSESOCKET 1 #endif #ifndef PETSC_MISSING_SIGTSTP #define PETSC_MISSING_SIGTSTP 1 #endif #ifndef PETSC_MISSING_SIGCHLD #define PETSC_MISSING_SIGCHLD 1 #endif #ifndef PETSC_HAVE_SOCKET #define PETSC_HAVE_SOCKET 1 #endif #ifndef PETSC_MISSING_SIGUSR1 #define PETSC_MISSING_SIGUSR1 1 #endif #ifndef PETSC_MISSING_SIGTRAP #define PETSC_MISSING_SIGTRAP 1 #endif #ifndef PETSC_MISSING_SIGQUIT #define PETSC_MISSING_SIGQUIT 1 #endif #ifndef PETSC_MISSING_SIGBUS #define PETSC_MISSING_SIGBUS 1 #endif #ifndef PETSC_HAVE_WINSOCK2_H #define PETSC_HAVE_WINSOCK2_H 1 #endif #ifndef PETSC_MISSING_SIGALRM #define PETSC_MISSING_SIGALRM 1 #endif #ifndef PETSC_NEEDS_UTYPE_TYPEDEFS #define PETSC_NEEDS_UTYPE_TYPEDEFS 1 #endif #ifndef PETSC_MISSING_SIGKILL #define PETSC_MISSING_SIGKILL 1 #endif #ifndef PETSC_HAVE_SHARED_LIBRARIES #define PETSC_HAVE_SHARED_LIBRARIES 1 #endif #endif **** C specific Configure header conffix.h **** #if !defined(INCLUDED_UNKNOWN) #define INCLUDED_UNKNOWN typedef int int32_t; typedef int mode_t; typedef int pid_t; #if defined(__cplusplus) extern "C" { int getdomainname(char *, int); double drand48(); void srand48(long); } #else #endif #endif ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- --with-metis-lib=['/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib'] and --with-metis-include=['/cygdrive/c/cygwin/packages/parmetis-4.0.3/metis/include'] did not work ******************************************************************************* File "./config/configure.py", line 293, in petsc_configure framework.configure(out = sys.stdout) File "/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/framework.py", line 933, in configure child.configure() File "/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py", line 556, in configure self.executeTest(self.configureLibrary) File "/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/base.py", line 
115, in executeTest
    ret = apply(test, args,kargs)
  File "/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py", line 484, in configureLibrary
    for location, directory, lib, incl in self.generateGuesses():
  File "/cygdrive/c/cygwin/packages/petsc-3.4.2/config/BuildSystem/config/package.py", line 294, in generateGuesses
    '--with-'+self.package+'-include='+str(self.framework.argDB['with-'+self.package+'-include'])+' did not work')

From balay at mcs.anl.gov  Fri Aug  2 22:28:14 2013
From: balay at mcs.anl.gov (Satish Balay)
Date: Fri, 2 Aug 2013 22:28:14 -0500 (CDT)
Subject: [petsc-users] How to configure metis/parmetis in cygwin
In-Reply-To: <51FC5241.7070704@gmail.com>
References: <51FC5241.7070704@gmail.com>
Message-ID: 

On Fri, 2 Aug 2013, Danyang Su wrote:

> Hi All,
>
> I can install petsc successfully in CYGWIN without metis/parmetis. But when i
> configure with metis or parmetis, there will be some error.
>
> First I tried the following configuration
>
> ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort'
> --with-cxx='win32fe cl' --with-64-bit-indices --download-f-blas-lapack
> --download-superlu_dist --download-mumps --download-hypre --download-parmetis
> --download-metis
>
> There is error in build metis library so I build metis and parmetis manually,
> and then configure with the following configuration

yeah --download-metis does not work [as cmake doesn't like 'win32fe cl']

> ./configure --with-cc='win32fe cl' --with-fc='win32fe ifort'
> --with-cxx='win32fe cl' --with-64-bit-indices
> --with-parmetis-include=/cygdrive/c/cygwin/packages/parmetis-4.0.3/include
> --with-parmetis-lib=/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmetis/Release/parmetis.lib
> --with-metis-include=/cygdrive/c/cygwin/packages/parmetis-4.0.3/metis/include
> --with-metis-lib=/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib
> --download-f-blas-lapack --download-superlu_dist --download-mumps
> --download-hypre
>
> Then i get the following error
> *******************************************************************************
> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for
> details):
> -------------------------------------------------------------------------------
> --with-metis-lib=['/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libmetis/Release/metis.lib']
> and
> --with-metis-include=['/cygdrive/c/cygwin/packages/parmetis-4.0.3/metis/include']
> did not work
> *******************************************************************************
> The log file for the last configuration is attached.

For one - make sure metis and parmetis are built with -DMETIS_USE_LONGINDEX=1 - this way it's compatible with the --with-64-bit-indices option.

Also rebuild metis/parmetis with '/MT' or the equivalent compiler option. PETSc by default is built with this option - and 'cl' does not like mixing object files compiled with different options.

Satish

From jedbrown at mcs.anl.gov  Fri Aug  2 22:47:47 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Fri, 02 Aug 2013 22:47:47 -0500
Subject: [petsc-users] DIVERGED_NONLINEAR_SOLVE error
In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F8B@EMAIL04.pnl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F0F@EMAIL04.pnl.gov>
 <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F1B@EMAIL04.pnl.gov>
 <6778DE83AB681D49BFC2CD850610FEB1018FDB5B4F8B@EMAIL04.pnl.gov>
Message-ID: <8738qr5tb0.fsf@mcs.anl.gov>

"Jin, Shuangshuang" writes:
> Thank you, Matt, problem resolved. It's floating point errors.
> Located it after turning on the debugging option.

Run in a debugger and pass the option -fp_trap to PETSc. It should break
the first place a NaN or denormal is computed.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 

From jyawney123 at gmail.com  Fri Aug  2 23:17:37 2013
From: jyawney123 at gmail.com (John Yawney)
Date: Sat, 3 Aug 2013 00:17:37 -0400
Subject: [petsc-users] Question about KSPSolve
In-Reply-To: 
References: 
Message-ID: 

Hi Matt,

Thank you for the response. I've rerun the test using 2 processors and in
serial and got similar results. I turned off my implicit viscosity terms
though and it appears as though the pressure solve produces very little
error on its own. I'll have to check over my FD laplacian matrices used
for the viscous terms and see if I can spot any errors in their
definitions.

Thanks again,
John

On Fri, Aug 2, 2013 at 4:28 PM, Matthew Knepley wrote:

> On Sat, Aug 3, 2013 at 1:27 AM, John Yawney wrote:
>
>> Good Afternoon Everyone,
>>
>> I'm using PETSc to solve some linear systems in my ocean model. Currently
>> I'm using the KSPSolve environment with 4 MPI processors. I've established
>> the matrix A and the RHS b and confirmed that everything looks correct
>> using VecView and MatView. Here are some code snippets that show the basic
>> steps I took.
>>
>> -------------------------------------------------------------------------
>> *Assembling A:*
>> MatConvert(matLaplacianX, MATSAME, MAT_INITIAL_MATRIX, &matLaplacian);
>> MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY);
>> MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY);
>>
>> MatAXPY(matLaplacian, 1.0, matLaplacianY, DIFFERENT_NONZERO_PATTERN);
>> MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY);
>> MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY);
>>
>> MatAXPY(matLaplacian, 1.0, matLaplacianZ, DIFFERENT_NONZERO_PATTERN);
>> MatAssemblyBegin(matLaplacian, MAT_FINAL_ASSEMBLY);
>> MatAssemblyEnd(matLaplacian, MAT_FINAL_ASSEMBLY);
>>
>> *Defining KSP environment:*
>> KSPCreate(MPI_COMM_WORLD, &m_inksp);
>> KSPSetOperators(m_inksp, matLaplacian, matLaplacian,
>> DIFFERENT_NONZERO_PATTERN);
>> KSPSetType(m_inksp, KSPGMRES);
>> KSPSetInitialGuessNonzero(m_inksp,PETSC_TRUE);
>> KSPSetFromOptions(m_inksp);
>> KSPSetUp(m_inksp);
>>
>> *Defining RHS vector:*
>> VecCreateMPI(MPI_COMM_WORLD, nLocalElements, nGlobalElements, &m_vecRHS);
>>
>> *Solving the linear system:*
>> VecAssemblyBegin(m_vecRHS);
>> VecAssemblyEnd(m_vecRHS);
>> KSPSolve(m_inksp, m_vecRHS, m_vecPressure);
>> -------------------------------------------------------------------------
>>
>> If I modify my problem to consider a 2D (x-z) domain with flat bottom
>> topography and I set the initial velocity fields to 0 and a constant
>> density of 1025 throughout, then if I compute a number of time steps I get
>> computational artifacts at the beginning and end locations of each block. I
>> should also mention I'm only splitting up the domain into sub-blocks in the
>> x direction currently. After about 10 time steps, the min density is off by
>> about 1E-8 but only at these locations. I've attached a figure to
>> demonstrate the errors.
>>
>> Are there ways for me to remove these errors? Should I be looking at the
>> DM manual pages?
>>
>
> The above looks correct, so I assume there is a problem with the
> definition of the system. I would try putting
> in an exact solution, or comparing a serial and parallel run.
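A minimal sketch of that exact-solution check, reusing the names from the snippets above (error checking omitted; MatGetVecs() is the petsc-3.4 spelling of the later MatCreateVecs()):

  Vec       xExact, bTest, xTest;
  PetscReal errNorm;
  MatGetVecs(matLaplacian, &xExact, &bTest);
  VecDuplicate(xExact, &xTest);
  VecSetRandom(xExact, NULL);              /* any known field works as a manufactured solution */
  MatMult(matLaplacian, xExact, bTest);    /* bTest = A * xExact, consistent by construction   */
  KSPSolve(m_inksp, bTest, xTest);
  VecAXPY(xTest, -1.0, xExact);            /* xTest <- xTest - xExact                          */
  VecNorm(xTest, NORM_2, &errNorm);
  PetscPrintf(PETSC_COMM_WORLD, "solve error %g\n", (double)errNorm);

If the Laplacian carries the constant null space (pure Neumann boundaries), subtract the mean from xTest and xExact before comparing, since the two can legitimately differ by a constant. A norm at the solver tolerance on both 1 and 4 processes would localize the artifacts to the viscous terms rather than the pressure solve.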
> > Thanks, > > Matt > > >> Thanks for any help and suggestions. >> >> All the best, >> John >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pengxwang at hotmail.com Sun Aug 4 11:12:18 2013 From: pengxwang at hotmail.com (Roc Wang) Date: Sun, 4 Aug 2013 11:12:18 -0500 Subject: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS Message-ID: Hello, I am developing a poisson solver based on the example code petsc-3.4.0/src/ksp/ksp/examples/tutorials/ex45.c. I have a question on utilizing functions KSPSetComputeRHS (KSP ksp,PetscErrorCode (*func)(KSP,Vec,void*),void *ctx) and ComputeRHS(KSP,Vec,void*) which is the (*func)(KSP,Vec,void*) in the interface. Since I have source term generated on the right hand side of Poisson equation and they are computed outside KSPSetComputeRHS() and ComputeRHS(), an array of source term variables must be passed into ComuteRHS() function. From the manual online, it seems the interface of ComputeRHS is only for passing variables from inside of ComputeRHS to KSPSetComputeRHS. There is not a dummy for passing into ComputeRHS. Can anybody give me suggestions on it? Thanks in advance. -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Sun Aug 4 11:18:54 2013 From: dave.mayhem23 at gmail.com (Dave May) Date: Sun, 4 Aug 2013 18:18:54 +0200 Subject: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS In-Reply-To: References: Message-ID: The last argument you give to KSPSetComputeRHS() will be passed into your user defined function ComputeRHS(). All data needed to evaluate the rhs should be made available through that pointer (void *ctx) On Sunday, 4 August 2013, Roc Wang wrote: > Hello, I am developing a poisson solver based on the example code > petsc-3.4.0/src/ksp/ksp/examples/tutorials/ex45.c. > > I have a question on utilizing functions > > KSPSetComputeRHS (KSP ksp,PetscErrorCode (*func)(KSP,Vec,void*),void *ctx) and ComputeRHS(KSP,Vec,void*) which is the (*func)(KSP,Vec,void*) in the interface. > > Since I have source term generated on the right hand side of Poisson equation and they are computed outside KSPSetComputeRHS() and ComputeRHS(), an array of source term variables must be passed into ComuteRHS() function. From the manual online, it seems the interface of ComputeRHS is only for passing variables from inside of ComputeRHS to KSPSetComputeRHS. There is not a dummy for passing into ComputeRHS. Can anybody give me suggestions on it? Thanks in advance. > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pengxwang at hotmail.com Sun Aug 4 15:35:01 2013 From: pengxwang at hotmail.com (Roc Wang) Date: Sun, 4 Aug 2013 15:35:01 -0500 Subject: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS In-Reply-To: References: , Message-ID: Thanks a lot, Dave, I can pass an 1-D array into ComputeRHS via *void ctx. I am using the following procedure: in main() {//------- ... double array_1d[array_size]; ComputeRHS(array_1d); ierr = KSPSetComputeRHS(ksp,ComputeRHS, array_1d); ... 
//-------- } in ComputeRHS(KSP,Vec,void* ctx ) { int size_in_func double* arr1d=static_cast(ctx); for (i=0;i wrote: > Thanks a lot, Dave, > > I can pass an 1-D array into ComputeRHS via *void ctx. > > I am using the following procedure: > > in main() > {//------- > ... > double array_1d[array_size]; > > ComputeRHS(array_1d); > ierr = KSPSetComputeRHS(ksp,ComputeRHS, array_1d); > ... > //-------- > } > > in ComputeRHS(KSP,Vec,void* ctx ) > { > int size_in_func > double* arr1d=static_cast(ctx); > for (i=0;i cout<<" ComputeRHS="< } > > Definitely, a 3d array can be passed by converting it to 1d in main() before calling KSPSetComputeRHS and back to 3d in ComputeRHS(). However, is it possible to pass 3d array directly? Thanks. > > Date: Sun, 4 Aug 2013 18:18:54 +0200 > Subject: Re: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS > From: dave.mayhem23 at gmail.com > To: pengxwang at hotmail.com > CC: petsc-users at mcs.anl.gov > > The last argument you give to KSPSetComputeRHS() will be passed into your user defined function ComputeRHS(). All data needed to evaluate the rhs should be made available through that pointer (void *ctx) > > > On Sunday, 4 August 2013, Roc Wang wrote: > Hello, I am developing a poisson solver based on the example code petsc-3.4.0/src/ksp/ksp/examples/tutorials/ex45.c. > > I have a question on utilizing functions > KSPSetComputeRHS (KSP ksp,PetscErrorCode (*func)(KSP,Vec,void*),void *ctx) and ComputeRHS(KSP,Vec,void*) which is the (*func)(KSP,Vec,void*) in the interface. > > > Since I have source term generated on the right hand side of Poisson equation and they are computed outside KSPSetComputeRHS() and ComputeRHS(), an array of source term variables must be passed into ComuteRHS() function. From the manual online, it seems the interface of ComputeRHS is only for passing variables from inside of ComputeRHS to KSPSetComputeRHS. There is not a dummy for passing into ComputeRHS. Can anybody give me suggestions on it? Thanks in advance. > > > From pengxwang at hotmail.com Sun Aug 4 19:54:45 2013 From: pengxwang at hotmail.com (Roc Wang) Date: Sun, 4 Aug 2013 19:54:45 -0500 Subject: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS In-Reply-To: References: , , Message-ID: Thanks, Barry, I meant in ComputeRHS(KSP,Vec,void* ctx), I can only has one ponter array like double* arr1d which can onle be casted from void *ctx to 1d array directly. If it is a 3d array outside, I have to use 3 for-iterations to convert the 1d array ( arr1d ) to a 3d array (arr3d) inside, like int i,j,k,cnt; cnt=0; double* arr1d=static_cast(ctx); for (i=0; i<3; i++) { for(j=0; j<3; j++) { for(k=0; k<3;k++) arr3d[i][j][k]=arr1d[cnt]; cnt++; } } but cannot use the arr3d directly inside the function. Is there anyway to cast *ctx to a 3d array directly? Thanks again. > > int size_in_func > > double* arr1d=static_cast(ctx); > Subject: Re: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS > From: bsmith at mcs.anl.gov > Date: Sun, 4 Aug 2013 17:17:59 -0500 > CC: dave.mayhem23 at gmail.com; petsc-users at mcs.anl.gov > To: pengxwang at hotmail.com > > > Sure, just caste it to a void* and then caste it back in the subroutine. Note that in parallel you will only want to store on each process the "local" values for the field, not all the values. 
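Put concretely, the pattern described above might look like the following sketch (the RHSCtx struct and the names nLocal and localSource are illustrative, not from the original code; error checking omitted):

  typedef struct {
    PetscInt n;       /* number of locally owned source values           */
    double  *source;  /* local part of the source term, filled in main() */
  } RHSCtx;

  PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
  {
    RHSCtx *user = static_cast<RHSCtx*>(ctx);  /* cast back from void*  */
    for (PetscInt i = 0; i < user->n; i++) {
      /* use user->source[i] to set the local entries of b,
         e.g. via VecSetValues() or DMDAVecGetArray()                    */
    }
    VecAssemblyBegin(b);
    VecAssemblyEnd(b);
    return 0;
  }

  /* in main(), before the solve: */
  RHSCtx user;
  user.n      = nLocal;         /* store only the locally owned values   */
  user.source = localSource;
  KSPSetComputeRHS(ksp, ComputeRHS, &user);

The pointer comes back untouched, so any agreed-upon type (a plain array, a struct, or a class) can travel through the void* argument.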
> > Barry > > On Aug 4, 2013, at 3:35 PM, Roc Wang wrote: > > > Thanks a lot, Dave, > > > > I can pass an 1-D array into ComputeRHS via *void ctx. > > > > I am using the following procedure: > > > > in main() > > {//------- > > ... > > double array_1d[array_size]; > > > > ComputeRHS(array_1d); > > ierr = KSPSetComputeRHS(ksp,ComputeRHS, array_1d); > > ... > > //-------- > > } > > > > in ComputeRHS(KSP,Vec,void* ctx ) > > { > > int size_in_func > > double* arr1d=static_cast(ctx); > > for (i=0;i > cout<<" ComputeRHS="< > } > > > > Definitely, a 3d array can be passed by converting it to 1d in main() before calling KSPSetComputeRHS and back to 3d in ComputeRHS(). However, is it possible to pass 3d array directly? Thanks. > > > > Date: Sun, 4 Aug 2013 18:18:54 +0200 > > Subject: Re: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS > > From: dave.mayhem23 at gmail.com > > To: pengxwang at hotmail.com > > CC: petsc-users at mcs.anl.gov > > > > The last argument you give to KSPSetComputeRHS() will be passed into your user defined function ComputeRHS(). All data needed to evaluate the rhs should be made available through that pointer (void *ctx) > > > > > > On Sunday, 4 August 2013, Roc Wang wrote: > > Hello, I am developing a poisson solver based on the example code petsc-3.4.0/src/ksp/ksp/examples/tutorials/ex45.c. > > > > I have a question on utilizing functions > > KSPSetComputeRHS (KSP ksp,PetscErrorCode (*func)(KSP,Vec,void*),void *ctx) and ComputeRHS(KSP,Vec,void*) which is the (*func)(KSP,Vec,void*) in the interface. > > > > > > Since I have source term generated on the right hand side of Poisson equation and they are computed outside KSPSetComputeRHS() and ComputeRHS(), an array of source term variables must be passed into ComuteRHS() function. From the manual online, it seems the interface of ComputeRHS is only for passing variables from inside of ComputeRHS to KSPSetComputeRHS. There is not a dummy for passing into ComputeRHS. Can anybody give me suggestions on it? Thanks in advance. > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Aug 4 20:54:58 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 4 Aug 2013 20:54:58 -0500 Subject: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS In-Reply-To: References: , , Message-ID: <866B4DA3-ED65-4746-A9AC-C3FF93AF7DE1@mcs.anl.gov> On Aug 4, 2013, at 7:54 PM, Roc Wang wrote: > Thanks, Barry, > > I meant in ComputeRHS(KSP,Vec,void* ctx), I can only has one ponter array like double* arr1d which can onle be casted from void *ctxto 1d array directly. If it is a 3d array outside, I have to use 3 for-iterations to convert the 1d array ( arr1d ) to a 3d array (arr3d) inside, like > > int i,j,k,cnt; > cnt=0; > double* arr1d=static_cast(ctx); > for (i=0; i<3; i++) > { for(j=0; j<3; j++) > { for(k=0; k<3;k++) > arr3d[i][j][k]=arr1d[cnt]; > cnt++; > } > } > > but cannot use the arr3d directly inside the function. Is there anyway to cast *ctx to a 3d array directly? Thanks again. Sure, you can case any pointer to void and then back to whatever point typer it is. 
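For the 3x3x3 example above, the direct cast can look like this sketch (the extents must be compile-time constants for a pointer-to-array cast; the names are illustrative):

  enum { NX = 3, NY = 3, NZ = 3 };

  /* caller: */
  double arr3d[NX][NY][NZ];
  KSPSetComputeRHS(ksp, ComputeRHS, arr3d);    /* arr3d decays to a pointer, passed as void*  */

  /* inside ComputeRHS(KSP ksp, Vec b, void *ctx): */
  double (*a)[NY][NZ] = static_cast<double (*)[NY][NZ]>(ctx);
  double v = a[1][2][0];                       /* indexed exactly like the caller's arr3d     */

If the extents are only known at run time, keep the flat 1d view and index with arr1d[i*NY*NZ + j*NZ + k], which is precisely what the conversion loop above computes.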
Barry > > > > > int size_in_func > > > double* arr1d=static_cast(ctx); > > > Subject: Re: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS > > From: bsmith at mcs.anl.gov > > Date: Sun, 4 Aug 2013 17:17:59 -0500 > > CC: dave.mayhem23 at gmail.com; petsc-users at mcs.anl.gov > > To: pengxwang at hotmail.com > > > > > > Sure, just caste it to a void* and then caste it back in the subroutine. Note that in parallel you will only want to store on each process the "local" values for the field, not all the values. > > > > Barry > > > > On Aug 4, 2013, at 3:35 PM, Roc Wang wrote: > > > > > Thanks a lot, Dave, > > > > > > I can pass an 1-D array into ComputeRHS via *void ctx. > > > > > > I am using the following procedure: > > > > > > in main() > > > {//------- > > > ... > > > double array_1d[array_size]; > > > > > > ComputeRHS(array_1d); > > > ierr = KSPSetComputeRHS(ksp,ComputeRHS, array_1d); > > > ... > > > //-------- > > > } > > > > > > in ComputeRHS(KSP,Vec,void* ctx ) > > > { > > > int size_in_func > > > double* arr1d=static_cast(ctx); > > > for (i=0;i > > cout<<" ComputeRHS="< > > } > > > > > > Definitely, a 3d array can be passed by converting it to 1d in main() before calling KSPSetComputeRHS and back to 3d in ComputeRHS(). However, is it possible to pass 3d array directly? Thanks. > > > > > > Date: Sun, 4 Aug 2013 18:18:54 +0200 > > > Subject: Re: [petsc-users] passing variables into (*func)(KSP, Vec, void*) in the interface of KSPSetComputeRHS > > > From: dave.mayhem23 at gmail.com > > > To: pengxwang at hotmail.com > > > CC: petsc-users at mcs.anl.gov > > > > > > The last argument you give to KSPSetComputeRHS() will be passed into your user defined function ComputeRHS(). All data needed to evaluate the rhs should be made available through that pointer (void *ctx) > > > > > > > > > On Sunday, 4 August 2013, Roc Wang wrote: > > > Hello, I am developing a poisson solver based on the example code petsc-3.4.0/src/ksp/ksp/examples/tutorials/ex45.c. > > > > > > I have a question on utilizing functions > > > KSPSetComputeRHS (KSP ksp,PetscErrorCode (*func)(KSP,Vec,void*),void *ctx) and ComputeRHS(KSP,Vec,void*) which is the (*func)(KSP,Vec,void*) in the interface. > > > > > > > > > Since I have source term generated on the right hand side of Poisson equation and they are computed outside KSPSetComputeRHS() and ComputeRHS(), an array of source term variables must be passed into ComuteRHS() function. From the manual online, it seems the interface of ComputeRHS is only for passing variables from inside of ComputeRHS to KSPSetComputeRHS. There is not a dummy for passing into ComputeRHS. Can anybody give me suggestions on it? Thanks in advance. > > > > > > > > > > > From Wadud.Miah at awe.co.uk Mon Aug 5 07:20:48 2013 From: Wadud.Miah at awe.co.uk (Wadud.Miah at awe.co.uk) Date: Mon, 5 Aug 2013 12:20:48 +0000 Subject: [petsc-users] EXTERNAL: Re: Matrix assembly error in PETSc In-Reply-To: <94E6CE15-F08C-4961-A184-FEC03FA241F3@mcs.anl.gov> References: <201308011508.r71F8bgq024836@msw2.awe.co.uk> <94E6CE15-F08C-4961-A184-FEC03FA241F3@mcs.anl.gov> Message-ID: <201308051220.r75CKvpJ029152@msw2.awe.co.uk> Hello Barry, Thanks for your response. Do you know how to get the entire error message? Regards, Wadud. 
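One way to capture the whole message from every rank is to follow the valgrind suggestion quoted below and give each process its own log file. A typical invocation (adapted from the PETSc FAQ; substitute your own executable and process count):

  mpiexec -n 16 valgrind --tool=memcheck -q --num-callers=20 \
      --log-file=valgrind.log.%p ./your_app -malloc off

valgrind expands %p to the process id, so the full stack trace from rank 12 lands in its own file instead of being interleaved with the other ranks' output by the MPI launcher.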
-----Original Message----- From: Barry Smith [mailto:bsmith at mcs.anl.gov] Sent: 01 August 2013 19:02 To: Miah Wadud AWE Cc: petsc-users at mcs.anl.gov Subject: EXTERNAL: Re: [petsc-users] Matrix assembly error in PETSc Please always send the ENTIRE error message, it makes it much easier for us to deduce what is going on. Error code 63 is PETSC_ERR_ARG_OUTOFRANGE which presumably is generated in MatSetValues_MPIAIJ() which means a row or column index is out of range. But since this is called within the MatAssemblyEnd_MPIAIJ() it should never be out of range. The most likely cause is data corruption on values passed between processes with MPI. It is possible the error is due to bugs in the MPI implementation or due to memory corruption elsewhere. I would first recommend running the code with valgrind (and enormously powerful tool) to eliminate the chance of memory corruption http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind Let us know what happens, Barry MPI 2.0 vs MPI 3.0 is likely not the issue. On Aug 1, 2013, at 10:08 AM, Wadud.Miah at awe.co.uk wrote: > Hello, > > I am running an application code which works with 4, 8 and 18 processes but crashes with 16 processes. I have used MPICH2 and MVAPICH2 (both adhered to the MPI 3.0 standard) and both cause the same problem. I get the following error message: > > [12]PETSC ERROR: MatSetValues_MPIAIJ() line 564 in src/mat/impls/aij/mpi/mpiaij.c > [12]PETSC ERROR: MatAssemblyEnd_MPIAIJ() line 680 in src/mat/impls/aij/mpi/mpiaij.c > [12]PETSC ERROR: MatAssemblyEnd() line 4879 in src/mat/interface/matrix.c > > [12] --> Error in "MatAssemblyEnd()". > [12] --> Code: 63 > > However, I do not get this using the Intel MPI (which adheres to the MPI 2.0 standard) library. Any help will be greatly appreciated. > > Regards, > > -------------------------- > Wadud Miah > HPC, Design and Theoretical Physics > Direct: 0118 98 56220 > AWE, Aldermaston, Reading, RG7 4PR > > > ___________________________________________________ ____________________________ The information in this email and in any attachment(s) is commercial in confidence. If you are not the named addressee(s) or if you receive this email in error then any distribution, copying or use of this communication or the information in it is strictly prohibited. Please notify us immediately by email at admin.internet(at)awe.co.uk, and then delete this message from your computer. While attachments are virus checked, AWE plc does not accept any liability in respect of any virus which is not detected. AWE Plc Registered in England and Wales Registration No 02763902 AWE, Aldermaston, Reading, RG7 4PR > ___________________________________________________ ____________________________ The information in this email and in any attachment(s) is commercial in confidence. If you are not the named addressee(s) or if you receive this email in error then any distribution, copying or use of this communication or the information in it is strictly prohibited. Please notify us immediately by email at admin.internet(at)awe.co.uk, and then delete this message from your computer. While attachments are virus checked, AWE plc does not accept any liability in respect of any virus which is not detected. 
AWE Plc Registered in England and Wales Registration No 02763902 AWE, Aldermaston, Reading, RG7 4PR From knepley at gmail.com Mon Aug 5 07:24:12 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 5 Aug 2013 07:24:12 -0500 Subject: [petsc-users] EXTERNAL: Re: Matrix assembly error in PETSc In-Reply-To: <201308051220.r75CKvpJ029152@msw2.awe.co.uk> References: <201308011508.r71F8bgq024836@msw2.awe.co.uk> <94E6CE15-F08C-4961-A184-FEC03FA241F3@mcs.anl.gov> <201308051220.r75CKvpJ029152@msw2.awe.co.uk> Message-ID: On Mon, Aug 5, 2013 at 7:20 AM, wrote: > Hello Barry, > > Thanks for your response. Do you know how to get the entire error message? > That has to do with how MPI handles output in your environment. You should ask your system administrator. Also consider running in the debugger. Matt > Regards, > Wadud. > > -----Original Message----- > From: Barry Smith [mailto:bsmith at mcs.anl.gov] > Sent: 01 August 2013 19:02 > To: Miah Wadud AWE > Cc: petsc-users at mcs.anl.gov > Subject: EXTERNAL: Re: [petsc-users] Matrix assembly error in PETSc > > > Please always send the ENTIRE error message, it makes it much easier > for us to deduce what is going on. > > Error code 63 is PETSC_ERR_ARG_OUTOFRANGE which presumably is generated > in MatSetValues_MPIAIJ() which means a row or column index is out of > range. But since this is called within the MatAssemblyEnd_MPIAIJ() it > should never be out of range. The most likely cause is data corruption on > values passed between processes with MPI. It is possible the error is due > to bugs in the MPI implementation or due to memory corruption elsewhere. I > would first recommend running the code with valgrind (and enormously > powerful tool) to eliminate the chance of memory corruption > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > > Let us know what happens, > > Barry > > MPI 2.0 vs MPI 3.0 is likely not the issue. > > > On Aug 1, 2013, at 10:08 AM, Wadud.Miah at awe.co.uk wrote: > > > Hello, > > > > I am running an application code which works with 4, 8 and 18 processes > but crashes with 16 processes. I have used MPICH2 and MVAPICH2 (both > adhered to the MPI 3.0 standard) and both cause the same problem. I get the > following error message: > > > > [12]PETSC ERROR: MatSetValues_MPIAIJ() line 564 in > src/mat/impls/aij/mpi/mpiaij.c > > [12]PETSC ERROR: MatAssemblyEnd_MPIAIJ() line 680 in > src/mat/impls/aij/mpi/mpiaij.c > > [12]PETSC ERROR: MatAssemblyEnd() line 4879 in src/mat/interface/matrix.c > > > > [12] --> Error in "MatAssemblyEnd()". > > [12] --> Code: 63 > > > > However, I do not get this using the Intel MPI (which adheres to the MPI > 2.0 standard) library. Any help will be greatly appreciated. > > > > Regards, > > > > -------------------------- > > Wadud Miah > > HPC, Design and Theoretical Physics > > Direct: 0118 98 56220 > > AWE, Aldermaston, Reading, RG7 4PR > > > > > > ___________________________________________________ > ____________________________ The information in this email and in any > attachment(s) is commercial in confidence. If you are not the named > addressee(s) or if you receive this email in error then any distribution, > copying or use of this communication or the information in it is strictly > prohibited. Please notify us immediately by email at admin.internet(at) > awe.co.uk, and then delete this message from your computer. While > attachments are virus checked, AWE plc does not accept any liability in > respect of any virus which is not detected. 
AWE Plc Registered in England > and Wales Registration No 02763902 AWE, Aldermaston, Reading, RG7 4PR > > > > > ___________________________________________________ > ____________________________ > > The information in this email and in any attachment(s) is > commercial in confidence. If you are not the named addressee(s) > or > if you receive this email in error then any distribution, copying or > use of this communication or the information in it is strictly > prohibited. Please notify us immediately by email at > admin.internet(at)awe.co.uk, and then delete this message from > your computer. While attachments are virus checked, AWE plc > does not accept any liability in respect of any virus which is not > detected. > > AWE Plc > Registered in England and Wales > Registration No 02763902 > AWE, Aldermaston, Reading, RG7 4PR > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bisheshkh at gmail.com Mon Aug 5 07:54:48 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Mon, 5 Aug 2013 14:54:48 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: <87li5555oo.fsf@mcs.anl.gov> References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: > Bishesh Khanal writes: > > > Now, I implemented two different approaches, each for both 2D and 3D, in > > MATLAB. It works for the smaller sizes but I have problems solving it for > > the problem size I need (250^3 grid size). > > I use staggered grid with p on cell centers, and components of v on cell > > faces. Similar split up of K to cell center and faces to account for the > > variable viscosity case) > > Okay, you're using a staggered-grid finite difference discretization of > variable-viscosity Stokes. This is a common problem and I recommend > starting with PCFieldSplit with Schur complement reduction (make that > work first, then switch to block preconditioner). You can use PCLSC or > (probably better for you), assemble a preconditioning matrix containing > the inverse viscosity in the pressure-pressure block. This diagonal > matrix is a spectrally equivalent (or nearly so, depending on > discretization) approximation of the Schur complement. The velocity > block can be solved with algebraic multigrid. Read the PCFieldSplit > docs (follow papers as appropriate) and let us know if you get stuck. > I was trying to assemble the inverse viscosity diagonal matrix to use as the preconditioner for the Schur complement solve step as you suggested. I've few questions about the ways to implement this in Petsc: A naive approach that I can think of would be to create a vector with its components as reciprocal viscosities of the cell centers corresponding to the pressure variables, and then create a diagonal matrix from this vector. However I'm not sure about: How can I make this matrix, (say S_p) compatible to the Petsc distribution of the different rows of the main system matrix over different processors ? The main matrix was created using the DMDA structure with 4 dof as explained before. The main matrix correspond to the DMDA with 4 dofs but for the S_p matrix would correspond to only pressure space. 
Should the distribution of the rows of S_p among different processors not correspond to the distribution of the rhs vector, say h', if it is solving for p with Sp = h', where S = A11 - A10 inv(A00) A01 ?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Mon Aug  5 08:17:52 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 5 Aug 2013 08:17:52 -0500
Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid
In-Reply-To: 
References: <87li5555oo.fsf@mcs.anl.gov>
Message-ID: 

On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal wrote:

> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote:
>
>> Bishesh Khanal writes:
>>
>> > Now, I implemented two different approaches, each for both 2D and 3D, in
>> > MATLAB. It works for the smaller sizes but I have problems solving it for
>> > the problem size I need (250^3 grid size).
>> > I use staggered grid with p on cell centers, and components of v on cell
>> > faces. Similar split up of K to cell center and faces to account for the
>> > variable viscosity case)
>>
>> Okay, you're using a staggered-grid finite difference discretization of
>> variable-viscosity Stokes.  This is a common problem and I recommend
>> starting with PCFieldSplit with Schur complement reduction (make that
>> work first, then switch to block preconditioner).  You can use PCLSC or
>> (probably better for you), assemble a preconditioning matrix containing
>> the inverse viscosity in the pressure-pressure block.  This diagonal
>> matrix is a spectrally equivalent (or nearly so, depending on
>> discretization) approximation of the Schur complement.  The velocity
>> block can be solved with algebraic multigrid.  Read the PCFieldSplit
>> docs (follow papers as appropriate) and let us know if you get stuck.
>
> I was trying to assemble the inverse viscosity diagonal matrix to use as
> the preconditioner for the Schur complement solve step as you suggested.
> I've few questions about the ways to implement this in Petsc:
> A naive approach that I can think of would be to create a vector with its
> components as reciprocal viscosities of the cell centers corresponding to
> the pressure variables, and then create a diagonal matrix from this vector.
> However I'm not sure about:
> How can I make this matrix, (say S_p) compatible to the Petsc distribution
> of the different rows of the main system matrix over different processors ?
> The main matrix was created using the DMDA structure with 4 dof as
> explained before.
> The main matrix correspond to the DMDA with 4 dofs but for the S_p matrix
> would correspond to only pressure space. Should the distribution of the
> rows of S_p among different processors not correspond to the distribution
> of the rhs vector, say h', if it is solving for p with Sp = h', where
> S = A11 - A10 inv(A00) A01 ?

PETSc distributes vertices, not dofs, so it never breaks blocks. The P
distribution is the same as the entire problem divided by 4.

   Matt

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
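A sketch of that naive approach with petsc-3.4 calling sequences (M, N, P stand for the global grid sizes of the existing 4-dof DMDA; to guarantee an identical layout, the processor grid and ownership ranges can be copied from it with DMDAGetInfo() and DMDAGetOwnershipRanges(); error checking omitted):

  DM  daP;      /* pressure-only companion of the 4-dof DMDA */
  Vec invEta;   /* reciprocal viscosities at cell centers    */
  Mat Sp;
  DMDACreate3d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
               DMDA_BOUNDARY_NONE, DMDA_STENCIL_BOX, M, N, P,
               PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
               1 /* dof */, 1 /* stencil width */, NULL, NULL, NULL, &daP);
  DMCreateGlobalVector(daP, &invEta);
  /* fill invEta with 1/eta at each cell center via DMDAVecGetArray(daP, invEta, ...) */
  DMCreateMatrix(daP, MATAIJ, &Sp);           /* petsc-3.4 signature: (DM, MatType, Mat*) */
  MatDiagonalSet(Sp, invEta, INSERT_VALUES);

Because both DMDAs partition the same vertex grid over the same communicator, the rows of Sp then line up with the pressure unknowns of the full system, as described above.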
URL: From bisheshkh at gmail.com Mon Aug 5 08:48:09 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Mon, 5 Aug 2013 15:48:09 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: > On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal wrote: > >> >> >> >> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >> >>> Bishesh Khanal writes: >>> >>> > Now, I implemented two different approaches, each for both 2D and 3D, >>> in >>> > MATLAB. It works for the smaller sizes but I have problems solving it >>> for >>> > the problem size I need (250^3 grid size). >>> > I use staggered grid with p on cell centers, and components of v on >>> cell >>> > faces. Similar split up of K to cell center and faces to account for >>> the >>> > variable viscosity case) >>> >>> Okay, you're using a staggered-grid finite difference discretization of >>> variable-viscosity Stokes. This is a common problem and I recommend >>> starting with PCFieldSplit with Schur complement reduction (make that >>> work first, then switch to block preconditioner). You can use PCLSC or >>> (probably better for you), assemble a preconditioning matrix containing >>> the inverse viscosity in the pressure-pressure block. This diagonal >>> matrix is a spectrally equivalent (or nearly so, depending on >>> discretization) approximation of the Schur complement. The velocity >>> block can be solved with algebraic multigrid. Read the PCFieldSplit >>> docs (follow papers as appropriate) and let us know if you get stuck. >>> >> >> I was trying to assemble the inverse viscosity diagonal matrix to use as >> the preconditioner for the Schur complement solve step as you suggested. >> I've few questions about the ways to implement this in Petsc: >> A naive approach that I can think of would be to create a vector with its >> components as reciprocal viscosities of the cell centers corresponding to >> the pressure variables, and then create a diagonal matrix from this vector. >> However I'm not sure about: >> How can I make this matrix, (say S_p) compatible to the Petsc >> distribution of the different rows of the main system matrix over different >> processors ? The main matrix was created using the DMDA structure with 4 >> dof as explained before. >> The main matrix correspond to the DMDA with 4 dofs but for the S_p matrix >> would correspond to only pressure space. Should the distribution of the >> rows of S_p among different processor not correspond to the distribution of >> the rhs vector, say h' if it is solving for p with Sp = h' where S = A11 >> inv(A00) A01 ? >> > > PETSc distributed vertices, not dofs, so it never breaks blocks. The P > distribution is the same as the entire problem divided by 4. > Thanks Matt. So if I create a new DMDA with same grid size but with dof=1 instead of 4, the vertices for this new DMDA will be identically distributed as for the original DMDA ? Or should I inform PETSc by calling a particular function to make these two DMDA have identical distribution of the vertices ? Even then I think there might be a problem due to the presence of "fictitious pressure vertices". The system matrix (A) contains an identity corresponding to these fictitious pressure nodes, thus when using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size that correspond to only non-fictitious P-nodes. 
So the preconditioner S_p for the Schur complement outer solve with Sp = h' will also need to correspond to only the non-fictitious P-nodes. This means its size does not directly correspond to the DMDA grid defined for the original problem. Could you please suggest an efficient way of assembling this S_p matrix ? > > Matt > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Aug 5 09:14:20 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 5 Aug 2013 09:14:20 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote: > > > > On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: > >> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >>> >>>> Bishesh Khanal writes: >>>> >>>> > Now, I implemented two different approaches, each for both 2D and 3D, >>>> in >>>> > MATLAB. It works for the smaller sizes but I have problems solving it >>>> for >>>> > the problem size I need (250^3 grid size). >>>> > I use staggered grid with p on cell centers, and components of v on >>>> cell >>>> > faces. Similar split up of K to cell center and faces to account for >>>> the >>>> > variable viscosity case) >>>> >>>> Okay, you're using a staggered-grid finite difference discretization of >>>> variable-viscosity Stokes. This is a common problem and I recommend >>>> starting with PCFieldSplit with Schur complement reduction (make that >>>> work first, then switch to block preconditioner). You can use PCLSC or >>>> (probably better for you), assemble a preconditioning matrix containing >>>> the inverse viscosity in the pressure-pressure block. This diagonal >>>> matrix is a spectrally equivalent (or nearly so, depending on >>>> discretization) approximation of the Schur complement. The velocity >>>> block can be solved with algebraic multigrid. Read the PCFieldSplit >>>> docs (follow papers as appropriate) and let us know if you get stuck. >>>> >>> >>> I was trying to assemble the inverse viscosity diagonal matrix to use as >>> the preconditioner for the Schur complement solve step as you suggested. >>> I've few questions about the ways to implement this in Petsc: >>> A naive approach that I can think of would be to create a vector with >>> its components as reciprocal viscosities of the cell centers corresponding >>> to the pressure variables, and then create a diagonal matrix from this >>> vector. However I'm not sure about: >>> How can I make this matrix, (say S_p) compatible to the Petsc >>> distribution of the different rows of the main system matrix over different >>> processors ? The main matrix was created using the DMDA structure with 4 >>> dof as explained before. >>> The main matrix correspond to the DMDA with 4 dofs but for the S_p >>> matrix would correspond to only pressure space. Should the distribution of >>> the rows of S_p among different processor not correspond to the >>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>> where S = A11 inv(A00) A01 ? >>> >> >> PETSc distributed vertices, not dofs, so it never breaks blocks. The P >> distribution is the same as the entire problem divided by 4. 
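In code, the split-by-fields advice below can be expressed through the PC interface. A hedged sketch with petsc-3.4-era calls (it relies on the block size 4 coming from the DMDA; note the dof indices of a 4-dof DMDA run 0-3, so the pressure field is 3; the split names and nullVec are illustrative):

  PC           pc;
  KSP         *subksp;
  PetscInt     nsplits;
  PetscInt     vFields[] = {0, 1, 2}, pFields[] = {3};
  Vec          nullVec;   /* built by the user: 1 on true pressure nodes,
                             0 on fictitious ones, then normalized       */
  MatNullSpace nsp;

  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);
  PCFieldSplitSetFields(pc, "v", 3, vFields, vFields);
  PCFieldSplitSetFields(pc, "p", 1, pFields, pFields);
  KSPSetUp(ksp);                                 /* splits exist only after setup        */
  PCFieldSplitGetSubKSP(pc, &nsplits, &subksp);  /* subksp[1] solves the pressure block  */
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_FALSE, 1, &nullVec, &nsp);
  KSPSetNullSpace(subksp[1], nsp);               /* petsc-3.4: null space set on the KSP */
  PetscFree(subksp);

Such a custom null-space vector sidesteps -fieldsplit_1_ksp_constant_null_space, which would be wrong here because of the identity rows on the fictitious pressure nodes.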
>> > > Thanks Matt. So if I create a new DMDA with same grid size but with dof=1 > instead of 4, the vertices for this new DMDA will be identically > distributed as for the original DMDA ? Or should I inform PETSc by calling > a particular function to make these two DMDA have identical distribution of > the vertices ? > Yes. > Even then I think there might be a problem due to the presence of > "fictitious pressure vertices". The system matrix (A) contains an identity > corresponding to these fictitious pressure nodes, thus when using a > -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size > that correspond to only non-fictitious P-nodes. So the preconditioner S_p > for the Schur complement outer solve with Sp = h' will also need to > correspond to only the non-fictitious P-nodes. This means its size does not > directly correspond to the DMDA grid defined for the original problem. > Could you please suggest an efficient way of assembling this S_p matrix ? > Don't use detect_saddle, but split it by fields -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 Matt > >> Matt >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From ztdepyahoo at 163.com Mon Aug 5 21:53:50 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Tue, 6 Aug 2013 10:53:50 +0800 (CST) Subject: [petsc-users] how to get the input value from the input command Message-ID: <7040174a.190d8.140518b7e11.Coremail.ztdepyahoo@163.com> i need to get the number of processor in the command line as the input of my function.for example, mpiexec -np 4 i need the " 4" as the input number of my function. how to get it. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Aug 5 22:03:00 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 5 Aug 2013 22:03:00 -0500 Subject: [petsc-users] how to get the input value from the input command In-Reply-To: <7040174a.190d8.140518b7e11.Coremail.ztdepyahoo@163.com> References: <7040174a.190d8.140518b7e11.Coremail.ztdepyahoo@163.com> Message-ID: <33A955AF-610F-41D9-B3CF-B1ECF82A2E5B@mcs.anl.gov> On Aug 5, 2013, at 9:53 PM, ??? wrote: > i need to get the number of processor in the command line as the input of my function.for example, > mpiexec -np 4 > i need the " 4" as the input number of my function. how to get it. MPI_Comm_rank(MPI_COMM_WORLD,&rank); > > > From bisheshkh at gmail.com Tue Aug 6 08:06:23 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Tue, 6 Aug 2013 15:06:23 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: > On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote: > >> >> >> >> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: >> >>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal wrote: >>> >>>> >>>> >>>> >>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >>>> >>>>> Bishesh Khanal writes: >>>>> >>>>> > Now, I implemented two different approaches, each for both 2D and >>>>> 3D, in >>>>> > MATLAB. 
It works for the smaller sizes but I have problems solving >>>>> it for >>>>> > the problem size I need (250^3 grid size). >>>>> > I use staggered grid with p on cell centers, and components of v on >>>>> cell >>>>> > faces. Similar split up of K to cell center and faces to account for >>>>> the >>>>> > variable viscosity case) >>>>> >>>>> Okay, you're using a staggered-grid finite difference discretization of >>>>> variable-viscosity Stokes. This is a common problem and I recommend >>>>> starting with PCFieldSplit with Schur complement reduction (make that >>>>> work first, then switch to block preconditioner). You can use PCLSC or >>>>> (probably better for you), assemble a preconditioning matrix containing >>>>> the inverse viscosity in the pressure-pressure block. This diagonal >>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>> discretization) approximation of the Schur complement. The velocity >>>>> block can be solved with algebraic multigrid. Read the PCFieldSplit >>>>> docs (follow papers as appropriate) and let us know if you get stuck. >>>>> >>>> >>>> I was trying to assemble the inverse viscosity diagonal matrix to use >>>> as the preconditioner for the Schur complement solve step as you suggested. >>>> I've few questions about the ways to implement this in Petsc: >>>> A naive approach that I can think of would be to create a vector with >>>> its components as reciprocal viscosities of the cell centers corresponding >>>> to the pressure variables, and then create a diagonal matrix from this >>>> vector. However I'm not sure about: >>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>> distribution of the different rows of the main system matrix over different >>>> processors ? The main matrix was created using the DMDA structure with 4 >>>> dof as explained before. >>>> The main matrix correspond to the DMDA with 4 dofs but for the S_p >>>> matrix would correspond to only pressure space. Should the distribution of >>>> the rows of S_p among different processor not correspond to the >>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>> where S = A11 inv(A00) A01 ? >>>> >>> >>> PETSc distributed vertices, not dofs, so it never breaks blocks. The P >>> distribution is the same as the entire problem divided by 4. >>> >> >> Thanks Matt. So if I create a new DMDA with same grid size but with dof=1 >> instead of 4, the vertices for this new DMDA will be identically >> distributed as for the original DMDA ? Or should I inform PETSc by calling >> a particular function to make these two DMDA have identical distribution of >> the vertices ? >> > > Yes. > > >> Even then I think there might be a problem due to the presence of >> "fictitious pressure vertices". The system matrix (A) contains an identity >> corresponding to these fictitious pressure nodes, thus when using a >> -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >> for the Schur complement outer solve with Sp = h' will also need to >> correspond to only the non-fictitious P-nodes. This means its size does not >> directly correspond to the DMDA grid defined for the original problem. >> Could you please suggest an efficient way of assembling this S_p matrix ? 
>> > > Don't use detect_saddle, but split it by fields -pc_fieldsplit_0_fields > 0,1,2 -pc_fieldsplit_1_fields 4 > How can I set this split in the code itself without giving it as a command line option when the system matrix is assembled from the DMDA for the whole system with 4 dofs. (i.e. *without* using the DMComposite or *without*using the nested block matrices to assemble different blocks separately and then combine them together). I need the split to get access to the fieldsplit_1_ksp in my code, because not using detect_saddle_point means I cannot use -fieldsplit_1_ksp_constant_null_space due to the presence of identity for the fictitious pressure nodes present in the fieldsplit_1_ block. I need to use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. > > Matt > > >> >>> Matt >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Aug 6 09:40:28 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 6 Aug 2013 09:40:28 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: > > > > On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: > >> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: >>> >>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >>>>> >>>>>> Bishesh Khanal writes: >>>>>> >>>>>> > Now, I implemented two different approaches, each for both 2D and >>>>>> 3D, in >>>>>> > MATLAB. It works for the smaller sizes but I have problems solving >>>>>> it for >>>>>> > the problem size I need (250^3 grid size). >>>>>> > I use staggered grid with p on cell centers, and components of v on >>>>>> cell >>>>>> > faces. Similar split up of K to cell center and faces to account >>>>>> for the >>>>>> > variable viscosity case) >>>>>> >>>>>> Okay, you're using a staggered-grid finite difference discretization >>>>>> of >>>>>> variable-viscosity Stokes. This is a common problem and I recommend >>>>>> starting with PCFieldSplit with Schur complement reduction (make that >>>>>> work first, then switch to block preconditioner). You can use PCLSC >>>>>> or >>>>>> (probably better for you), assemble a preconditioning matrix >>>>>> containing >>>>>> the inverse viscosity in the pressure-pressure block. This diagonal >>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>> discretization) approximation of the Schur complement. The velocity >>>>>> block can be solved with algebraic multigrid. Read the PCFieldSplit >>>>>> docs (follow papers as appropriate) and let us know if you get stuck. >>>>>> >>>>> >>>>> I was trying to assemble the inverse viscosity diagonal matrix to use >>>>> as the preconditioner for the Schur complement solve step as you suggested. 
>>>>> I've few questions about the ways to implement this in Petsc: >>>>> A naive approach that I can think of would be to create a vector with >>>>> its components as reciprocal viscosities of the cell centers corresponding >>>>> to the pressure variables, and then create a diagonal matrix from this >>>>> vector. However I'm not sure about: >>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>> distribution of the different rows of the main system matrix over different >>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>> dof as explained before. >>>>> The main matrix correspond to the DMDA with 4 dofs but for the S_p >>>>> matrix would correspond to only pressure space. Should the distribution of >>>>> the rows of S_p among different processor not correspond to the >>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>> where S = A11 inv(A00) A01 ? >>>>> >>>> >>>> PETSc distributed vertices, not dofs, so it never breaks blocks. The P >>>> distribution is the same as the entire problem divided by 4. >>>> >>> >>> Thanks Matt. So if I create a new DMDA with same grid size but with >>> dof=1 instead of 4, the vertices for this new DMDA will be identically >>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>> a particular function to make these two DMDA have identical distribution of >>> the vertices ? >>> >> >> Yes. >> >> >>> Even then I think there might be a problem due to the presence of >>> "fictitious pressure vertices". The system matrix (A) contains an identity >>> corresponding to these fictitious pressure nodes, thus when using a >>> -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>> for the Schur complement outer solve with Sp = h' will also need to >>> correspond to only the non-fictitious P-nodes. This means its size does not >>> directly correspond to the DMDA grid defined for the original problem. >>> Could you please suggest an efficient way of assembling this S_p matrix ? >>> >> >> Don't use detect_saddle, but split it by fields -pc_fieldsplit_0_fields >> 0,1,2 -pc_fieldsplit_1_fields 4 >> > > How can I set this split in the code itself without giving it as a command > line option when the system matrix is assembled from the DMDA for the whole > system with 4 dofs. (i.e. *without* using the DMComposite or *without*using the nested block matrices to assemble different blocks separately and > then combine them together). > I need the split to get access to the fieldsplit_1_ksp in my code, because > not using detect_saddle_point means I cannot use > -fieldsplit_1_ksp_constant_null_space due to the presence of identity for > the fictitious pressure nodes present in the fieldsplit_1_ block. I need to > use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. > This is currently a real problem with the DMDA. 
In the unstructured case, where we always need specialized spaces, you can use something like

  PetscObject  pressure;
  MatNullSpace nullSpacePres;

  ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr);
  ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr);
  ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) nullSpacePres);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr);

and then DMGetSubDM() uses this information to attach the null space to the IS that is created using the information in the PetscSection.
If you use a PetscSection to set the data layout over the DMDA, I think this works correctly, but this has not been tested at all and is very
new code. Eventually, I think we want all DMs to use this mechanism, but we are still working it out.

Bottom line: For custom null spaces using the default layout in DMDA, you need to take apart the PCFIELDSPLIT after it has been set up,
which is somewhat subtle. You need to call KSPSetUp() and then reach in and get the PC, and the subKSPs. I don't like this at all, but we
have not reorganized that code (which could be very simple and inflexible since it's very structured).

   Matt

> >> Matt >> >> -- >> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead. >> -- Norbert Wiener >> > > --
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead. -- Norbert Wiener
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From bisheshkh at gmail.com  Tue Aug  6 10:59:10 2013
From: bisheshkh at gmail.com (Bishesh Khanal)
Date: Tue, 6 Aug 2013 17:59:10 +0200
Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid
In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov>
Message-ID:

On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: > On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: >> >> >> >> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: >>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote: >>>> >>>> >>>> >>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: >>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal wrote: >>>>>> >>>>>> >>>>>> >>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >>>>>>> Bishesh Khanal writes: >>>>>>> >>>>>>> > Now, I implemented two different approaches, each for both 2D and >>>>>>> 3D, in >>>>>>> > MATLAB. It works for the smaller sizes but I have problems solving >>>>>>> it for >>>>>>> > the problem size I need (250^3 grid size). >>>>>>> > I use staggered grid with p on cell centers, and components of v >>>>>>> on cell >>>>>>> > faces. Similar split up of K to cell center and faces to account >>>>>>> for the >>>>>>> > variable viscosity case) >>>>>>> >>>>>>> Okay, you're using a staggered-grid finite difference discretization >>>>>>> of >>>>>>> variable-viscosity Stokes. This is a common problem and I recommend >>>>>>> starting with PCFieldSplit with Schur complement reduction (make that >>>>>>> work first, then switch to block preconditioner).
You can use PCLSC >>>>>>> or >>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>> containing >>>>>>> the inverse viscosity in the pressure-pressure block. This diagonal >>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>> discretization) approximation of the Schur complement. The velocity >>>>>>> block can be solved with algebraic multigrid. Read the PCFieldSplit >>>>>>> docs (follow papers as appropriate) and let us know if you get stuck. >>>>>>> >>>>>> >>>>>> I was trying to assemble the inverse viscosity diagonal matrix to use >>>>>> as the preconditioner for the Schur complement solve step as you suggested. >>>>>> I've few questions about the ways to implement this in Petsc: >>>>>> A naive approach that I can think of would be to create a vector with >>>>>> its components as reciprocal viscosities of the cell centers corresponding >>>>>> to the pressure variables, and then create a diagonal matrix from this >>>>>> vector. However I'm not sure about: >>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>> distribution of the different rows of the main system matrix over different >>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>> dof as explained before. >>>>>> The main matrix correspond to the DMDA with 4 dofs but for the S_p >>>>>> matrix would correspond to only pressure space. Should the distribution of >>>>>> the rows of S_p among different processor not correspond to the >>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>> where S = A11 inv(A00) A01 ? >>>>>> >>>>> >>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. The P >>>>> distribution is the same as the entire problem divided by 4. >>>>> >>>> >>>> Thanks Matt. So if I create a new DMDA with same grid size but with >>>> dof=1 instead of 4, the vertices for this new DMDA will be identically >>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>> a particular function to make these two DMDA have identical distribution of >>>> the vertices ? >>>> >>> >>> Yes. >>> >>> >>>> Even then I think there might be a problem due to the presence of >>>> "fictitious pressure vertices". The system matrix (A) contains an identity >>>> corresponding to these fictitious pressure nodes, thus when using a >>>> -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>> for the Schur complement outer solve with Sp = h' will also need to >>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>> directly correspond to the DMDA grid defined for the original problem. >>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>> >>> >>> Don't use detect_saddle, but split it by fields -pc_fieldsplit_0_fields >>> 0,1,2 -pc_fieldsplit_1_fields 4 >>> >> >> How can I set this split in the code itself without giving it as a >> command line option when the system matrix is assembled from the DMDA for >> the whole system with 4 dofs. (i.e. *without* using the DMComposite or * >> without* using the nested block matrices to assemble different blocks >> separately and then combine them together). 
>> I need the split to get access to the fieldsplit_1_ksp in my code, >> because not using detect_saddle_point means I cannot use >> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >> > > This is currently a real problem with the DMDA. In the unstructured case, > where we always need specialized spaces, you can > use something like > > PetscObject pressure; > MatNullSpace nullSpacePres; > > ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); > ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, 0, > NULL, &nullSpacePres);CHKERRQ(ierr); > ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) > nullSpacePres);CHKERRQ(ierr); > ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); > > and then DMGetSubDM() uses this information to attach the null space to > the IS that is created using the information in the PetscSection. > If you use a PetscSection to set the data layout over the DMDA, I think > this works correctly, but this has not been tested at all and is very > new code. Eventually, I think we want all DMs to use this mechanism, but > we are still working it out. >

Currently I do not use PetscSection. If this makes a cleaner approach, I'd try it too, but maybe a bit later (right now I'd like to test my model with a quick fix even if it means a little dirty code!)

> > Bottom line: For custom null spaces using the default layout in DMDA, you > need to take apart the PCFIELDSPLIT after it has been set up, > which is somewhat subtle. You need to call KSPSetUp() and then reach in > and get the PC, and the subKSPs. I don't like this at all, but we > have not reorganized that code (which could be very simple and inflexible > since it's very structured). >

So I tried to get this approach working but I could not succeed and encountered some errors. Here is a code snippet:

//mDa is the DMDA that describes the whole grid with all 4 dofs (3 velocity components and 1 pressure comp.)
ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr);
ierr = DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr);
ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr);
ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr);
//I have the mNullSpaceSystem based on mDa; it contains a null space basis for the complete system.
ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr);
//I expect this to register the options I give: -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2
//-pc_fieldsplit_1_fields 3
ierr = KSPSetUp(mKsp);CHKERRQ(ierr);
ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); //Now get the PC that was obtained from the options (fieldsplit)
ierr = PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr);
//I have created the matrix mPcForSc using a DMDA with identical
//size to mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure).
ierr = PCSetUp(mPcOuter);CHKERRQ(ierr);

KSP *kspSchur;
PetscInt kspSchurPos = 1;
ierr = PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr);
ierr = KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr);
//The null space is the one that corresponds to only pressure nodes, created using the mDaPressure.
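//Note on the PCFieldSplitGetSubKSP() call above: its second argument is an output
//(the number of splits found), not an input index, so kspSchurPos is overwritten by
//the call. The third argument returns a newly allocated array holding all sub-KSPs;
//kspSchur[1] is therefore the second split (the Schur-complement / pressure solve),
//and the array itself must be freed with PetscFree() once the sub-KSP is configured,
//as done below.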
ierr = PetscFree(kspSchur);CHKERRQ(ierr); ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); The errors I get when running with options: -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: No support for this operation for this object type! [0]PETSC ERROR: Support only implemented for 2d! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named edwards by bkhanal Tue Aug 6 17:35:30 2013 [0]PETSC ERROR: Libraries linked from /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 -with-clanguage=cxx --download-hypre=1 [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c [0]PETSC ERROR: DMCreateSubDM() line 1267 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c [0]PETSC ERROR: PCSetUp() line 890 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSPSetUp() line 278 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: solveModel() line 181 in "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx WARNING! There are options you set that were not used! WARNING! could be spelling mistake, etc! Option left: name:-pc_fieldsplit_1_fields value: 3 > > Matt > > >> >>> Matt >>> >>> >>>> >>>>> Matt >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhenglun.wei at gmail.com Tue Aug 6 14:22:43 2013 From: zhenglun.wei at gmail.com (Alan) Date: Tue, 06 Aug 2013 14:22:43 -0500 Subject: [petsc-users] KSP solver for single process Message-ID: <52014D03.6000302@gmail.com> Dear all, I hope you're having a nice day. I have a quick question on solving Poisson equation with KSP solvers (/src/ksp/ksp/example/tutorial/ex29.c). 
Currently, I run this solver with:
-pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7
It performs very well in parallel computation and scalability is fine. However, if I run it with a single process, the KSP solver is much slower than direct ones, e.g. Mudpack. Briefly, the speed difference between the KSP solver and the direct solver is negligible for small problems (e.g. 36k DoFs) but becomes very large for moderately large problems (e.g. 180k DoFs). Although the direct solver inherently has better performance for moderately large problems on a single process, I wonder if any setup or approach can improve the performance of this KSP Poisson solver on a single process, or even make it competitive in speed (a little bit slower is fine) with direct solvers.

thanks in advance,
Alan

From rupp at mcs.anl.gov  Tue Aug  6 14:31:41 2013
From: rupp at mcs.anl.gov (Karl Rupp)
Date: Tue, 06 Aug 2013 14:31:41 -0500
Subject: [petsc-users] KSP solver for single process
In-Reply-To: <52014D03.6000302@gmail.com>
References: <52014D03.6000302@gmail.com>
Message-ID: <52014F1D.90400@mcs.anl.gov>

Hi Alan,

please use -log_summary to get profiling information on the run. What is the bottleneck? Is it the number of solver iterations increasing significantly? If so, consider changing the preconditioner options (more levels!). I don't expect a direct solver to be any faster in the 180k case for a Poisson problem.

Best regards,
Karli

On 08/06/2013 02:22 PM, Alan wrote:
> Dear all,
> I hope you're having a nice day.
> I have a quick question on solving Poisson equation with KSP solvers
> (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this solver with:
> -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it
> 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7
> It performs very well in parallel computation and scalability is fine.
> However, if I run it with a single process, the KSP solver is much
> slower than direct ones, e.g. Mudpack. Briefly, the speed difference
> between the KSP solver and the direct solver is negligible for small
> problems (e.g. 36k DoFs) but becomes very large for moderately large
> problems (e.g. 180k DoFs). Although the direct solver inherently
> has better performance for moderately large problems on a single
> process, I wonder if any setup or approach can improve the performance
> of this KSP Poisson solver on a single process, or even make it
> competitive in speed (a little bit slower is fine) with direct
> solvers.
>
> thanks in advance,
> Alan
>

From knepley at gmail.com  Tue Aug  6 14:36:34 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 6 Aug 2013 14:36:34 -0500
Subject: [petsc-users] KSP solver for single process
In-Reply-To: <52014F1D.90400@mcs.anl.gov>
References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov>
Message-ID:

On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp wrote: > Hi Alan, > > please use -log_summary to get profiling information on the run. What is > the bottleneck? Is it the number of solver iterations increasing > significantly? If so, consider changing the preconditioner options (more > levels!). I don't expect a direct solver to be any faster in the 180k > case for a Poisson problem. >

Mudpack is geometric multigrid: http://www2.cisl.ucar.edu/resources/legacy/mudpack
This should be faster.
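(As a reference point: since ex29 attaches its DMDA to the solver with KSPSetDM(), PETSc's own geometric multigrid can also be tried purely from the command line. A minimal sketch, assuming the DMDA grid sizes admit the requested number of coarsenings; the smoother options simply mirror the GAMG run above and are not taken from this thread:

-ksp_type cg -pc_type mg -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -ksp_rtol 1.0e-7

With a DM attached, PCMG builds its coarse grids by coarsening the DMDA and rediscretizes the operator on each level through the compute-operators callback.)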
   Matt

> Best regards, > Karli > > > On 08/06/2013 02:22 PM, Alan wrote:
> > Dear all,
> > I hope you're having a nice day.
> > I have a quick question on solving Poisson equation with KSP solvers
> > (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this solver
> with:
> > -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it
> > 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7
> > It performs very well in parallel computation and scalability is fine.
> > However, if I run it with a single process, the KSP solver is much
> > slower than direct ones, e.g. Mudpack. Briefly, the speed difference
> > between the KSP solver and the direct solver is negligible for small
> > problems (e.g. 36k DoFs) but becomes very large for moderately large
> > problems (e.g. 180k DoFs). Although the direct solver inherently
> > has better performance for moderately large problems on a single
> > process, I wonder if any setup or approach can improve the performance
> > of this KSP Poisson solver on a single process, or even make it
> > competitive in speed (a little bit slower is fine) with direct
> > solvers.
> >
> > thanks in advance,
> > Alan
> >

-- What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead. -- Norbert Wiener
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From ling.zou at inl.gov  Tue Aug  6 14:38:48 2013
From: ling.zou at inl.gov (Zou (Non-US), Ling)
Date: Tue, 6 Aug 2013 13:38:48 -0600
Subject: [petsc-users] understanding the eigenvalues of my system
Message-ID:

Dear All, I just explored the user manual v3.4 and noticed the option -ksp_compute_eigenvalues. I tested it on my problem with ~1000 dofs, and it printed out

Iteratively computed eigenvalues
0.840692 + 0i
0.857247 - 0.235747i
0.857247 + 0.235747i
0.999993 + 0i
1.03457 + 0i
2.69763 + 0i

I noticed there are complex numbers... I wonder what the printout indicates: a good system, a bad system, or nothing special?

Best,

Ling
-------------- next part -------------- An HTML attachment was scrubbed... URL:

From zhenglun.wei at gmail.com  Tue Aug  6 14:56:35 2013
From: zhenglun.wei at gmail.com (Alan)
Date: Tue, 06 Aug 2013 14:56:35 -0500
Subject: [petsc-users] KSP solver for single process
In-Reply-To: References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov>
Message-ID: <520154F3.2010203@gmail.com>

Thanks for the replies. Here I attached the log_summary for the large and small problems. The number of DoFs for the large problem is 4 times that of the small problem. A few observations are listed here:
1, the total number of iterations does not change much from the small problem to the large one;
2, the time elapsed for KSPSolve() for the large problem is less than 4 times that for the small problem;
3, the time elapsed for PCSetUp() for the large problem is more than 10 times that for the small problem;
4, the time elapsed for PCGAMGProl_AGG for the large problem is more than 20 times that for the small problem;
In my code, I solve the Poisson equation twice with different RHS vectors; the observations above are consistent for both solves. Do these observations indicate that I should switch my PC from GAMG to MG for solving the Poisson equation on a single process?

best,
Alan

> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp > wrote: > > Hi Alan, > > please use -log_summary to get profiling information on the run.
> What is > the bottleneck? Is it the number of solver iterations increasing > significantly? If so, consider changing the preconditioner options > (more > levels!). I don't expect a direct solver to be any faster in the 180k > case for a Poisson problem. > > > Mudpack is geometric multigrid: > http://www2.cisl.ucar.edu/resources/legacy/mudpack > This should be faster. > > Matt > > Best regards, > Karli > > > On 08/06/2013 02:22 PM, Alan wrote: > > Dear all, > > I hope you're having a nice day. > > I have a quick question on solving Poisson equation with KSP solvers > > (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this > solver with: > > -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 > -mg_levels_ksp_max_it > > 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7 > > It performs very well in parallel computation and scalability is > fine. > > However, if I run it with a single process, the KSP solver is much > > slower than direct ones, i.e. Mudpack. Briefly, the speed difference > > between the KSP solver and the direct solver is negligible on > dealing > > with small problems (i.e.36k DoFs ) but becomes very huge for > moderate > > large problems (i.e. 180k DoFs). Although the direct solver > inherently > > has better performance for moderate large problems in the single > > process, I wonder if any setup or approach can improve the > performance > > of this KSP Poisson solver with the single process? or even make it > > obtain competitive speed (a little bit slower is fine) against > direct > > solvers. > > > > thanks in advance, > > Alan > > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- node = 0 mx = 600 my = 240 mm = 1 nn = 1 node = 0 xs = 0 ys = 0 xw = 600 yw = 240 rank = 0: left BC = 1, rightBC = 1, bottom BC = 1, top BC = 1 rank = 0: xc = 599, yc = 240, xw = 600, yw = 240 rank 0 Cylinder: neighbor = left: 128 right: 152 bottom: 48 top: 72 Current Computation Time Step = 0, Total Computation Time step number = 1 Heave_dH = 0.001841, PhysVel.y = 0.169646 Rank = 0, Average velocity on IB points: U-ib = 1.000000, V-ib = 0.000000 0 KSP Residual norm 1.766720535653e+01 1 KSP Residual norm 8.320548736317e+00 2 KSP Residual norm 3.497137130771e+00 3 KSP Residual norm 1.003445717739e+00 4 KSP Residual norm 3.869439823358e-01 5 KSP Residual norm 2.475103239062e-01 6 KSP Residual norm 2.281611491375e-01 7 KSP Residual norm 1.084138785055e-01 8 KSP Residual norm 5.141423441920e-02 9 KSP Residual norm 3.347748825553e-02 10 KSP Residual norm 2.628353299859e-02 11 KSP Residual norm 2.096532648662e-02 12 KSP Residual norm 8.618284456392e-03 13 KSP Residual norm 5.565127181073e-03 14 KSP Residual norm 6.314337218164e-03 15 KSP Residual norm 3.131142787500e-03 16 KSP Residual norm 3.068804030745e-03 17 KSP Residual norm 2.349857536588e-03 18 KSP Residual norm 8.503110026710e-04 19 KSP Residual norm 7.687867061945e-04 20 KSP Residual norm 4.742409404804e-04 21 KSP Residual norm 5.672769845689e-04 22 KSP Residual norm 4.808829820485e-04 23 KSP Residual norm 2.857419449644e-04 24 KSP Residual norm 1.438427631790e-04 25 KSP Residual norm 4.860115232885e-05 26 KSP Residual norm 3.225934842340e-05 27 KSP Residual norm 1.397147991245e-05 28 KSP Residual norm 1.260818970574e-05 29 KSP Residual norm 1.047031075597e-05 30 KSP Residual norm 5.920108684271e-06 31 KSP Residual norm 3.051540274780e-06 32 KSP Residual norm 1.496508480442e-06 Pressure Check Iteration = 1, Error = 8.812422e-09, Max Pressure = 1.377380 @ (134,61) Pressure Corrector RHS Calculated!!!!! Pressure Corrector RHS Calculated!!!!! 0 KSP Residual norm 1.055813500392e+01 1 KSP Residual norm 5.111549490211e+00 2 KSP Residual norm 2.431687757980e+00 3 KSP Residual norm 7.380477067726e-01 4 KSP Residual norm 2.649467853279e-01 5 KSP Residual norm 1.581771354806e-01 6 KSP Residual norm 1.813719861751e-01 7 KSP Residual norm 8.239048633455e-02 8 KSP Residual norm 3.243134600574e-02 9 KSP Residual norm 2.196187959685e-02 10 KSP Residual norm 1.990772184469e-02 11 KSP Residual norm 1.408691713651e-02 12 KSP Residual norm 5.553521214754e-03 13 KSP Residual norm 3.627141705183e-03 14 KSP Residual norm 4.194181361457e-03 15 KSP Residual norm 2.317138149101e-03 16 KSP Residual norm 2.352605973283e-03 17 KSP Residual norm 1.884396223781e-03 18 KSP Residual norm 8.511003652931e-04 19 KSP Residual norm 5.716104892684e-04 20 KSP Residual norm 3.455582593757e-04 21 KSP Residual norm 3.943724766808e-04 22 KSP Residual norm 3.558747195132e-04 23 KSP Residual norm 2.440517227429e-04 24 KSP Residual norm 1.220784864643e-04 25 KSP Residual norm 4.312636138662e-05 26 KSP Residual norm 2.491805473018e-05 27 KSP Residual norm 1.092946022663e-05 28 KSP Residual norm 9.586541934346e-06 29 KSP Residual norm 8.338858229099e-06 30 KSP Residual norm 5.256635417170e-06 31 KSP Residual norm 2.650948822714e-06 32 KSP Residual norm 1.273360000962e-06 33 KSP Residual norm 1.159289546119e-06 34 KSP Residual norm 6.729897324730e-07 Rank#0, Max dp = 7.959984e-01 @ (134, 66) Rank#0, time step = 0, continuity = 2.829245e-07 @ (134, 90) Rank = 0, Computation for time step 0 is done!! 
0 time steps left Rank = 0, W time = 122.641372 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ex29 on a arch-linux2-c-debug named l2118a-linux.soecs.ku.edu with 1 processor, by zlwei Tue Aug 6 14:45:13 2013 Using Petsc Development GIT revision: 7a0108da53bbe8dff949efa7a5ab1303a7fb1560 GIT Date: 2013-06-20 10:11:56 +0200 Max Max/Min Avg Total Time (sec): 1.238e+02 1.00000 1.238e+02 Objects: 4.010e+02 1.00000 4.010e+02 Flops: 8.165e+08 1.00000 8.165e+08 8.165e+08 Flops/sec: 6.596e+06 1.00000 6.596e+06 6.596e+06 Memory: 8.291e+07 1.00000 8.291e+07 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 3.629e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 1.6329e+00 1.3% 0.0000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+00 0.2% 1: DMMG Setup: 9.4195e-02 0.1% 0.0000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+01 1.7% 2: Pressure RHS Setup: 6.3622e-02 0.1% 0.0000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+01 1.7% 3: Pressure Solve: 6.0762e+01 49.1% 3.9881e+08 48.8% 0.000e+00 0.0% 0.000e+00 0.0% 1.695e+03 46.7% 4: Corrector RHS Setup: 4.9592e-02 0.0% 0.0000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+01 1.7% 5: Corrector Solve: 6.1190e+01 49.4% 4.1768e+08 51.2% 0.000e+00 0.0% 0.000e+00 0.0% 1.747e+03 48.1% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). %T - percent time in this phase %f - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ ########################################################## # # # WARNING!!! # # # # This code was compiled with a debugging option, # # To get timing results run ./configure # # using --with-debugging=no, the performance will # # be generally two or three times faster. 
# # # ########################################################## Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %f %M %L %R %T %f %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecSet 1 1.0 4.1831e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 1 1.0 2.2550e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 --- Event Stage 1: DMMG Setup ThreadCommRunKer 1 1.0 7.1526e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ThreadCommBarrie 1 1.0 3.8147e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 1 1.0 4.5800e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 5 0 0 0 0 0 --- Event Stage 2: Pressure RHS Setup VecSet 1 1.0 4.2260e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 7 0 0 0 0 0 --- Event Stage 3: Pressure Solve KSPGMRESOrthog 30 1.0 6.1212e-02 1.0 3.64e+07 1.0 0.0e+00 0.0e+00 1.6e+02 0 4 0 0 5 0 9 0 0 10 595 KSPSetUp 10 1.0 1.4705e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.3e+02 0 0 0 0 4 0 0 0 0 8 0 KSPSolve 1 1.0 4.4591e+00 1.0 3.08e+08 1.0 0.0e+00 0.0e+00 8.8e+02 4 38 0 0 24 7 77 0 0 52 69 VecMDot 30 1.0 2.5030e-02 1.0 1.82e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 727 VecTDot 64 1.0 4.8102e-02 1.0 1.84e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 383 VecNorm 66 1.0 2.3875e-02 1.0 1.31e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 551 VecScale 33 1.0 3.5070e-02 1.0 1.82e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 52 VecCopy 104 1.0 1.2174e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 414 1.0 4.0807e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 265 1.0 1.7370e-01 1.0 4.06e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 5 0 0 0 0 10 0 0 0 234 VecAYPX 229 1.0 1.0965e-01 1.0 1.99e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 181 VecMAXPY 33 1.0 4.1362e-02 1.0 2.15e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 5 0 0 0 520 VecAssemblyBegin 5 1.0 2.2888e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 5 1.0 2.0027e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 231 1.0 9.4288e-02 1.0 1.27e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 135 VecSetRandom 3 1.0 2.9329e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 33 1.0 4.2642e-02 1.0 5.46e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 128 MatMult 260 1.0 9.8725e-01 1.0 1.74e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 21 0 0 0 2 44 0 0 0 176 MatMultAdd 99 1.0 2.1511e-01 1.0 2.64e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 7 0 0 0 123 MatMultTranspose 99 1.0 2.3953e-01 1.0 2.64e+07 1.0 0.0e+00 0.0e+00 9.9e+01 0 3 0 0 3 0 7 0 0 6 110 MatSolve 33 1.0 1.2226e-03 1.0 7.49e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 61 MatLUFactorSym 1 1.0 5.2905e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 1 1.0 2.8586e-04 1.0 1.42e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 50 MatConvert 3 1.0 2.4556e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 9.0e+00 0 0 0 0 0 0 0 0 0 1 0 MatScale 9 1.0 2.1022e-02 1.0 2.71e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 129 MatAssemblyBegin 29 1.0 1.4186e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 29 1.0 1.9785e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 
MatGetRow 496479 1.0 2.0033e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 3 0 0 0 0 0 MatGetRowIJ 1 1.0 3.5048e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 4.7588e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 3 1.0 1.1538e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 1 0 MatAXPY 3 1.0 8.1131e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 3 1.0 2.1595e-01 1.0 2.31e+06 1.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 1 0 0 1 11 MatMatMultSym 3 1.0 1.7298e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 1 0 MatMatMultNum 3 1.0 4.2782e-02 1.0 2.31e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 54 MatPtAP 3 1.0 3.9002e-01 1.0 8.24e+06 1.0 0.0e+00 0.0e+00 1.8e+01 0 1 0 0 0 1 2 0 0 1 21 MatPtAPSymbolic 3 1.0 1.2223e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 1 0 MatPtAPNumeric 3 1.0 2.6774e-01 1.0 8.24e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 2 0 0 0 31 MatTrnMatMult 3 1.0 8.6899e-01 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 3.6e+01 1 2 0 0 1 1 3 0 0 2 14 MatTrnMatMultSym 3 1.0 6.8294e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 1 0 0 0 1 1 0 0 0 2 0 MatTrnMatMultNum 3 1.0 1.8598e-01 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 67 MatGetSymTrans 6 1.0 2.9898e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCGAMGgraph_AGG 3 1.0 7.7426e+00 1.0 1.91e+06 1.0 0.0e+00 0.0e+00 4.5e+01 6 0 0 0 1 13 0 0 0 3 0 PCGAMGcoarse_AGG 3 1.0 1.1194e+00 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 5.1e+01 1 2 0 0 1 2 3 0 0 3 11 PCGAMGProl_AGG 3 1.0 4.1053e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 33 0 0 0 0 68 0 0 0 1 0 PCGAMGPOpt_AGG 3 1.0 2.9895e+00 1.0 6.79e+07 1.0 0.0e+00 0.0e+00 4.9e+02 2 8 0 0 13 5 17 0 0 29 23 PCSetUp 2 1.0 5.3339e+01 1.0 9.05e+07 1.0 0.0e+00 0.0e+00 8.0e+02 43 11 0 0 22 88 23 0 0 47 2 PCSetUpOnBlocks 33 1.0 2.3496e-03 1.0 1.42e+04 1.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 0 0 0 0 0 1 6 PCApply 33 1.0 3.3544e+00 1.0 2.12e+08 1.0 0.0e+00 0.0e+00 7.8e+02 3 26 0 0 22 6 53 0 0 46 63 --- Event Stage 4: Corrector RHS Setup VecSet 1 1.0 3.9101e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 1 0 0 0 0 0 --- Event Stage 5: Corrector Solve KSPGMRESOrthog 30 1.0 6.1215e-02 1.0 3.64e+07 1.0 0.0e+00 0.0e+00 1.6e+02 0 4 0 0 5 0 9 0 0 9 595 KSPSetUp 10 1.0 4.3111e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.3e+02 0 0 0 0 4 0 0 0 0 7 0 KSPSolve 1 1.0 4.7401e+00 1.0 3.27e+08 1.0 0.0e+00 0.0e+00 9.3e+02 4 40 0 0 26 8 78 0 0 53 69 VecMDot 30 1.0 2.5080e-02 1.0 1.82e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 4 0 0 0 726 VecTDot 68 1.0 5.2035e-02 1.0 1.96e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 376 VecNorm 68 1.0 2.4876e-02 1.0 1.37e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 552 VecScale 33 1.0 3.5061e-02 1.0 1.82e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 52 VecCopy 110 1.0 1.2474e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 432 1.0 2.8626e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 281 1.0 1.8523e-01 1.0 4.31e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 5 0 0 0 0 10 0 0 0 233 VecAYPX 243 1.0 1.1632e-01 1.0 2.11e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 5 0 0 0 181 VecMAXPY 33 1.0 4.1347e-02 1.0 2.15e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 5 0 0 0 520 VecAssemblyBegin 5 1.0 2.2888e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 5 1.0 1.9073e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 
VecPointwiseMult 243 1.0 9.9632e-02 1.0 1.34e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 135 VecSetRandom 3 1.0 2.9260e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 33 1.0 4.2624e-02 1.0 5.46e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 128 MatMult 274 1.0 1.0408e+00 1.0 1.83e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 22 0 0 0 2 44 0 0 0 176 MatMultAdd 105 1.0 2.2814e-01 1.0 2.80e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 7 0 0 0 123 MatMultTranspose 105 1.0 2.5417e-01 1.0 2.80e+07 1.0 0.0e+00 0.0e+00 1.0e+02 0 3 0 0 3 0 7 0 0 6 110 MatSolve 35 1.0 1.2560e-03 1.0 7.95e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 63 MatLUFactorSym 1 1.0 2.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 1 1.0 2.6894e-04 1.0 1.42e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 53 MatConvert 3 1.0 6.8882e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 9.0e+00 0 0 0 0 0 0 0 0 0 1 0 MatScale 9 1.0 2.1209e-02 1.0 2.71e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 128 MatAssemblyBegin 29 1.0 1.4400e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 29 1.0 1.9717e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRow 496479 1.0 2.0019e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 2 0 0 0 0 3 0 0 0 0 0 MatGetRowIJ 1 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 2.5821e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 3 1.0 1.0246e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 1 0 MatAXPY 3 1.0 8.0140e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 3 1.0 2.1582e-01 1.0 2.31e+06 1.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 1 0 0 1 11 MatMatMultSym 3 1.0 1.7282e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 1 0 MatMatMultNum 3 1.0 4.2815e-02 1.0 2.31e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 54 MatPtAP 3 1.0 3.8948e-01 1.0 8.24e+06 1.0 0.0e+00 0.0e+00 1.8e+01 0 1 0 0 0 1 2 0 0 1 21 MatPtAPSymbolic 3 1.0 1.2175e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 1 0 MatPtAPNumeric 3 1.0 2.6768e-01 1.0 8.24e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 2 0 0 0 31 MatTrnMatMult 3 1.0 8.6146e-01 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 3.6e+01 1 2 0 0 1 1 3 0 0 2 14 MatTrnMatMultSym 3 1.0 6.7687e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+01 1 0 0 0 1 1 0 0 0 2 0 MatTrnMatMultNum 3 1.0 1.8455e-01 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 68 MatGetSymTrans 6 1.0 2.7702e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCGAMGgraph_AGG 3 1.0 7.7286e+00 1.0 1.91e+06 1.0 0.0e+00 0.0e+00 4.5e+01 6 0 0 0 1 13 0 0 0 3 0 PCGAMGcoarse_AGG 3 1.0 1.0950e+00 1.0 1.25e+07 1.0 0.0e+00 0.0e+00 5.1e+01 1 2 0 0 1 2 3 0 0 3 11 PCGAMGProl_AGG 3 1.0 4.1285e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+01 33 0 0 0 0 67 0 0 0 1 0 PCGAMGPOpt_AGG 3 1.0 2.9775e+00 1.0 6.79e+07 1.0 0.0e+00 0.0e+00 4.9e+02 2 8 0 0 13 5 16 0 0 28 23 PCSetUp 2 1.0 5.3523e+01 1.0 9.05e+07 1.0 0.0e+00 0.0e+00 8.0e+02 43 11 0 0 22 87 22 0 0 46 2 PCSetUpOnBlocks 35 1.0 1.3256e-03 1.0 1.42e+04 1.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 0 0 0 0 0 1 11 PCApply 35 1.0 3.5666e+00 1.0 2.24e+08 1.0 0.0e+00 0.0e+00 8.3e+02 3 27 0 0 23 6 54 0 0 48 63 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. 
Reports information only for process 0. --- Event Stage 0: Main Stage Krylov Solver 0 13 50296 0 DMKSP interface 0 2 1296 0 Vector 1 56 34403968 0 Vector Scatter 0 12 7632 0 Matrix 0 22 32698344 0 Distributed Mesh 0 6 8669184 0 Bipartite Graph 0 12 9504 0 Index Set 0 6 5440 0 IS L to G Mapping 0 9 5189220 0 Preconditioner 0 12 12360 0 --- Event Stage 1: DMMG Setup Krylov Solver 1 0 0 0 Vector 5 4 6048 0 Vector Scatter 4 0 0 0 Distributed Mesh 2 0 0 0 Bipartite Graph 4 0 0 0 Index Set 10 10 1159584 0 IS L to G Mapping 3 0 0 0 --- Event Stage 2: Pressure RHS Setup Krylov Solver 1 0 0 0 DMKSP interface 1 0 0 0 Vector 5 4 6048 0 Vector Scatter 4 0 0 0 Distributed Mesh 3 1 4376 0 Bipartite Graph 6 2 1584 0 Index Set 10 10 1159584 0 IS L to G Mapping 3 0 0 0 Preconditioner 1 0 0 0 Viewer 1 0 0 0 --- Event Stage 3: Pressure Solve Krylov Solver 8 3 90576 0 Vector 100 74 28259312 0 Matrix 23 12 29559028 0 Matrix Coarsen 3 3 1884 0 Index Set 6 3 2280 0 Preconditioner 8 3 3024 0 PetscRandom 3 3 1872 0 --- Event Stage 4: Corrector RHS Setup Krylov Solver 1 0 0 0 DMKSP interface 1 0 0 0 Vector 5 4 6048 0 Vector Scatter 4 0 0 0 Distributed Mesh 3 1 4376 0 Bipartite Graph 6 2 1584 0 Index Set 10 10 1159584 0 IS L to G Mapping 3 0 0 0 Preconditioner 1 0 0 0 --- Event Stage 5: Corrector Solve Krylov Solver 8 3 90576 0 Vector 100 74 28259312 0 Matrix 23 12 29559028 0 Matrix Coarsen 3 3 1884 0 Index Set 6 3 2280 0 Preconditioner 8 3 3024 0 PetscRandom 3 3 1872 0 ======================================================================================================================== Average time to get PetscTime(): 1.90735e-07 #PETSc Option Table entries: -ksp_monitor -ksp_rtol 1.0e-7 -ksp_type cg -log_summary -mg_levels_ksp_max_it 1 -mg_levels_ksp_type richardson -pc_gamg_agg_nsmooths 1 -pc_type gamg #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure run at: Mon Jun 24 19:50:10 2013 Configure options: --download-f-blas-lapack --download-hypre --download-mpich --with-cc=gcc --with-fc=gfortran PETSC_ARCH=arch-linux2-c-debug ----------------------------------------- Libraries compiled on Mon Jun 24 19:50:10 2013 on l2118a-linux.soecs.ku.edu Machine characteristics: Linux-2.6.18-128.el5-x86_64-with-redhat-5.3-Tikanga Using PETSc directory: /home/zlwei/soft/mercurial/petsc-dev Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g3 -fno-inline -O0 ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpif90 -fPIC -Wall -Wno-unused-variable -g ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/include -I/home/zlwei/soft/mercurial/petsc-dev/include -I/home/zlwei/soft/mercurial/petsc-dev/include -I/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/include ----------------------------------------- Using C linker: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpicc Using Fortran linker: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpif90 Using libraries: -Wl,-rpath,/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib 
-L/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -lpetsc -Wl,-rpath,/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -L/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -lmpichcxx -lstdc++ -lflapack -lfblas -lpthread -lmpichf90 -lgfortran -lm -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich -lopa -lmpl -lrt -lpthread -lgcc_s -ldl ----------------------------------------- -------------- next part -------------- node = 0 mx = 300 my = 120 mm = 1 nn = 1 node = 0 xs = 0 ys = 0 xw = 300 yw = 120 rank = 0: left BC = 1, rightBC = 1, bottom BC = 1, top BC = 1 rank = 0: xc = 299, yc = 120, xw = 300, yw = 120 rank 0 Cylinder: neighbor = left: 128 right: 152 bottom: 48 top: 72 Current Computation Time Step = 0, Total Computation Time step number = 1 Heave_dH = 0.001841, PhysVel.y = 0.169646 Rank = 0, Average velocity on IB points: U-ib = 1.000000, V-ib = 0.000000 0 KSP Residual norm 3.219859842275e+01 1 KSP Residual norm 1.212793919099e+01 2 KSP Residual norm 1.000475318395e+00 3 KSP Residual norm 1.048356151684e+00 4 KSP Residual norm 2.417191252714e-01 5 KSP Residual norm 1.614749788884e-01 6 KSP Residual norm 5.607699469795e-02 7 KSP Residual norm 5.015849450869e-02 8 KSP Residual norm 3.767902711948e-02 9 KSP Residual norm 1.331120203189e-02 10 KSP Residual norm 1.486268056233e-02 11 KSP Residual norm 5.251536657590e-03 12 KSP Residual norm 4.794291514649e-03 13 KSP Residual norm 2.460495800806e-03 14 KSP Residual norm 2.248817042552e-03 15 KSP Residual norm 2.211309778295e-03 16 KSP Residual norm 2.287471668574e-03 17 KSP Residual norm 1.262579985084e-03 18 KSP Residual norm 4.163719864597e-04 19 KSP Residual norm 2.326361502572e-04 20 KSP Residual norm 2.841935932373e-04 21 KSP Residual norm 3.047482003586e-04 22 KSP Residual norm 3.582477628286e-04 23 KSP Residual norm 2.822803681240e-04 24 KSP Residual norm 1.256577194451e-04 25 KSP Residual norm 6.006337667087e-05 26 KSP Residual norm 5.482035386006e-05 27 KSP Residual norm 4.716042773817e-05 28 KSP Residual norm 3.438093185462e-05 29 KSP Residual norm 2.216599861020e-05 30 KSP Residual norm 1.188485020621e-05 31 KSP Residual norm 5.332709179154e-06 32 KSP Residual norm 5.571699960743e-06 33 KSP Residual norm 4.072911342628e-06 34 KSP Residual norm 2.429372703445e-06 Pressure Check Iteration = 1, Error = 2.108138e-08, Max Pressure = 1.528494 @ (134,61) Pressure Corrector RHS Calculated!!!!! Pressure Corrector RHS Calculated!!!!! 
0 KSP Residual norm 1.829819563087e+01 1 KSP Residual norm 7.909965123659e+00 2 KSP Residual norm 1.074505340791e+00 3 KSP Residual norm 7.169735199209e-01 4 KSP Residual norm 2.244554419387e-01 5 KSP Residual norm 1.754655965410e-01 6 KSP Residual norm 4.825693892894e-02 7 KSP Residual norm 5.251256707979e-02 8 KSP Residual norm 3.804056366048e-02 9 KSP Residual norm 1.200096281517e-02 10 KSP Residual norm 1.103128679453e-02 11 KSP Residual norm 3.997720166314e-03 12 KSP Residual norm 3.655688571252e-03 13 KSP Residual norm 1.543345959344e-03 14 KSP Residual norm 1.292885619415e-03 15 KSP Residual norm 1.248963602650e-03 16 KSP Residual norm 4.801522102369e-04 17 KSP Residual norm 5.672996517558e-04 18 KSP Residual norm 2.999258586999e-04 19 KSP Residual norm 2.844401232531e-04 20 KSP Residual norm 2.100112658645e-04 21 KSP Residual norm 1.057637855557e-04 22 KSP Residual norm 5.175131849164e-05 23 KSP Residual norm 5.284783973564e-05 24 KSP Residual norm 2.617328421458e-05 25 KSP Residual norm 2.066100891352e-05 26 KSP Residual norm 1.682497454183e-05 27 KSP Residual norm 1.172461379204e-05 28 KSP Residual norm 7.434528247622e-06 29 KSP Residual norm 7.721209094005e-06 30 KSP Residual norm 7.748882349033e-06 31 KSP Residual norm 8.982990172645e-06 32 KSP Residual norm 1.114273786849e-05 33 KSP Residual norm 7.667014380348e-06 34 KSP Residual norm 3.720460282534e-06 35 KSP Residual norm 2.203328100797e-06 36 KSP Residual norm 1.590063516358e-06 Rank#0, Max dp = 8.902003e-01 @ (134, 66) Rank#0, time step = 0, continuity = 7.809215e-07 @ (145, 66) Rank = 0, Computation for time step 0 is done!! 0 time steps left Rank = 0, W time = 13.260636 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ex29 on a arch-linux2-c-debug named l2118a-linux.soecs.ku.edu with 1 processor, by zlwei Tue Aug 6 14:46:54 2013 Using Petsc Development GIT revision: 7a0108da53bbe8dff949efa7a5ab1303a7fb1560 GIT Date: 2013-06-20 10:11:56 +0200 Max Max/Min Avg Total Time (sec): 1.355e+01 1.00000 1.355e+01 Objects: 3.150e+02 1.00000 3.150e+02 Flops: 2.138e+08 1.00000 2.138e+08 2.138e+08 Flops/sec: 1.577e+07 1.00000 1.577e+07 1.577e+07 Memory: 2.090e+07 1.00000 2.090e+07 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 2.797e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 4.5979e-01 3.4% 0.0000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+00 0.2% 1: DMMG Setup: 2.7542e-02 0.2% 0.0000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+01 2.1% 2: Pressure RHS Setup: 1.9010e-02 0.1% 0.0000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+01 2.1% 3: Pressure Solve: 6.5101e+00 48.0% 1.0454e+08 48.9% 0.000e+00 0.0% 0.000e+00 0.0% 1.285e+03 45.9% 4: Corrector RHS Setup: 1.4123e-02 0.1% 0.0000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+01 2.1% 5: Corrector Solve: 6.5235e+00 48.1% 1.0926e+08 51.1% 0.000e+00 0.0% 0.000e+00 0.0% 1.325e+03 47.4% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). %T - percent time in this phase %f - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ ########################################################## # # # WARNING!!! # # # # This code was compiled with a debugging option, # # To get timing results run ./configure # # using --with-debugging=no, the performance will # # be generally two or three times faster. 
# # # ########################################################## Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %f %M %L %R %T %f %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecSet 1 1.0 1.4708e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 1 1.0 5.1093e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 --- Event Stage 1: DMMG Setup ThreadCommRunKer 1 1.0 6.9141e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ThreadCommBarrie 1 1.0 5.0068e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 1 1.0 1.3750e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 5 0 0 0 0 0 --- Event Stage 2: Pressure RHS Setup VecSet 1 1.0 9.4509e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 5 0 0 0 0 0 --- Event Stage 3: Pressure Solve KSPGMRESOrthog 20 1.0 1.4830e-02 1.0 9.05e+06 1.0 0.0e+00 0.0e+00 1.1e+02 0 4 0 0 4 0 9 0 0 9 610 KSPSetUp 8 1.0 5.8279e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 9.6e+01 0 0 0 0 3 0 0 0 0 7 0 KSPSolve 1 1.0 1.2430e+00 1.0 8.23e+07 1.0 0.0e+00 0.0e+00 7.2e+02 9 38 0 0 26 19 79 0 0 56 66 VecMDot 20 1.0 6.0251e-03 1.0 4.52e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 4 0 0 0 751 VecTDot 68 1.0 1.5597e-02 1.0 4.90e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 314 VecNorm 57 1.0 6.5038e-03 1.0 3.42e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 527 VecScale 22 1.0 9.2158e-03 1.0 4.52e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 49 VecCopy 74 1.0 2.9335e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 327 1.0 1.6136e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 210 1.0 5.2465e-02 1.0 1.07e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 5 0 0 0 1 10 0 0 0 205 VecAYPX 173 1.0 2.9601e-02 1.0 5.25e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 177 VecMAXPY 22 1.0 9.6974e-03 1.0 5.34e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 551 VecAssemblyBegin 4 1.0 2.0027e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 4 1.0 1.6212e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecPointwiseMult 162 1.0 2.5438e-02 1.0 3.33e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 131 VecSetRandom 2 1.0 7.2423e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecNormalize 22 1.0 1.1368e-02 1.0 1.36e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 119 MatMult 194 1.0 2.4963e-01 1.0 4.50e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 21 0 0 0 4 43 0 0 0 180 MatMultAdd 70 1.0 5.5392e-02 1.0 6.93e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 1 7 0 0 0 125 MatMultTranspose 70 1.0 6.2724e-02 1.0 6.93e+06 1.0 0.0e+00 0.0e+00 7.0e+01 0 3 0 0 3 1 7 0 0 5 110 MatSolve 35 1.0 5.5137e-03 1.0 1.05e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 190 MatLUFactorSym 1 1.0 1.7319e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 1 1.0 3.7150e-03 1.0 4.26e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 115 MatConvert 2 1.0 7.7617e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 6 1.0 5.1129e-03 1.0 6.65e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 130 MatAssemblyBegin 20 1.0 9.2030e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 20 1.0 5.3781e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 1 0 0 0 0 0 
MatGetRow 123345 1.0 4.9806e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 4 0 0 0 0 8 0 0 0 0 0 MatGetRowIJ 1 1.0 9.7990e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 1.0302e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 2 1.0 3.7993e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 0 0 0 0 0 1 0 0 0 1 0 MatAXPY 2 1.0 1.9550e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 2 1.0 5.3621e-02 1.0 5.66e+05 1.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 1 1 0 0 1 11 MatMatMultSym 2 1.0 4.3117e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 1 0 0 0 1 0 MatMatMultNum 2 1.0 1.0380e-02 1.0 5.66e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 55 MatPtAP 2 1.0 9.5418e-02 1.0 2.02e+06 1.0 0.0e+00 0.0e+00 1.2e+01 1 1 0 0 0 1 2 0 0 1 21 MatPtAPSymbolic 2 1.0 2.9883e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 1 0 MatPtAPNumeric 2 1.0 6.5507e-02 1.0 2.02e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 2 0 0 0 31 MatTrnMatMult 2 1.0 2.3324e-01 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 2.4e+01 2 1 0 0 1 4 3 0 0 2 13 MatTrnMatMultSym 2 1.0 1.8737e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 1 0 0 0 1 3 0 0 0 2 0 MatTrnMatMultNum 2 1.0 4.5826e-02 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 3 0 0 0 65 MatGetSymTrans 4 1.0 1.0321e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCGAMGgraph_AGG 2 1.0 1.9121e+00 1.0 4.67e+05 1.0 0.0e+00 0.0e+00 3.0e+01 14 0 0 0 1 29 0 0 0 2 0 PCGAMGcoarse_AGG 2 1.0 3.0292e-01 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 3.4e+01 2 1 0 0 1 5 3 0 0 3 10 PCGAMGProl_AGG 2 1.0 1.4573e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 11 0 0 0 0 22 0 0 0 1 0 PCGAMGPOpt_AGG 2 1.0 7.4503e-01 1.0 1.68e+07 1.0 0.0e+00 0.0e+00 3.2e+02 5 8 0 0 12 11 16 0 0 25 23 PCSetUp 2 1.0 4.5365e+00 1.0 2.27e+07 1.0 0.0e+00 0.0e+00 5.5e+02 33 11 0 0 20 70 22 0 0 43 5 PCSetUpOnBlocks 35 1.0 7.3206e-03 1.0 4.26e+05 1.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 0 0 0 0 0 1 58 PCApply 35 1.0 9.3794e-01 1.0 5.67e+07 1.0 0.0e+00 0.0e+00 6.2e+02 7 27 0 0 22 14 54 0 0 48 60 --- Event Stage 4: Corrector RHS Setup VecSet 1 1.0 1.0395e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 1 0 0 0 0 0 --- Event Stage 5: Corrector Solve KSPGMRESOrthog 20 1.0 1.4795e-02 1.0 9.05e+06 1.0 0.0e+00 0.0e+00 1.1e+02 0 4 0 0 4 0 8 0 0 8 611 KSPSetUp 8 1.0 2.1520e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 9.6e+01 0 0 0 0 3 0 0 0 0 7 0 KSPSolve 1 1.0 1.3155e+00 1.0 8.70e+07 1.0 0.0e+00 0.0e+00 7.6e+02 10 41 0 0 27 20 80 0 0 57 66 VecMDot 20 1.0 6.0053e-03 1.0 4.52e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 4 0 0 0 753 VecTDot 72 1.0 1.6134e-02 1.0 5.18e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 321 VecNorm 59 1.0 6.7673e-03 1.0 3.57e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 527 VecScale 22 1.0 9.1701e-03 1.0 4.52e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 49 VecCopy 78 1.0 3.1137e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 341 1.0 1.0112e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 222 1.0 5.5460e-02 1.0 1.14e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 5 0 0 0 1 10 0 0 0 205 VecAYPX 183 1.0 3.1363e-02 1.0 5.56e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 5 0 0 0 177 VecMAXPY 22 1.0 9.7005e-03 1.0 5.34e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 5 0 0 0 551 VecAssemblyBegin 4 1.0 1.8835e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 4 1.0 1.5974e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 
VecPointwiseMult 170 1.0 2.6755e-02 1.0 3.49e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 3 0 0 0 131 VecSetRandom 2 1.0 7.2606e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecNormalize 22 1.0 1.1306e-02 1.0 1.36e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 120 MatMult 204 1.0 2.6262e-01 1.0 4.74e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 22 0 0 0 4 43 0 0 0 180 MatMultAdd 74 1.0 5.8598e-02 1.0 7.33e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 1 7 0 0 0 125 MatMultTranspose 74 1.0 6.6470e-02 1.0 7.33e+06 1.0 0.0e+00 0.0e+00 7.4e+01 0 3 0 0 3 1 7 0 0 6 110 MatSolve 37 1.0 5.7158e-03 1.0 1.11e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 193 MatLUFactorSym 1 1.0 1.5109e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 1 1.0 3.5410e-03 1.0 4.26e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 120 MatConvert 2 1.0 2.0061e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 6 1.0 5.1038e-03 1.0 6.65e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 130 MatAssemblyBegin 20 1.0 9.3222e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 20 1.0 4.8141e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 1 0 0 0 0 0 MatGetRow 123345 1.0 4.9688e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 4 0 0 0 0 8 0 0 0 0 0 MatGetRowIJ 1 1.0 9.5129e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 8.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCoarsen 2 1.0 2.8723e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 0 0 0 0 0 0 0 0 0 1 0 MatAXPY 2 1.0 1.9500e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 2 1.0 5.3714e-02 1.0 5.66e+05 1.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 1 1 0 0 1 11 MatMatMultSym 2 1.0 4.3223e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 1 0 0 0 1 0 MatMatMultNum 2 1.0 1.0369e-02 1.0 5.66e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 1 0 0 0 55 MatPtAP 2 1.0 9.5119e-02 1.0 2.02e+06 1.0 0.0e+00 0.0e+00 1.2e+01 1 1 0 0 0 1 2 0 0 1 21 MatPtAPSymbolic 2 1.0 2.9528e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 1 0 MatPtAPNumeric 2 1.0 6.5562e-02 1.0 2.02e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 2 0 0 0 31 MatTrnMatMult 2 1.0 2.1705e-01 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 2.4e+01 2 1 0 0 1 3 3 0 0 2 14 MatTrnMatMultSym 2 1.0 1.7135e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 1 0 0 0 1 3 0 0 0 2 0 MatTrnMatMultNum 2 1.0 4.5670e-02 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 3 0 0 0 65 MatGetSymTrans 4 1.0 6.6414e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCGAMGgraph_AGG 2 1.0 1.8934e+00 1.0 4.67e+05 1.0 0.0e+00 0.0e+00 3.0e+01 14 0 0 0 1 29 0 0 0 2 0 PCGAMGcoarse_AGG 2 1.0 2.7539e-01 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 3.4e+01 2 1 0 0 1 4 3 0 0 3 11 PCGAMGProl_AGG 2 1.0 1.4554e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 11 0 0 0 0 22 0 0 0 1 0 PCGAMGPOpt_AGG 2 1.0 7.4179e-01 1.0 1.68e+07 1.0 0.0e+00 0.0e+00 3.2e+02 5 8 0 0 12 11 15 0 0 24 23 PCSetUp 2 1.0 4.4874e+00 1.0 2.27e+07 1.0 0.0e+00 0.0e+00 5.5e+02 33 11 0 0 20 69 21 0 0 42 5 PCSetUpOnBlocks 37 1.0 6.4981e-03 1.0 4.26e+05 1.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 0 0 0 0 0 1 66 PCApply 37 1.0 9.9311e-01 1.0 5.99e+07 1.0 0.0e+00 0.0e+00 6.5e+02 7 28 0 0 23 15 55 0 0 49 60 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. 
Reports information only for process 0. --- Event Stage 0: Main Stage Krylov Solver 0 11 47784 0 DMKSP interface 0 2 1296 0 Vector 1 44 8630368 0 Vector Scatter 0 12 7632 0 Matrix 0 16 8531872 0 Distributed Mesh 0 6 2189184 0 Bipartite Graph 0 12 9504 0 Index Set 0 6 9664 0 IS L to G Mapping 0 9 1301220 0 Preconditioner 0 10 10408 0 --- Event Stage 1: DMMG Setup Krylov Solver 1 0 0 0 Vector 5 4 6048 0 Vector Scatter 4 0 0 0 Distributed Mesh 2 0 0 0 Bipartite Graph 4 0 0 0 Index Set 10 10 295584 0 IS L to G Mapping 3 0 0 0 --- Event Stage 2: Pressure RHS Setup Krylov Solver 1 0 0 0 DMKSP interface 1 0 0 0 Vector 5 4 6048 0 Vector Scatter 4 0 0 0 Distributed Mesh 3 1 4376 0 Bipartite Graph 6 2 1584 0 Index Set 10 10 295584 0 IS L to G Mapping 3 0 0 0 Preconditioner 1 0 0 0 Viewer 1 0 0 0 --- Event Stage 3: Pressure Solve Krylov Solver 6 2 60384 0 Vector 71 51 7082504 0 Matrix 16 8 7271944 0 Matrix Coarsen 2 2 1256 0 Index Set 5 2 1520 0 Preconditioner 6 2 2016 0 PetscRandom 2 2 1248 0 --- Event Stage 4: Corrector RHS Setup Krylov Solver 1 0 0 0 DMKSP interface 1 0 0 0 Vector 5 4 6048 0 Vector Scatter 4 0 0 0 Distributed Mesh 3 1 4376 0 Bipartite Graph 6 2 1584 0 Index Set 10 10 295584 0 IS L to G Mapping 3 0 0 0 Preconditioner 1 0 0 0 --- Event Stage 5: Corrector Solve Krylov Solver 6 2 60384 0 Vector 71 51 7082504 0 Matrix 16 8 7271944 0 Matrix Coarsen 2 2 1256 0 Index Set 5 2 1520 0 Preconditioner 6 2 2016 0 PetscRandom 2 2 1248 0 ======================================================================================================================== Average time to get PetscTime(): 9.53674e-08 #PETSc Option Table entries: -ksp_monitor -ksp_rtol 1.0e-7 -ksp_type cg -log_summary -mg_levels_ksp_max_it 1 -mg_levels_ksp_type richardson -pc_gamg_agg_nsmooths 1 -pc_type gamg #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure run at: Mon Jun 24 19:50:10 2013 Configure options: --download-f-blas-lapack --download-hypre --download-mpich --with-cc=gcc --with-fc=gfortran PETSC_ARCH=arch-linux2-c-debug ----------------------------------------- Libraries compiled on Mon Jun 24 19:50:10 2013 on l2118a-linux.soecs.ku.edu Machine characteristics: Linux-2.6.18-128.el5-x86_64-with-redhat-5.3-Tikanga Using PETSc directory: /home/zlwei/soft/mercurial/petsc-dev Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g3 -fno-inline -O0 ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpif90 -fPIC -Wall -Wno-unused-variable -g ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/include -I/home/zlwei/soft/mercurial/petsc-dev/include -I/home/zlwei/soft/mercurial/petsc-dev/include -I/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/include ----------------------------------------- Using C linker: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpicc Using Fortran linker: /home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/bin/mpif90 Using libraries: -Wl,-rpath,/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -L/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -lpetsc 
-Wl,-rpath,/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -L/home/zlwei/soft/mercurial/petsc-dev/arch-linux2-c-debug/lib -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -lmpichcxx -lstdc++ -lflapack -lfblas -lpthread -lmpichf90 -lgfortran -lm -lm -lmpichcxx -lstdc++ -lmpichcxx -lstdc++ -ldl -lmpich -lopa -lmpl -lrt -lpthread -lgcc_s -ldl
-----------------------------------------

From knepley at gmail.com  Tue Aug  6 14:57:12 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 6 Aug 2013 14:57:12 -0500
Subject: [petsc-users] understanding the eigenvalues of my system
In-Reply-To: 
References: 
Message-ID: 

On Tue, Aug 6, 2013 at 2:38 PM, Zou (Non-US), Ling wrote:

> Dear All,
>
> I just explored the user manual v3.4 and I noticed the option
>
> -ksp_compute_eigenvalues
>
> I tested it on my problem with ~1000 dofs, and it printed out
>
> Iteratively computed eigenvalues
> 0.840692 + 0i
> 0.857247 - 0.235747i
> 0.857247 + 0.235747i
> 0.999993 + 0i
> 1.03457 + 0i
> 2.69763 + 0i
>
> I noticed there are complex numbers... I wonder what the printout
> indicates: a good system, a bad system, or nothing special?

Eigenvalues are in general complex. This looks like a well-conditioned system, and it's not Hermitian.

   Matt

> Best,
>
> Ling

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

From rupp at mcs.anl.gov  Tue Aug  6 15:10:38 2013
From: rupp at mcs.anl.gov (Karl Rupp)
Date: Tue, 06 Aug 2013 15:10:38 -0500
Subject: [petsc-users] KSP solver for single process
In-Reply-To: <520154F3.2010203@gmail.com>
References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov> <520154F3.2010203@gmail.com>
Message-ID: <5201583E.7080807@mcs.anl.gov>

Hi Alan,

with Mudpack being a geometric multigrid solver (from your description I thought that this is one of the sparse direct solvers like UMFPACK - thanks, Matt), the behavior is fairly consistent: algebraic multigrid preconditioners need to construct their coarse grids algebraically, while this information is immediately available to geometric multigrid. Thus, if you can use geometric multigrid and it works well (which is particularly the case for structured grids), use it. The same consideration applies when running on multiple processes. There is no bullet-proof general recipe, so some experimentation is necessary.

Best regards,
Karli

On 08/06/2013 02:56 PM, Alan wrote:
> Thanks for replies. Here I attached the log_summary for the large
> and small problems. The DoFs for the large problem is 4 times of that
> for the small problem. Few observations are listed here:
> 1, the total number of iterations does not change much from the small
> problem to the large one;
> 2, the time elapsed for KSPSolve() for the large problem is less than 4
> times of that for the small problem;
> 3, the time elapsed for PCSet() for the large problem is more than 10
> times of that for the small problem;
> 4, the time elapsed for PCGAMGProl_AGG for the large problem is more
> than 20 times of that for the small problem;
> In my code, I have solved the Poisson equation for twice with
> difference RHS; however, the observation above is almost consistent for
> these two times.
> Do these observation indicate that I should switch my PC from GAMG
> to MG for solving Poisson equation in a single process?
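For a structured DMDA code like ex29, the switch to geometric multigrid is mostly a matter of options. A minimal sketch, assuming the grid can be coarsened the chosen number of times (the level count below is illustrative, not taken from this thread):

    ./ex29 -da_refine 4 -ksp_type cg -ksp_rtol 1.0e-7 -pc_type mg -pc_mg_levels 4 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1

Because ex29 hands its DMDA to the solver via KSPSetDM(), PCMG can build the coarse levels geometrically from the DMDA instead of constructing them algebraically the way GAMG does.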
> > best, > Alan > >> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp > > wrote: >> >> Hi Alan, >> >> please use -log_summary to get profiling information on the run. >> What is >> the bottleneck? Is it the number of solver iterations increasing >> significantly? If so, consider changing the preconditioner options >> (more >> levels!). I don't expect a direct solver to be any faster in the 180k >> case for a Poisson problem. >> >> >> Mudpack is geometric multigrid: >> http://www2.cisl.ucar.edu/resources/legacy/mudpack >> This should be faster. >> >> Matt >> >> Best regards, >> Karli >> >> >> On 08/06/2013 02:22 PM, Alan wrote: >> > Dear all, >> > I hope you're having a nice day. >> > I have a quick question on solving Poisson equation with KSP solvers >> > (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this >> solver with: >> > -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 >> -mg_levels_ksp_max_it >> > 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7 >> > It performs very well in parallel computation and scalability is >> fine. >> > However, if I run it with a single process, the KSP solver is much >> > slower than direct ones, i.e. Mudpack. Briefly, the speed difference >> > between the KSP solver and the direct solver is negligible on >> dealing >> > with small problems (i.e.36k DoFs ) but becomes very huge for >> moderate >> > large problems (i.e. 180k DoFs). Although the direct solver >> inherently >> > has better performance for moderate large problems in the single >> > process, I wonder if any setup or approach can improve the >> performance >> > of this KSP Poisson solver with the single process? or even make it >> > obtain competitive speed (a little bit slower is fine) against >> direct >> > solvers. >> > >> > thanks in advance, >> > Alan >> > >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which >> their experiments lead. >> -- Norbert Wiener > From bsmith at mcs.anl.gov Tue Aug 6 16:03:02 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 6 Aug 2013 16:03:02 -0500 Subject: [petsc-users] KSP solver for single process In-Reply-To: <520154F3.2010203@gmail.com> References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov> <520154F3.2010203@gmail.com> Message-ID: <4917D537-34AC-48B2-BDF5-CDEB79165DA1@mcs.anl.gov> Alan, If you can use MUDPACK then you can also use PETSc's geometric multigrid, both sequential and parallel and its performance should be fairly close to mudpack on one process. Barry On Aug 6, 2013, at 2:56 PM, Alan wrote: > Thanks for replies. Here I attached the log_summary for the large and small problems. The DoFs for the large problem is 4 times of that for the small problem. Few observations are listed here: > 1, the total number of iterations does not change much from the small problem to the large one; > 2, the time elapsed for KSPSolve() for the large problem is less than 4 times of that for the small problem; > 3, the time elapsed for PCSet() for the large problem is more than 10 times of that for the small problem; > 4, the time elapsed for PCGAMGProl_AGG for the large problem is more than 20 times of that for the small problem; > In my code, I have solved the Poisson equation for twice with difference RHS; however, the observation above is almost consistent for these two times. > Do these observation indicate that I should switch my PC from GAMG to MG for solving Poisson equation in a single process? 
> > best, > Alan > >> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp wrote: >> Hi Alan, >> >> please use -log_summary to get profiling information on the run. What is >> the bottleneck? Is it the number of solver iterations increasing >> significantly? If so, consider changing the preconditioner options (more >> levels!). I don't expect a direct solver to be any faster in the 180k >> case for a Poisson problem. >> >> Mudpack is geometric multigrid: http://www2.cisl.ucar.edu/resources/legacy/mudpack >> This should be faster. >> >> Matt >> >> Best regards, >> Karli >> >> >> On 08/06/2013 02:22 PM, Alan wrote: >> > Dear all, >> > I hope you're having a nice day. >> > I have a quick question on solving Poisson equation with KSP solvers >> > (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this solver with: >> > -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it >> > 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7 >> > It performs very well in parallel computation and scalability is fine. >> > However, if I run it with a single process, the KSP solver is much >> > slower than direct ones, i.e. Mudpack. Briefly, the speed difference >> > between the KSP solver and the direct solver is negligible on dealing >> > with small problems (i.e.36k DoFs ) but becomes very huge for moderate >> > large problems (i.e. 180k DoFs). Although the direct solver inherently >> > has better performance for moderate large problems in the single >> > process, I wonder if any setup or approach can improve the performance >> > of this KSP Poisson solver with the single process? or even make it >> > obtain competitive speed (a little bit slower is fine) against direct >> > solvers. >> > >> > thanks in advance, >> > Alan >> > >> >> >> >> >> -- >> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >> -- Norbert Wiener > > From knepley at gmail.com Tue Aug 6 16:34:58 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 6 Aug 2013 16:34:58 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote: > > > > On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: > >> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: >>> >>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: >>>>> >>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >>>>>>> >>>>>>>> Bishesh Khanal writes: >>>>>>>> >>>>>>>> > Now, I implemented two different approaches, each for both 2D and >>>>>>>> 3D, in >>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>> solving it for >>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>> > I use staggered grid with p on cell centers, and components of v >>>>>>>> on cell >>>>>>>> > faces. Similar split up of K to cell center and faces to account >>>>>>>> for the >>>>>>>> > variable viscosity case) >>>>>>>> >>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>> discretization of >>>>>>>> variable-viscosity Stokes. 
This is a common problem and I recommend >>>>>>>> starting with PCFieldSplit with Schur complement reduction (make >>>>>>>> that >>>>>>>> work first, then switch to block preconditioner). You can use >>>>>>>> PCLSC or >>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>> containing >>>>>>>> the inverse viscosity in the pressure-pressure block. This diagonal >>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>> discretization) approximation of the Schur complement. The velocity >>>>>>>> block can be solved with algebraic multigrid. Read the PCFieldSplit >>>>>>>> docs (follow papers as appropriate) and let us know if you get >>>>>>>> stuck. >>>>>>>> >>>>>>> >>>>>>> I was trying to assemble the inverse viscosity diagonal matrix to >>>>>>> use as the preconditioner for the Schur complement solve step as you >>>>>>> suggested. I've few questions about the ways to implement this in Petsc: >>>>>>> A naive approach that I can think of would be to create a vector >>>>>>> with its components as reciprocal viscosities of the cell centers >>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>> from this vector. However I'm not sure about: >>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>> distribution of the different rows of the main system matrix over different >>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>> dof as explained before. >>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the S_p >>>>>>> matrix would correspond to only pressure space. Should the distribution of >>>>>>> the rows of S_p among different processor not correspond to the >>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>> where S = A11 inv(A00) A01 ? >>>>>>> >>>>>> >>>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. The >>>>>> P distribution is the same as the entire problem divided by 4. >>>>>> >>>>> >>>>> Thanks Matt. So if I create a new DMDA with same grid size but with >>>>> dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>> a particular function to make these two DMDA have identical distribution of >>>>> the vertices ? >>>>> >>>> >>>> Yes. >>>> >>>> >>>>> Even then I think there might be a problem due to the presence of >>>>> "fictitious pressure vertices". The system matrix (A) contains an identity >>>>> corresponding to these fictitious pressure nodes, thus when using a >>>>> -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>> directly correspond to the DMDA grid defined for the original problem. >>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>> >>>> >>>> Don't use detect_saddle, but split it by fields -pc_fieldsplit_0_fields >>>> 0,1,2 -pc_fieldsplit_1_fields 4 >>>> >>> >>> How can I set this split in the code itself without giving it as a >>> command line option when the system matrix is assembled from the DMDA for >>> the whole system with 4 dofs. (i.e. 
*without* using the DMComposite or * >>> without* using the nested block matrices to assemble different blocks >>> separately and then combine them together). >>> I need the split to get access to the fieldsplit_1_ksp in my code, >>> because not using detect_saddle_point means I cannot use >>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>> >> >> This is currently a real problem with the DMDA. In the unstructured case, >> where we always need specialized spaces, you can >> use something like >> >> PetscObject pressure; >> MatNullSpace nullSpacePres; >> >> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, 0, >> NULL, &nullSpacePres);CHKERRQ(ierr); >> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >> nullSpacePres);CHKERRQ(ierr); >> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >> >> and then DMGetSubDM() uses this information to attach the null space to >> the IS that is created using the information in the PetscSection. >> If you use a PetscSection to set the data layout over the DMDA, I think >> this works correctly, but this has not been tested at all and is very >> new code. Eventually, I think we want all DMs to use this mechanism, but >> we are still working it out. >> > > Currently I do not use PetscSection. If this makes a cleaner approach, I'd > try it too but may a bit later (right now I'd like test my model with a > quickfix even if it means a little dirty code!) > > >> >> Bottom line: For custom null spaces using the default layout in DMDA, you >> need to take apart the PCFIELDSPLIT after it has been setup, >> which is somewhat subtle. You need to call KSPSetUp() and then reach in >> and get the PC, and the subKSPs. I don't like this at all, but we >> have not reorganized that code (which could be very simple and inflexible >> since its very structured). >> > > So I tried to get this approach working but I could not succeed and > encountered some errors. Here is a code snippet: > > //mDa is the DMDA that describes the whole grid with all 4 dofs (3 > velocity components and 1 pressure comp.) > ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); > ierr = > DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); > ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); > ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've > the mNullSpaceSystem based on mDa, that contains a null space basis for the > complete system. > ierr = > KSPSetFromOptions(mKsp);CHKERRQ(ierr); > //This I expect would register these options I give:-pc_type fieldsplit > -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 > //-pc_fieldsplit_1_fields 3 > > ierr = KSPSetUp(mKsp);CHKERRQ(ierr); > > ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was > obtained from the options (fieldsplit) > > ierr = > PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); > //I have created the matrix mPcForSc using a DMDA with identical //size to > mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). 
> > ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); > > KSP *kspSchur; > PetscInt kspSchurPos = 1; > ierr = > PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); > ierr = > KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); > //The null space is the one that correspond to only pressure nodes, created > using the mDaPressure. > ierr = PetscFree(kspSchur);CHKERRQ(ierr); > > ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); > Sorry, you need to return to the old DMDA behavior, so you want -pc_fieldsplit_dm_splits 0 or PCFieldSplitSetDMSplits(pc, PETSC_FALSE) Thanks, Matt > The errors I get when running with options: -pc_type fieldsplit > -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 > -pc_fieldsplit_1_fields 3 > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: No support for this operation for this object type! > [0]PETSC ERROR: Support only implemented for 2d! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named edwards by > bkhanal Tue Aug 6 17:35:30 2013 > [0]PETSC ERROR: Libraries linked from > /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib > [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 > [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 > --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 > -with-clanguage=cxx --download-hypre=1 > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c > [0]PETSC ERROR: DMCreateSubDM() line 1267 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c > [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: PCSetUp() line 890 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: solveModel() line 181 in > "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx > WARNING! There are options you set that were not used! > WARNING! could be spelling mistake, etc! > Option left: name:-pc_fieldsplit_1_fields value: 3 > > > > > > > > >> >> Matt >> >> >>> >>>> Matt >>>> >>>> >>>>> >>>>>> Matt >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. 
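For reference, a sketch of where that call might sit relative to the snippet quoted above (reusing its variable names from the user's code; the splits are fixed once the fieldsplit PC is set up, so the call has to come before KSPSetUp()):

    ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr);
    ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr);
    /* same effect as -pc_fieldsplit_dm_splits 0: build the splits from
       -pc_fieldsplit_0_fields/-pc_fieldsplit_1_fields instead of from the DM */
    ierr = PCFieldSplitSetDMSplits(mPcOuter,PETSC_FALSE);CHKERRQ(ierr);
    ierr = KSPSetUp(mKsp);CHKERRQ(ierr);   /* splits (and sub-KSPs) are created here */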
>>>> -- Norbert Wiener
>>>>
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

From ztdepyahoo at 163.com  Wed Aug  7 05:46:00 2013
From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=)
Date: Wed, 7 Aug 2013 18:46:00 +0800 (CST)
Subject: [petsc-users] how to know the original global index after the partition.
Message-ID: <8dabd0b.c996.14058621e8e.Coremail.ztdepyahoo@163.com>

ISPartitioningToNumbering(IS part, IS *is) gives the index set that defines the global numbers on each part, but how do I get the original global index after the partition?

From bisheshkh at gmail.com  Wed Aug  7 07:07:06 2013
From: bisheshkh at gmail.com (Bishesh Khanal)
Date: Wed, 7 Aug 2013 14:07:06 +0200
Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid
In-Reply-To: 
References: <87li5555oo.fsf@mcs.anl.gov>
Message-ID: 

On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote:

> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote:
>
>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote:
>>
>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote:
>>>
>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote:
>>>>
>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote:
>>>>>
>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote:
>>>>>>
>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal wrote:
>>>>>>>
>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote:
>>>>>>>>
>>>>>>>>> Bishesh Khanal writes:
>>>>>>>>>
>>>>>>>>> > Now, I implemented two different approaches, each for both 2D and 3D, in
>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems solving it for
>>>>>>>>> > the problem size I need (250^3 grid size).
>>>>>>>>> > I use staggered grid with p on cell centers, and components of v on cell
>>>>>>>>> > faces. Similar split up of K to cell center and faces to account for the
>>>>>>>>> > variable viscosity case)
>>>>>>>>>
>>>>>>>>> Okay, you're using a staggered-grid finite difference discretization of
>>>>>>>>> variable-viscosity Stokes. This is a common problem and I recommend
>>>>>>>>> starting with PCFieldSplit with Schur complement reduction (make that
>>>>>>>>> work first, then switch to block preconditioner). You can use PCLSC or
>>>>>>>>> (probably better for you), assemble a preconditioning matrix containing
>>>>>>>>> the inverse viscosity in the pressure-pressure block. This diagonal
>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on
>>>>>>>>> discretization) approximation of the Schur complement. The velocity
>>>>>>>>> block can be solved with algebraic multigrid. Read the PCFieldSplit
>>>>>>>>> docs (follow papers as appropriate) and let us know if you get
>>>>>>>>> stuck.
>>>>>>>>> >>>>>>>> >>>>>>>> I was trying to assemble the inverse viscosity diagonal matrix to >>>>>>>> use as the preconditioner for the Schur complement solve step as you >>>>>>>> suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>> A naive approach that I can think of would be to create a vector >>>>>>>> with its components as reciprocal viscosities of the cell centers >>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>> from this vector. However I'm not sure about: >>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>> dof as explained before. >>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the S_p >>>>>>>> matrix would correspond to only pressure space. Should the distribution of >>>>>>>> the rows of S_p among different processor not correspond to the >>>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>>> where S = A11 inv(A00) A01 ? >>>>>>>> >>>>>>> >>>>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. The >>>>>>> P distribution is the same as the entire problem divided by 4. >>>>>>> >>>>>> >>>>>> Thanks Matt. So if I create a new DMDA with same grid size but with >>>>>> dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>> a particular function to make these two DMDA have identical distribution of >>>>>> the vertices ? >>>>>> >>>>> >>>>> Yes. >>>>> >>>>> >>>>>> Even then I think there might be a problem due to the presence of >>>>>> "fictitious pressure vertices". The system matrix (A) contains an identity >>>>>> corresponding to these fictitious pressure nodes, thus when using a >>>>>> -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>> >>>>> >>>>> Don't use detect_saddle, but split it by fields >>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>> >>>> >>>> How can I set this split in the code itself without giving it as a >>>> command line option when the system matrix is assembled from the DMDA for >>>> the whole system with 4 dofs. (i.e. *without* using the DMComposite or >>>> *without* using the nested block matrices to assemble different blocks >>>> separately and then combine them together). >>>> I need the split to get access to the fieldsplit_1_ksp in my code, >>>> because not using detect_saddle_point means I cannot use >>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>> >>> >>> This is currently a real problem with the DMDA. 
In the unstructured >>> case, where we always need specialized spaces, you can >>> use something like >>> >>> PetscObject pressure; >>> MatNullSpace nullSpacePres; >>> >>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, 0, >>> NULL, &nullSpacePres);CHKERRQ(ierr); >>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>> nullSpacePres);CHKERRQ(ierr); >>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>> >>> and then DMGetSubDM() uses this information to attach the null space to >>> the IS that is created using the information in the PetscSection. >>> If you use a PetscSection to set the data layout over the DMDA, I think >>> this works correctly, but this has not been tested at all and is very >>> new code. Eventually, I think we want all DMs to use this mechanism, but >>> we are still working it out. >>> >> >> Currently I do not use PetscSection. If this makes a cleaner approach, >> I'd try it too but may a bit later (right now I'd like test my model with a >> quickfix even if it means a little dirty code!) >> >> >>> >>> Bottom line: For custom null spaces using the default layout in DMDA, >>> you need to take apart the PCFIELDSPLIT after it has been setup, >>> which is somewhat subtle. You need to call KSPSetUp() and then reach in >>> and get the PC, and the subKSPs. I don't like this at all, but we >>> have not reorganized that code (which could be very simple and >>> inflexible since its very structured). >>> >> >> So I tried to get this approach working but I could not succeed and >> encountered some errors. Here is a code snippet: >> >> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >> velocity components and 1 pressure comp.) >> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >> ierr = >> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've >> the mNullSpaceSystem based on mDa, that contains a null space basis for the >> complete system. >> ierr = >> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >> //This I expect would register these options I give:-pc_type fieldsplit >> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >> //-pc_fieldsplit_1_fields 3 >> >> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >> >> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >> obtained from the options (fieldsplit) >> >> ierr = >> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >> //I have created the matrix mPcForSc using a DMDA with identical //size to >> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >> >> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >> >> KSP *kspSchur; >> PetscInt kspSchurPos = 1; >> ierr = >> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >> ierr = >> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >> //The null space is the one that correspond to only pressure nodes, created >> using the mDaPressure. >> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >> >> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >> > > Sorry, you need to return to the old DMDA behavior, so you want > > -pc_fieldsplit_dm_splits 0 > Thanks, with this it seems I can attach the null space properly, but I have a question regarding whether the Schur complement ksp solver is actually using the preconditioner matrix I provide. 
When using -ksp_view, the outer level pc object of type fieldsplit does report that: "Preconditioner for the Schur complement formed from user provided matrix", but in the KSP solver for Schur complement S, the pc object (fieldsplit_1_) is of type ilu and doesn't say that it is using the matrix I provide. Am I missing something here ? Below are the relevant commented code snippet and the output of the -ksp_view (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) Code snippet: ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The nullspace for the whole system ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up mKsp with the options provided with fieldsplit and the fields associated with the two splits. ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); //Get the fieldsplit pc set up from the options ierr = PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); //Use mPcForSc as the preconditioner for Schur Complement KSP *kspSchur; PetscInt kspSchurPos = 1; ierr = PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); ierr = KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); //Attach the null-space for the Schur complement ksp solver. ierr = PetscFree(kspSchur);CHKERRQ(ierr); ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); the output of the -ksp_view KSP Object: 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning has attached null space using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, blocksize = 4, factorization FULL Preconditioner for the Schur complement formed from user provided matrix Split info: Split number 0 Fields 0, 1, 2 Split number 1 Fields 3 KSP solver for A00 block KSP Object: (fieldsplit_0_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=2187, cols=2187 package used to perform factorization: petsc total: nonzeros=140625, allocated nonzeros=140625 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 729 nodes, limit used is 5 linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=2187, cols=2187 total: nonzeros=140625, allocated nonzeros=140625 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 729 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt 
Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning has attached null space using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_1_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=729, cols=729 package used to perform factorization: petsc total: nonzeros=15625, allocated nonzeros=15625 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix followed by preconditioner matrix: Matrix Object: 1 MPI processes type: schurcomplement rows=729, cols=729 Schur complement A11 - A10 inv(A00) A01 A11 Matrix Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=15625 total number of mallocs used during MatSetValues calls =0 not using I-node routines A10 Matrix Object: 1 MPI processes type: seqaij rows=729, cols=2187 total: nonzeros=46875, allocated nonzeros=46875 total number of mallocs used during MatSetValues calls =0 not using I-node routines KSP of A00 KSP Object: (fieldsplit_0_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=2187, cols=2187 package used to perform factorization: petsc total: nonzeros=140625, allocated nonzeros=140625 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 729 nodes, limit used is 5 linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=2187, cols=2187 total: nonzeros=140625, allocated nonzeros=140625 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 729 nodes, limit used is 5 A01 Matrix Object: 1 MPI processes type: seqaij rows=2187, cols=729 total: nonzeros=46875, allocated nonzeros=46875 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 729 nodes, limit used is 5 Matrix Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=15625 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=2916, cols=2916, bs=4 total: nonzeros=250000, allocated nonzeros=250000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 729 nodes, limit used is 5 > > or > > PCFieldSplitSetDMSplits(pc, PETSC_FALSE) > > Thanks, > > Matt > > >> The errors I get when running with options: -pc_type fieldsplit >> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >> -pc_fieldsplit_1_fields 3 >> [0]PETSC ERROR: 
--------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: No support for this operation for this object type! >> [0]PETSC ERROR: Support only implemented for 2d! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named edwards by >> bkhanal Tue Aug 6 17:35:30 2013 >> [0]PETSC ERROR: Libraries linked from >> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >> -with-clanguage=cxx --download-hypre=1 >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >> [0]PETSC ERROR: PCSetUp() line 890 in >> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSetUp() line 278 in >> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: solveModel() line 181 in >> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >> WARNING! There are options you set that were not used! >> WARNING! could be spelling mistake, etc! >> Option left: name:-pc_fieldsplit_1_fields value: 3 >> >> >> >> >> >> >> >> >>> >>> Matt >>> >>> >>>> >>>>> Matt >>>>> >>>>> >>>>>> >>>>>>> Matt >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Wed Aug 7 07:15:01 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 7 Aug 2013 07:15:01 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: > > > > On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote: > >> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: >>> >>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: >>>>> >>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal >>>>>>> > wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >>>>>>>>> >>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>> >>>>>>>>>> > Now, I implemented two different approaches, each for both 2D >>>>>>>>>> and 3D, in >>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>>>> solving it for >>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>> > I use staggered grid with p on cell centers, and components of >>>>>>>>>> v on cell >>>>>>>>>> > faces. Similar split up of K to cell center and faces to >>>>>>>>>> account for the >>>>>>>>>> > variable viscosity case) >>>>>>>>>> >>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>> discretization of >>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>> recommend >>>>>>>>>> starting with PCFieldSplit with Schur complement reduction (make >>>>>>>>>> that >>>>>>>>>> work first, then switch to block preconditioner). You can use >>>>>>>>>> PCLSC or >>>>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>>>> containing >>>>>>>>>> the inverse viscosity in the pressure-pressure block. This >>>>>>>>>> diagonal >>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>>>> discretization) approximation of the Schur complement. The >>>>>>>>>> velocity >>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>> PCFieldSplit >>>>>>>>>> docs (follow papers as appropriate) and let us know if you get >>>>>>>>>> stuck. >>>>>>>>>> >>>>>>>>> >>>>>>>>> I was trying to assemble the inverse viscosity diagonal matrix to >>>>>>>>> use as the preconditioner for the Schur complement solve step as you >>>>>>>>> suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>> A naive approach that I can think of would be to create a vector >>>>>>>>> with its components as reciprocal viscosities of the cell centers >>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>>> dof as explained before. >>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the S_p >>>>>>>>> matrix would correspond to only pressure space. 
Should the distribution of >>>>>>>>> the rows of S_p among different processor not correspond to the >>>>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>>>> where S = A11 inv(A00) A01 ? >>>>>>>>> >>>>>>>> >>>>>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. >>>>>>>> The P distribution is the same as the entire problem divided by 4. >>>>>>>> >>>>>>> >>>>>>> Thanks Matt. So if I create a new DMDA with same grid size but with >>>>>>> dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>>> a particular function to make these two DMDA have identical distribution of >>>>>>> the vertices ? >>>>>>> >>>>>> >>>>>> Yes. >>>>>> >>>>>> >>>>>>> Even then I think there might be a problem due to the presence of >>>>>>> "fictitious pressure vertices". The system matrix (A) contains an identity >>>>>>> corresponding to these fictitious pressure nodes, thus when using a >>>>>>> -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>> >>>>>> >>>>>> Don't use detect_saddle, but split it by fields >>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>> >>>>> >>>>> How can I set this split in the code itself without giving it as a >>>>> command line option when the system matrix is assembled from the DMDA for >>>>> the whole system with 4 dofs. (i.e. *without* using the DMComposite >>>>> or *without* using the nested block matrices to assemble different >>>>> blocks separately and then combine them together). >>>>> I need the split to get access to the fieldsplit_1_ksp in my code, >>>>> because not using detect_saddle_point means I cannot use >>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>> >>>> >>>> This is currently a real problem with the DMDA. In the unstructured >>>> case, where we always need specialized spaces, you can >>>> use something like >>>> >>>> PetscObject pressure; >>>> MatNullSpace nullSpacePres; >>>> >>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, 0, >>>> NULL, &nullSpacePres);CHKERRQ(ierr); >>>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>>> nullSpacePres);CHKERRQ(ierr); >>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>> >>>> and then DMGetSubDM() uses this information to attach the null space to >>>> the IS that is created using the information in the PetscSection. >>>> If you use a PetscSection to set the data layout over the DMDA, I think >>>> this works correctly, but this has not been tested at all and is very >>>> new code. Eventually, I think we want all DMs to use this mechanism, >>>> but we are still working it out. >>>> >>> >>> Currently I do not use PetscSection. 
If this makes a cleaner approach, >>> I'd try it too but may a bit later (right now I'd like test my model with a >>> quickfix even if it means a little dirty code!) >>> >>> >>>> >>>> Bottom line: For custom null spaces using the default layout in DMDA, >>>> you need to take apart the PCFIELDSPLIT after it has been setup, >>>> which is somewhat subtle. You need to call KSPSetUp() and then reach in >>>> and get the PC, and the subKSPs. I don't like this at all, but we >>>> have not reorganized that code (which could be very simple and >>>> inflexible since its very structured). >>>> >>> >>> So I tried to get this approach working but I could not succeed and >>> encountered some errors. Here is a code snippet: >>> >>> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >>> velocity components and 1 pressure comp.) >>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>> ierr = >>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>> //I've the mNullSpaceSystem based on mDa, that contains a null space basis >>> for the complete system. >>> ierr = >>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>> //This I expect would register these options I give:-pc_type fieldsplit >>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>> //-pc_fieldsplit_1_fields 3 >>> >>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>> >>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >>> obtained from the options (fieldsplit) >>> >>> ierr = >>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>> >>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>> >>> KSP *kspSchur; >>> PetscInt kspSchurPos = 1; >>> ierr = >>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>> ierr = >>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>> //The null space is the one that correspond to only pressure nodes, created >>> using the mDaPressure. >>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>> >>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>> >> >> Sorry, you need to return to the old DMDA behavior, so you want >> >> -pc_fieldsplit_dm_splits 0 >> > > Thanks, with this it seems I can attach the null space properly, but I > have a question regarding whether the Schur complement ksp solver is > actually using the preconditioner matrix I provide. > When using -ksp_view, the outer level pc object of type fieldsplit does > report that: "Preconditioner for the Schur complement formed from user > provided matrix", but in the KSP solver for Schur complement S, the pc > object (fieldsplit_1_) is of type ilu and doesn't say that it is using the > matrix I provide. Am I missing something here ? > Below are the relevant commented code snippet and the output of the > -ksp_view > (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 > -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) > If ILU does not error, it means it is using your matrix, because the Schur complement matrix cannot be factored, and FS says it is using your matrix. 
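(As a minimal sketch, and assuming the setup described in this thread: a dof-1 pressure DMDA called mDaPressure and a global vector etaVec on it holding the cell-center viscosities, both hypothetical names rather than code from these mails, such a user-provided Schur-complement preconditioning matrix could be assembled like this in PETSc 3.4:

Mat         Sp;
PetscScalar ***eta;
PetscInt    i,j,k,xs,ys,zs,xm,ym,zm;

ierr = DMCreateMatrix(mDaPressure,MATAIJ,&Sp);CHKERRQ(ierr);  /* PETSc 3.4 signature */
ierr = DMDAVecGetArray(mDaPressure,etaVec,&eta);CHKERRQ(ierr);
ierr = DMDAGetCorners(mDaPressure,&xs,&ys,&zs,&xm,&ym,&zm);CHKERRQ(ierr);
for (k=zs; k<zs+zm; k++) {
  for (j=ys; j<ys+ym; j++) {
    for (i=xs; i<xs+xm; i++) {
      MatStencil  row = {0};
      PetscScalar v   = 1.0/eta[k][j][i];  /* reciprocal cell-center viscosity */
      row.i = i; row.j = j; row.k = k;
      /* at the fictitious pressure nodes one would instead set v = 1.0,
         to match the identity rows in the system matrix */
      ierr = MatSetValuesStencil(Sp,1,&row,1,&row,&v,INSERT_VALUES);CHKERRQ(ierr);
    }
  }
}
ierr = DMDAVecRestoreArray(mDaPressure,etaVec,&eta);CHKERRQ(ierr);
ierr = MatAssemblyBegin(Sp,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(Sp,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

The resulting Sp is then what gets passed as the user matrix to PCFieldSplitSchurPrecondition(pc,PC_FIELDSPLIT_SCHUR_PRE_USER,Sp), as in the snippets above.)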
Matt > Code snippet: > ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The > nullspace for the whole system > ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); > ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up mKsp > with the options provided with fieldsplit and the fields associated with > the two splits. > > ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); //Get > the fieldsplit pc set up from the options > > ierr = > PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); > //Use mPcForSc as the preconditioner for Schur Complement > > KSP *kspSchur; > PetscInt kspSchurPos = 1; > ierr = > PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); > ierr = > KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); > //Attach the null-space for the Schur complement ksp solver. > ierr = PetscFree(kspSchur);CHKERRQ(ierr); > > ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); > > > > the output of the -ksp_view > KSP Object: 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > has attached null space > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, blocksize = 4, factorization FULL > Preconditioner for the Schur complement formed from user provided > matrix > Split info: > Split number 0 Fields 0, 1, 2 > Split number 1 Fields 3 > KSP solver for A00 block > KSP Object: (fieldsplit_0_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_0_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > using diagonal shift on blocks to prevent zero pivot > matrix ordering: natural > factor fill ratio given 1, needed 1 > Factored matrix follows: > Matrix Object: 1 MPI processes > type: seqaij > rows=2187, cols=2187 > package used to perform factorization: petsc > total: nonzeros=140625, allocated nonzeros=140625 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 729 nodes, limit used is 5 > linear system matrix = precond matrix: > Matrix Object: 1 MPI processes > type: seqaij > rows=2187, cols=2187 > total: nonzeros=140625, allocated nonzeros=140625 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 729 nodes, limit used is 5 > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (fieldsplit_1_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > has attached null space > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_1_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance 
for zero pivot 2.22045e-14 > using diagonal shift on blocks to prevent zero pivot > matrix ordering: natural > factor fill ratio given 1, needed 1 > Factored matrix follows: > Matrix Object: 1 MPI processes > type: seqaij > rows=729, cols=729 > package used to perform factorization: petsc > total: nonzeros=15625, allocated nonzeros=15625 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix followed by preconditioner matrix: > Matrix Object: 1 MPI processes > type: schurcomplement > rows=729, cols=729 > Schur complement A11 - A10 inv(A00) A01 > A11 > Matrix Object: 1 MPI processes > type: seqaij > rows=729, cols=729 > total: nonzeros=15625, allocated nonzeros=15625 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > A10 > Matrix Object: 1 MPI processes > type: seqaij > rows=729, cols=2187 > total: nonzeros=46875, allocated nonzeros=46875 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP of A00 > KSP Object: (fieldsplit_0_) 1 MPI > processes > type: gmres > GMRES: restart=30, using Classical (unmodified) > Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_0_) 1 MPI > processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > using diagonal shift on blocks to prevent zero pivot > matrix ordering: natural > factor fill ratio given 1, needed 1 > Factored matrix follows: > Matrix Object: 1 MPI processes > type: seqaij > rows=2187, cols=2187 > package used to perform factorization: petsc > total: nonzeros=140625, allocated nonzeros=140625 > total number of mallocs used during MatSetValues > calls =0 > using I-node routines: found 729 nodes, limit > used is 5 > linear system matrix = precond matrix: > Matrix Object: 1 MPI processes > type: seqaij > rows=2187, cols=2187 > total: nonzeros=140625, allocated nonzeros=140625 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 729 nodes, limit used is 5 > A01 > Matrix Object: 1 MPI processes > type: seqaij > rows=2187, cols=729 > total: nonzeros=46875, allocated nonzeros=46875 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 729 nodes, limit used is 5 > Matrix Object: 1 MPI processes > type: seqaij > rows=729, cols=729 > total: nonzeros=15625, allocated nonzeros=15625 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Matrix Object: 1 MPI processes > type: seqaij > rows=2916, cols=2916, bs=4 > total: nonzeros=250000, allocated nonzeros=250000 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 729 nodes, limit used is 5 > > > > > >> >> or >> >> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >> >> Thanks, >> >> Matt >> >> >>> The errors I get when running with options: -pc_type fieldsplit >>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>> -pc_fieldsplit_1_fields 3 >>> [0]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [0]PETSC ERROR: No support for this operation for this object type! >>> [0]PETSC ERROR: Support only implemented for 2d! 
>>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named edwards >>> by bkhanal Tue Aug 6 17:35:30 2013 >>> [0]PETSC ERROR: Libraries linked from >>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>> -with-clanguage=cxx --download-hypre=1 >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>> [0]PETSC ERROR: PCSetUp() line 890 in >>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSetUp() line 278 in >>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: solveModel() line 181 in >>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>> WARNING! There are options you set that were not used! >>> WARNING! could be spelling mistake, etc! >>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>> >>> >>> >>> >>> >>> >>> >>> >>>> >>>> Matt >>>> >>>> >>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bisheshkh at gmail.com Wed Aug 7 07:26:26 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Wed, 7 Aug 2013 14:26:26 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: > On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: > >> >> >> >> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote: >> >>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote: >>> >>>> >>>> >>>> >>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: >>>> >>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: >>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote: >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: >>>>>>>> >>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >>>>>>>>>> >>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>> >>>>>>>>>>> > Now, I implemented two different approaches, each for both 2D >>>>>>>>>>> and 3D, in >>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>>>>> solving it for >>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>> > I use staggered grid with p on cell centers, and components of >>>>>>>>>>> v on cell >>>>>>>>>>> > faces. Similar split up of K to cell center and faces to >>>>>>>>>>> account for the >>>>>>>>>>> > variable viscosity case) >>>>>>>>>>> >>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>> discretization of >>>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>>> recommend >>>>>>>>>>> starting with PCFieldSplit with Schur complement reduction (make >>>>>>>>>>> that >>>>>>>>>>> work first, then switch to block preconditioner). You can use >>>>>>>>>>> PCLSC or >>>>>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>>>>> containing >>>>>>>>>>> the inverse viscosity in the pressure-pressure block. This >>>>>>>>>>> diagonal >>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>>>>> discretization) approximation of the Schur complement. The >>>>>>>>>>> velocity >>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>> PCFieldSplit >>>>>>>>>>> docs (follow papers as appropriate) and let us know if you get >>>>>>>>>>> stuck. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> I was trying to assemble the inverse viscosity diagonal matrix to >>>>>>>>>> use as the preconditioner for the Schur complement solve step as you >>>>>>>>>> suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>> A naive approach that I can think of would be to create a vector >>>>>>>>>> with its components as reciprocal viscosities of the cell centers >>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>>>> dof as explained before. 
>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the >>>>>>>>>> S_p matrix would correspond to only pressure space. Should the distribution >>>>>>>>>> of the rows of S_p among different processor not correspond to the >>>>>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>>>>> where S = A11 inv(A00) A01 ? >>>>>>>>>> >>>>>>>>> >>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. >>>>>>>>> The P distribution is the same as the entire problem divided by 4. >>>>>>>>> >>>>>>>> >>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size but with >>>>>>>> dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>>>> a particular function to make these two DMDA have identical distribution of >>>>>>>> the vertices ? >>>>>>>> >>>>>>> >>>>>>> Yes. >>>>>>> >>>>>>> >>>>>>>> Even then I think there might be a problem due to the presence of >>>>>>>> "fictitious pressure vertices". The system matrix (A) contains an identity >>>>>>>> corresponding to these fictitious pressure nodes, thus when using a >>>>>>>> -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>>>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>>>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>> >>>>>>> >>>>>>> Don't use detect_saddle, but split it by fields >>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>> >>>>>> >>>>>> How can I set this split in the code itself without giving it as a >>>>>> command line option when the system matrix is assembled from the DMDA for >>>>>> the whole system with 4 dofs. (i.e. *without* using the DMComposite >>>>>> or *without* using the nested block matrices to assemble different >>>>>> blocks separately and then combine them together). >>>>>> I need the split to get access to the fieldsplit_1_ksp in my code, >>>>>> because not using detect_saddle_point means I cannot use >>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>> >>>>> >>>>> This is currently a real problem with the DMDA. In the unstructured >>>>> case, where we always need specialized spaces, you can >>>>> use something like >>>>> >>>>> PetscObject pressure; >>>>> MatNullSpace nullSpacePres; >>>>> >>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, >>>>> 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>>>> nullSpacePres);CHKERRQ(ierr); >>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>> >>>>> and then DMGetSubDM() uses this information to attach the null space >>>>> to the IS that is created using the information in the PetscSection. >>>>> If you use a PetscSection to set the data layout over the DMDA, I >>>>> think this works correctly, but this has not been tested at all and is very >>>>> new code. 
Eventually, I think we want all DMs to use this mechanism, >>>>> but we are still working it out. >>>>> >>>> >>>> Currently I do not use PetscSection. If this makes a cleaner approach, >>>> I'd try it too but may a bit later (right now I'd like test my model with a >>>> quickfix even if it means a little dirty code!) >>>> >>>> >>>>> >>>>> Bottom line: For custom null spaces using the default layout in DMDA, >>>>> you need to take apart the PCFIELDSPLIT after it has been setup, >>>>> which is somewhat subtle. You need to call KSPSetUp() and then reach >>>>> in and get the PC, and the subKSPs. I don't like this at all, but we >>>>> have not reorganized that code (which could be very simple and >>>>> inflexible since its very structured). >>>>> >>>> >>>> So I tried to get this approach working but I could not succeed and >>>> encountered some errors. Here is a code snippet: >>>> >>>> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >>>> velocity components and 1 pressure comp.) >>>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>> ierr = >>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>> //I've the mNullSpaceSystem based on mDa, that contains a null space basis >>>> for the complete system. >>>> ierr = >>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>> //This I expect would register these options I give:-pc_type fieldsplit >>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>> //-pc_fieldsplit_1_fields 3 >>>> >>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>> >>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >>>> obtained from the options (fieldsplit) >>>> >>>> ierr = >>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>> >>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>> >>>> KSP *kspSchur; >>>> PetscInt kspSchurPos = 1; >>>> ierr = >>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>> ierr = >>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>> //The null space is the one that correspond to only pressure nodes, created >>>> using the mDaPressure. >>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>> >>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>> >>> >>> Sorry, you need to return to the old DMDA behavior, so you want >>> >>> -pc_fieldsplit_dm_splits 0 >>> >> >> Thanks, with this it seems I can attach the null space properly, but I >> have a question regarding whether the Schur complement ksp solver is >> actually using the preconditioner matrix I provide. >> When using -ksp_view, the outer level pc object of type fieldsplit does >> report that: "Preconditioner for the Schur complement formed from user >> provided matrix", but in the KSP solver for Schur complement S, the pc >> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >> matrix I provide. Am I missing something here ? 
>> Below are the relevant commented code snippet and the output of the >> -ksp_view >> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >> > > If ILU does not error, it means it is using your matrix, because the Schur > complement matrix cannot be factored, and FS says it is using your matrix. > Thanks Matt! By the way, what do these statements mean in -ksp_view results: not using I-node routines or using I-node routines: found 729 nodes, limit used is 5 > > Matt > > >> Code snippet: >> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The >> nullspace for the whole system >> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up mKsp >> with the options provided with fieldsplit and the fields associated with >> the two splits. >> >> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); //Get >> the fieldsplit pc set up from the options >> >> ierr = >> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >> //Use mPcForSc as the preconditioner for Schur Complement >> >> KSP *kspSchur; >> PetscInt kspSchurPos = 1; >> ierr = >> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >> ierr = >> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >> //Attach the null-space for the Schur complement ksp solver. >> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >> >> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >> >> >> >> the output of the -ksp_view >> KSP Object: 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> has attached null space >> using PRECONDITIONED norm type for convergence test >> PC Object: 1 MPI processes >> type: fieldsplit >> FieldSplit with Schur preconditioner, blocksize = 4, factorization >> FULL >> Preconditioner for the Schur complement formed from user provided >> matrix >> Split info: >> Split number 0 Fields 0, 1, 2 >> Split number 1 Fields 3 >> KSP solver for A00 block >> KSP Object: (fieldsplit_0_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_0_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=2187 >> package used to perform factorization: petsc >> total: nonzeros=140625, allocated nonzeros=140625 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 729 nodes, limit used is 5 >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=2187 >> total: nonzeros=140625, allocated nonzeros=140625 >> total number of 
mallocs used during MatSetValues calls =0 >> using I-node routines: found 729 nodes, limit used is 5 >> KSP solver for S = A11 - A10 inv(A00) A01 >> KSP Object: (fieldsplit_1_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> has attached null space >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_1_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=729, cols=729 >> package used to perform factorization: petsc >> total: nonzeros=15625, allocated nonzeros=15625 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix followed by preconditioner matrix: >> Matrix Object: 1 MPI processes >> type: schurcomplement >> rows=729, cols=729 >> Schur complement A11 - A10 inv(A00) A01 >> A11 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=729, cols=729 >> total: nonzeros=15625, allocated nonzeros=15625 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> A10 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=729, cols=2187 >> total: nonzeros=46875, allocated nonzeros=46875 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP of A00 >> KSP Object: (fieldsplit_0_) 1 >> MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) >> Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_0_) 1 MPI >> processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=2187 >> package used to perform factorization: petsc >> total: nonzeros=140625, allocated nonzeros=140625 >> total number of mallocs used during MatSetValues >> calls =0 >> using I-node routines: found 729 nodes, limit >> used is 5 >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=2187 >> total: nonzeros=140625, allocated nonzeros=140625 >> total number of mallocs used during MatSetValues calls >> =0 >> using I-node routines: found 729 nodes, limit used is >> 5 >> A01 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=729 >> total: nonzeros=46875, allocated nonzeros=46875 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 729 nodes, limit used is 5 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=729, cols=729 >> total: nonzeros=15625, allocated nonzeros=15625 >> total number of mallocs used during MatSetValues calls =0 >> not 
using I-node routines >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2916, cols=2916, bs=4 >> total: nonzeros=250000, allocated nonzeros=250000 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 729 nodes, limit used is 5 >> >> >> >> >> >>> >>> or >>> >>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> The errors I get when running with options: -pc_type fieldsplit >>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>> -pc_fieldsplit_1_fields 3 >>>> [0]PETSC ERROR: --------------------- Error Message >>>> ------------------------------------ >>>> [0]PETSC ERROR: No support for this operation for this object type! >>>> [0]PETSC ERROR: Support only implemented for 2d! >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named edwards >>>> by bkhanal Tue Aug 6 17:35:30 2013 >>>> [0]PETSC ERROR: Libraries linked from >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>> -with-clanguage=cxx --download-hypre=1 >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>> [0]PETSC ERROR: solveModel() line 181 in >>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>> WARNING! There are options you set that were not used! >>>> WARNING! could be spelling mistake, etc! >>>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>>> >>>>> Matt >>>>> >>>>> >>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. 
>>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL:

From jedbrown at mcs.anl.gov Wed Aug 7 10:37:15 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Wed, 07 Aug 2013 09:37:15 -0600
Subject: [petsc-users] how to know the original global index after the partition.
In-Reply-To: <8dabd0b.c996.14058621e8e.Coremail.ztdepyahoo@163.com>
References: <8dabd0b.c996.14058621e8e.Coremail.ztdepyahoo@163.com>
Message-ID: <87ob995x78.fsf@mcs.anl.gov>

??? writes:
> In ISPartitioningToNumbering(IS part,IS *is), is defines the index set
> that gives the global numbers on each part, but how do I get the
> original global index?

The IS maps from original indices to new indices, so I don't know what you're asking.
-------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL:

From abhyshr at mcs.anl.gov Wed Aug 7 11:18:52 2013
From: abhyshr at mcs.anl.gov (Shri)
Date: Wed, 7 Aug 2013 11:18:52 -0500
Subject: [petsc-users] how to know the original global index after the partition.
In-Reply-To: <8dabd0b.c996.14058621e8e.Coremail.ztdepyahoo@163.com>
References: <8dabd0b.c996.14058621e8e.Coremail.ztdepyahoo@163.com>
Message-ID: <37BC6BC4-0FCC-4E68-8653-805324124692@mcs.anl.gov>

One way to do this is to call ISPartitioningCount() first to get the number of indices on each process and then call ISInvertPermutation() to get the IS that has the original global indices. Here's what I've done in one of my codes.

/* Convert the processor mapping IS to new global numbering */
ierr = ISPartitioningToNumbering(is,&is_globalnew);CHKERRQ(ierr);
PetscInt *nloc;
/* Convert new global numbering to old global numbering */
ierr = PetscMalloc(nps*sizeof(PetscInt),&nloc);CHKERRQ(ierr);
ierr = ISPartitioningCount(is,nps,nloc);CHKERRQ(ierr); // nps is the number of processors.
ierr = ISDestroy(&is);CHKERRQ(ierr);
ierr = ISInvertPermutation(is_globalnew,nloc[rank],&is);CHKERRQ(ierr);

The resultant "is" is the index set in the old global numbering.

Shri

On Aug 7, 2013, at 5:46 AM, ??? wrote:
> In ISPartitioningToNumbering(IS part,IS *is), is defines the index set
> that gives the global numbers on each part, but how do I get the
> original global index?

-------------- next part -------------- An HTML attachment was scrubbed... URL:
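(For reference, a self-contained sketch of the recipe above; it assumes `comm` is the communicator and that the partitioning index set `is` came from MatPartitioningApply(), so these names are placeholders rather than code from the list:

IS          is_globalnew,is_old;
PetscInt    *nloc;
PetscMPIInt nps,rank;

ierr = MPI_Comm_size(comm,&nps);CHKERRQ(ierr);
ierr = MPI_Comm_rank(comm,&rank);CHKERRQ(ierr);
ierr = ISPartitioningToNumbering(is,&is_globalnew);CHKERRQ(ierr);  /* old index -> new global number */
ierr = PetscMalloc(nps*sizeof(PetscInt),&nloc);CHKERRQ(ierr);
ierr = ISPartitioningCount(is,nps,nloc);CHKERRQ(ierr);             /* points owned by each process in the new numbering */
ierr = ISInvertPermutation(is_globalnew,nloc[rank],&is_old);CHKERRQ(ierr);
/* entry i of is_old is the original (pre-partitioning) global index of the
   i-th point owned by this process in the new numbering */
ierr = PetscFree(nloc);CHKERRQ(ierr);
ierr = ISDestroy(&is_globalnew);CHKERRQ(ierr);
)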
From zhenglun.wei at gmail.com Wed Aug 7 12:30:35 2013
From: zhenglun.wei at gmail.com (Alan)
Date: Wed, 07 Aug 2013 12:30:35 -0500
Subject: [petsc-users] KSP solver for single process
In-Reply-To: <4917D537-34AC-48B2-BDF5-CDEB79165DA1@mcs.anl.gov>
References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov> <520154F3.2010203@gmail.com> <4917D537-34AC-48B2-BDF5-CDEB79165DA1@mcs.anl.gov>
Message-ID: <5202843B.5050900@gmail.com>

Hi Dr. Smith,
Thank you so much for your reply. I tried to use the geometric multigrid to speed up the KSP solver with the setup:
mpiexec -np 1 ./ex29 -ksp_type cg -pc_type mg -da_refine 2 -ksp_rtol 1.0e-7
It now achieves almost the same computational rate as mudpack. However, I have a few questions:
1. After KSPCreate(), I used DMDACreate2d() and DMDASetUniformCoordinates() to build a uniform Cartesian mesh from PETSc. If I input imax and jmax to DMDACreate2d() as the global dimensions of the grid, the actual numbers of grid points in the x- and y-directions are imax and jmax, respectively, for the code with PC = GAMG, while they are (imax-1)*4 and (jmax-1)*4, respectively, for the code with PC = MG and -da_refine 2. Is this normal? Does this indicate that the imax, jmax I input for the code with PC = MG are the global dimensions of the coarsest level in the multigrid?
2. Is there any PETSc routine that I can use in my code to detect the type of my preconditioner?
3. Is there any PETSc routine that I can use to find out the value of -da_refine when MG is used?
4. What is 'PCMGType'? Should I just keep it as the default? In the original makefile for src/ksp/ksp/examples/tutorials/ex29.c, pc_mg_type was 'full'. I tried it; it is slightly slower than the default setting.
5. What other settings can I play with to further improve the computational rate?

thanks,
Alan

> Alan, > > If you can use MUDPACK then you can also use PETSc's geometric multigrid, both sequential and parallel and its performance should be fairly close to mudpack on one process. > > Barry > > On Aug 6, 2013, at 2:56 PM, Alan wrote: > >> Thanks for replies. Here I attached the log_summary for the large and small problems. The DoFs for the large problem is 4 times of that for the small problem. Few observations are listed here: >> 1, the total number of iterations does not change much from the small problem to the large one; >> 2, the time elapsed for KSPSolve() for the large problem is less than 4 times of that for the small problem; >> 3, the time elapsed for PCSet() for the large problem is more than 10 times of that for the small problem; >> 4, the time elapsed for PCGAMGProl_AGG for the large problem is more than 20 times of that for the small problem; >> In my code, I have solved the Poisson equation for twice with difference RHS; however, the observation above is almost consistent for these two times. >> Do these observation indicate that I should switch my PC from GAMG to MG for solving Poisson equation in a single process? >> >> best, >> Alan >> >>> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp wrote: >>> Hi Alan, >>> >>> please use -log_summary to get profiling information on the run. What is >>> the bottleneck? Is it the number of solver iterations increasing >>> significantly? If so, consider changing the preconditioner options (more >>> levels!). I don't expect a direct solver to be any faster in the 180k >>> case for a Poisson problem. >>> >>> Mudpack is geometric multigrid: http://www2.cisl.ucar.edu/resources/legacy/mudpack >>> This should be faster. >>> >>> Matt >>> >>> Best regards, >>> Karli >>> >>> >>> On 08/06/2013 02:22 PM, Alan wrote: >>>> Dear all, >>>> I hope you're having a nice day. >>>> I have a quick question on solving Poisson equation with KSP solvers >>>> (/src/ksp/ksp/example/tutorial/ex29.c).
Currently, I run this solver with: >>>>> -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it >>>>> 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7 >>>>> It performs very well in parallel computation and scalability is fine. >>>>> However, if I run it with a single process, the KSP solver is much >>>>> slower than direct ones, i.e. Mudpack. Briefly, the speed difference >>>>> between the KSP solver and the direct solver is negligible on dealing >>>>> with small problems (i.e.36k DoFs ) but becomes very huge for moderate >>>>> large problems (i.e. 180k DoFs). Although the direct solver inherently >>>>> has better performance for moderate large problems in the single >>>>> process, I wonder if any setup or approach can improve the performance >>>>> of this KSP Poisson solver with the single process? or even make it >>>>> obtain competitive speed (a little bit slower is fine) against direct >>>>> solvers. >>>>> >>>>> thanks in advance, >>>>> Alan >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >>>> -- Norbert Wiener >>> >

From bsmith at mcs.anl.gov Wed Aug 7 13:49:14 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Wed, 7 Aug 2013 13:49:14 -0500
Subject: [petsc-users] KSP solver for single process
In-Reply-To: <5202843B.5050900@gmail.com>
References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov> <520154F3.2010203@gmail.com> <4917D537-34AC-48B2-BDF5-CDEB79165DA1@mcs.anl.gov> <5202843B.5050900@gmail.com>
Message-ID: <0317F74B-3790-47EA-85C0-A1F891ED782D@mcs.anl.gov>

On Aug 7, 2013, at 12:30 PM, Alan wrote:
> Hi Dr. Smith,
> Thank you so much for your reply. I tried to use the geometric multigrid to speed up the KSP solver with the setup:
> mpiexec -np 1 ./ex29 -ksp_type cg -pc_type mg -da_refine 2 -ksp_rtol 1.0e-7
> It now achieves almost the same computational rate as mudpack. However, I have a few questions:
> 1. After KSPCreate(), I used DMDACreate2d() and DMDASetUniformCoordinates() to build a uniform Cartesian mesh from PETSc. If I input imax and jmax to DMDACreate2d() as the global dimensions of the grid, the actual numbers of grid points in the x- and y-directions are imax and jmax, respectively, for the code with PC = GAMG, while they are (imax-1)*4 and (jmax-1)*4, respectively, for the code with PC = MG and -da_refine 2. Is this normal? Does this indicate that the imax, jmax I input for the code with PC = MG are the global dimensions of the coarsest level in the multigrid?

   The option -da_refine n causes the DA you provided to be refined n times, giving you the final grid. Each refinement is a factor of 2 in each direction, so yes, the fine grid would be (imax-1)*4 and (jmax-1)*4. You do not need to use -da_refine n; you can just set imax and jmax and use -pc_mg_levels p, where p is the number of multigrid levels you wish to use. Note that imax and jmax must be large enough to be coarsened p times and must have appropriate integer values that can be coarsened.

> 2. Is there any PETSc routine that I can use in my code to detect the type of my preconditioner?

   PCGetType()

> 3. Is there any PETSc routine that I can use to find out the value of -da_refine when MG is used?

   PCMGGetLevels() tells you how many levels of multigrid it is using. -ksp_view shows full details on the solver being used.
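(By way of illustration, a minimal sketch of those two queries; `ksp` is assumed to be the already-configured KSP object, and the calls should come after KSPSetFromOptions()/KSPSetUp():

PC        pc;
PCType    pctype;
PetscBool ismg;

ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCGetType(pc,&pctype);CHKERRQ(ierr);                        /* e.g. "mg" or "gamg" */
ierr = PetscObjectTypeCompare((PetscObject)pc,PCMG,&ismg);CHKERRQ(ierr);
if (ismg) {
  PetscInt nlevels;
  ierr = PCMGGetLevels(pc,&nlevels);CHKERRQ(ierr);                 /* with -da_refine n this is typically n+1 */
  ierr = PetscPrintf(PETSC_COMM_WORLD,"pc type %s, %D multigrid levels\n",pctype,nlevels);CHKERRQ(ierr);
}
)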
> 4. What is 'PCMGType'? Should I just keep it as the default? In the original makefile for src/ksp/ksp/examples/tutorials/ex29.c, pc_mg_type was 'full'. I tried it; it is slightly slower than the default setting.

   You can use whatever is faster.

> 5. What other settings can I play with to further improve the computational rate?

   There are many options for multigrid, in particular how many levels you use and which smoother you use on each level. What results in the fastest solver depends on the machine, the exact problem you are solving, and how many processes you are using. The defaults in PETSc 3.4 should be reasonably good.

   Barry

> > thanks, > Alan > >> Alan, >> >> If you can use MUDPACK then you can also use PETSc's geometric multigrid, both sequential and parallel and its performance should be fairly close to mudpack on one process. >> >> Barry >> >> On Aug 6, 2013, at 2:56 PM, Alan wrote: >> >>> Thanks for replies. Here I attached the log_summary for the large and small problems. The DoFs for the large problem is 4 times of that for the small problem. Few observations are listed here: >>> 1, the total number of iterations does not change much from the small problem to the large one; >>> 2, the time elapsed for KSPSolve() for the large problem is less than 4 times of that for the small problem; >>> 3, the time elapsed for PCSet() for the large problem is more than 10 times of that for the small problem; >>> 4, the time elapsed for PCGAMGProl_AGG for the large problem is more than 20 times of that for the small problem; >>> In my code, I have solved the Poisson equation for twice with difference RHS; however, the observation above is almost consistent for these two times. >>> Do these observation indicate that I should switch my PC from GAMG to MG for solving Poisson equation in a single process? >>> >>> best, >>> Alan >>> >>>> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp wrote: >>>> Hi Alan, >>>> >>>> please use -log_summary to get profiling information on the run. What is >>>> the bottleneck? Is it the number of solver iterations increasing >>>> significantly? If so, consider changing the preconditioner options (more >>>> levels!). I don't expect a direct solver to be any faster in the 180k >>>> case for a Poisson problem. >>>> >>>> Mudpack is geometric multigrid: http://www2.cisl.ucar.edu/resources/legacy/mudpack >>>> This should be faster. >>>> >>>> Matt >>>> Best regards, >>>> Karli >>>> >>>> >>>> On 08/06/2013 02:22 PM, Alan wrote: >>>>> Dear all, >>>>> I hope you're having a nice day. >>>>> I have a quick question on solving Poisson equation with KSP solvers >>>>> (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this solver with: >>>>> -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it >>>>> 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7 >>>>> It performs very well in parallel computation and scalability is fine. >>>>> However, if I run it with a single process, the KSP solver is much >>>>> slower than direct ones, i.e. Mudpack. Briefly, the speed difference >>>>> between the KSP solver and the direct solver is negligible on dealing >>>>> with small problems (i.e.36k DoFs ) but becomes very huge for moderate >>>>> large problems (i.e. 180k DoFs). Although the direct solver inherently >>>>> has better performance for moderate large problems in the single >>>>> process, I wonder if any setup or approach can improve the performance >>>>> of this KSP Poisson solver with the single process? or even make it >>>>> obtain competitive speed (a little bit slower is fine) against direct >>>>> solvers.
>>>>> >>>>> thanks in advance, >>>>> Alan >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >>>> -- Norbert Wiener >>> > From knepley at gmail.com Wed Aug 7 15:50:52 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 7 Aug 2013 15:50:52 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Wed, Aug 7, 2013 at 7:26 AM, Bishesh Khanal wrote: > > > > On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: > >> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote: >>> >>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: >>>>> >>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal >>>>>>> > wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley >>>>>>>> > wrote: >>>>>>>>> >>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown >>>>>>>>>> > wrote: >>>>>>>>>>> >>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>> >>>>>>>>>>>> > Now, I implemented two different approaches, each for both 2D >>>>>>>>>>>> and 3D, in >>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>>>>>> solving it for >>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>> > I use staggered grid with p on cell centers, and components >>>>>>>>>>>> of v on cell >>>>>>>>>>>> > faces. Similar split up of K to cell center and faces to >>>>>>>>>>>> account for the >>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>> >>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>> discretization of >>>>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>>>> recommend >>>>>>>>>>>> starting with PCFieldSplit with Schur complement reduction >>>>>>>>>>>> (make that >>>>>>>>>>>> work first, then switch to block preconditioner). You can use >>>>>>>>>>>> PCLSC or >>>>>>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>>>>>> containing >>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. This >>>>>>>>>>>> diagonal >>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>>>>>> discretization) approximation of the Schur complement. The >>>>>>>>>>>> velocity >>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>> docs (follow papers as appropriate) and let us know if you get >>>>>>>>>>>> stuck. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal matrix >>>>>>>>>>> to use as the preconditioner for the Schur complement solve step as you >>>>>>>>>>> suggested. 
I've few questions about the ways to implement this in Petsc: >>>>>>>>>>> A naive approach that I can think of would be to create a vector >>>>>>>>>>> with its components as reciprocal viscosities of the cell centers >>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>>>>> dof as explained before. >>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the >>>>>>>>>>> S_p matrix would correspond to only pressure space. Should the distribution >>>>>>>>>>> of the rows of S_p among different processor not correspond to the >>>>>>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>>>>>> where S = A11 inv(A00) A01 ? >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. >>>>>>>>>> The P distribution is the same as the entire problem divided by 4. >>>>>>>>>> >>>>>>>>> >>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size but >>>>>>>>> with dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>>>>> a particular function to make these two DMDA have identical distribution of >>>>>>>>> the vertices ? >>>>>>>>> >>>>>>>> >>>>>>>> Yes. >>>>>>>> >>>>>>>> >>>>>>>>> Even then I think there might be a problem due to the presence >>>>>>>>> of "fictitious pressure vertices". The system matrix (A) contains an >>>>>>>>> identity corresponding to these fictitious pressure nodes, thus when using >>>>>>>>> a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>>>>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>>>>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>> >>>>>>>> >>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>> >>>>>>> >>>>>>> How can I set this split in the code itself without giving it as a >>>>>>> command line option when the system matrix is assembled from the DMDA for >>>>>>> the whole system with 4 dofs. (i.e. *without* using the DMComposite >>>>>>> or *without* using the nested block matrices to assemble different >>>>>>> blocks separately and then combine them together). >>>>>>> I need the split to get access to the fieldsplit_1_ksp in my code, >>>>>>> because not using detect_saddle_point means I cannot use >>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>> >>>>>> >>>>>> This is currently a real problem with the DMDA. 
In the unstructured >>>>>> case, where we always need specialized spaces, you can >>>>>> use something like >>>>>> >>>>>> PetscObject pressure; >>>>>> MatNullSpace nullSpacePres; >>>>>> >>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, >>>>>> 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>>>>> nullSpacePres);CHKERRQ(ierr); >>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>> >>>>>> and then DMGetSubDM() uses this information to attach the null space >>>>>> to the IS that is created using the information in the PetscSection. >>>>>> If you use a PetscSection to set the data layout over the DMDA, I >>>>>> think this works correctly, but this has not been tested at all and is very >>>>>> new code. Eventually, I think we want all DMs to use this mechanism, >>>>>> but we are still working it out. >>>>>> >>>>> >>>>> Currently I do not use PetscSection. If this makes a cleaner approach, >>>>> I'd try it too but may a bit later (right now I'd like test my model with a >>>>> quickfix even if it means a little dirty code!) >>>>> >>>>> >>>>>> >>>>>> Bottom line: For custom null spaces using the default layout in DMDA, >>>>>> you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>> which is somewhat subtle. You need to call KSPSetUp() and then reach >>>>>> in and get the PC, and the subKSPs. I don't like this at all, but we >>>>>> have not reorganized that code (which could be very simple and >>>>>> inflexible since its very structured). >>>>>> >>>>> >>>>> So I tried to get this approach working but I could not succeed and >>>>> encountered some errors. Here is a code snippet: >>>>> >>>>> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >>>>> velocity components and 1 pressure comp.) >>>>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>> ierr = >>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>> //I've the mNullSpaceSystem based on mDa, that contains a null space basis >>>>> for the complete system. >>>>> ierr = >>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>> //-pc_fieldsplit_1_fields 3 >>>>> >>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>> >>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >>>>> obtained from the options (fieldsplit) >>>>> >>>>> ierr = >>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>> >>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>> >>>>> KSP *kspSchur; >>>>> PetscInt kspSchurPos = 1; >>>>> ierr = >>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>> ierr = >>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>> //The null space is the one that correspond to only pressure nodes, created >>>>> using the mDaPressure. 
>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>> >>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>> >>>> >>>> Sorry, you need to return to the old DMDA behavior, so you want >>>> >>>> -pc_fieldsplit_dm_splits 0 >>>> >>> >>> Thanks, with this it seems I can attach the null space properly, but I >>> have a question regarding whether the Schur complement ksp solver is >>> actually using the preconditioner matrix I provide. >>> When using -ksp_view, the outer level pc object of type fieldsplit does >>> report that: "Preconditioner for the Schur complement formed from user >>> provided matrix", but in the KSP solver for Schur complement S, the pc >>> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >>> matrix I provide. Am I missing something here ? >>> Below are the relevant commented code snippet and the output of the >>> -ksp_view >>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>> >> >> If ILU does not error, it means it is using your matrix, because the >> Schur complement matrix cannot be factored, and FS says it is using your >> matrix. >> > > Thanks Matt! By the way, what do these statements mean in -ksp_view > results: > not using I-node routines > or > using I-node routines: found 729 nodes, limit used is 5 > This is an optimization in the sparse matrix storage format for rows with identical nonzero structure. Matt > >> Matt >> >> >>> Code snippet: >>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The >>> nullspace for the whole system >>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up mKsp >>> with the options provided with fieldsplit and the fields associated with >>> the two splits. >>> >>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); //Get >>> the fieldsplit pc set up from the options >>> >>> ierr = >>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>> //Use mPcForSc as the preconditioner for Schur Complement >>> >>> KSP *kspSchur; >>> PetscInt kspSchurPos = 1; >>> ierr = >>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>> ierr = >>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>> //Attach the null-space for the Schur complement ksp solver. 
>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>> >>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>> >>> >>> >>> the output of the -ksp_view >>> KSP Object: 1 MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>> left preconditioning >>> has attached null space >>> using PRECONDITIONED norm type for convergence test >>> PC Object: 1 MPI processes >>> type: fieldsplit >>> FieldSplit with Schur preconditioner, blocksize = 4, factorization >>> FULL >>> Preconditioner for the Schur complement formed from user provided >>> matrix >>> Split info: >>> Split number 0 Fields 0, 1, 2 >>> Split number 1 Fields 3 >>> KSP solver for A00 block >>> KSP Object: (fieldsplit_0_) 1 MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>> left preconditioning >>> using PRECONDITIONED norm type for convergence test >>> PC Object: (fieldsplit_0_) 1 MPI processes >>> type: ilu >>> ILU: out-of-place factorization >>> 0 levels of fill >>> tolerance for zero pivot 2.22045e-14 >>> using diagonal shift on blocks to prevent zero pivot >>> matrix ordering: natural >>> factor fill ratio given 1, needed 1 >>> Factored matrix follows: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2187, cols=2187 >>> package used to perform factorization: petsc >>> total: nonzeros=140625, allocated nonzeros=140625 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 729 nodes, limit used is 5 >>> linear system matrix = precond matrix: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2187, cols=2187 >>> total: nonzeros=140625, allocated nonzeros=140625 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 729 nodes, limit used is 5 >>> KSP solver for S = A11 - A10 inv(A00) A01 >>> KSP Object: (fieldsplit_1_) 1 MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>> left preconditioning >>> has attached null space >>> using PRECONDITIONED norm type for convergence test >>> PC Object: (fieldsplit_1_) 1 MPI processes >>> type: ilu >>> ILU: out-of-place factorization >>> 0 levels of fill >>> tolerance for zero pivot 2.22045e-14 >>> using diagonal shift on blocks to prevent zero pivot >>> matrix ordering: natural >>> factor fill ratio given 1, needed 1 >>> Factored matrix follows: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=729, cols=729 >>> package used to perform factorization: petsc >>> total: nonzeros=15625, allocated nonzeros=15625 >>> total number of mallocs used during MatSetValues calls =0 >>> not using I-node routines >>> linear system matrix followed by preconditioner matrix: >>> Matrix Object: 1 MPI processes >>> type: schurcomplement >>> rows=729, cols=729 >>> Schur complement A11 - A10 inv(A00) A01 >>> A11 >>> Matrix Object: 1 MPI processes >>> type: 
seqaij >>> rows=729, cols=729 >>> total: nonzeros=15625, allocated nonzeros=15625 >>> total number of mallocs used during MatSetValues calls =0 >>> not using I-node routines >>> A10 >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=729, cols=2187 >>> total: nonzeros=46875, allocated nonzeros=46875 >>> total number of mallocs used during MatSetValues calls =0 >>> not using I-node routines >>> KSP of A00 >>> KSP Object: (fieldsplit_0_) 1 >>> MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) >>> Gram-Schmidt Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, >>> divergence=10000 >>> left preconditioning >>> using PRECONDITIONED norm type for convergence test >>> PC Object: (fieldsplit_0_) 1 >>> MPI processes >>> type: ilu >>> ILU: out-of-place factorization >>> 0 levels of fill >>> tolerance for zero pivot 2.22045e-14 >>> using diagonal shift on blocks to prevent zero pivot >>> matrix ordering: natural >>> factor fill ratio given 1, needed 1 >>> Factored matrix follows: >>> Matrix Object: 1 MPI >>> processes >>> type: seqaij >>> rows=2187, cols=2187 >>> package used to perform factorization: petsc >>> total: nonzeros=140625, allocated nonzeros=140625 >>> total number of mallocs used during MatSetValues >>> calls =0 >>> using I-node routines: found 729 nodes, limit >>> used is 5 >>> linear system matrix = precond matrix: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2187, cols=2187 >>> total: nonzeros=140625, allocated nonzeros=140625 >>> total number of mallocs used during MatSetValues calls >>> =0 >>> using I-node routines: found 729 nodes, limit used >>> is 5 >>> A01 >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2187, cols=729 >>> total: nonzeros=46875, allocated nonzeros=46875 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 729 nodes, limit used is 5 >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=729, cols=729 >>> total: nonzeros=15625, allocated nonzeros=15625 >>> total number of mallocs used during MatSetValues calls =0 >>> not using I-node routines >>> linear system matrix = precond matrix: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2916, cols=2916, bs=4 >>> total: nonzeros=250000, allocated nonzeros=250000 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 729 nodes, limit used is 5 >>> >>> >>> >>> >>> >>>> >>>> or >>>> >>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> The errors I get when running with options: -pc_type fieldsplit >>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>> -pc_fieldsplit_1_fields 3 >>>>> [0]PETSC ERROR: --------------------- Error Message >>>>> ------------------------------------ >>>>> [0]PETSC ERROR: No support for this operation for this object type! >>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>> [0]PETSC ERROR: >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>> [0]PETSC ERROR: >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named edwards >>>>> by bkhanal Tue Aug 6 17:35:30 2013 >>>>> [0]PETSC ERROR: Libraries linked from >>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>> -with-clanguage=cxx --download-hypre=1 >>>>> [0]PETSC ERROR: >>>>> ------------------------------------------------------------------------ >>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>> WARNING! There are options you set that were not used! >>>>> WARNING! could be spelling mistake, etc! >>>>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
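To condense the recipe this thread arrives at: switch off the DM-defined splits, set the solver up, and only then fetch the Schur-complement sub-KSP and attach the pressure null space. A minimal sketch, where ksp and nullSpacePressure stand for the application's own objects:

   PC             pc;
   KSP            *subksp;
   PetscInt       nsplits;
   PetscErrorCode ierr;

   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);                  /* -pc_type fieldsplit -pc_fieldsplit_type schur
                                                                    -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 */
   ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
   ierr = PCFieldSplitSetDMSplits(pc,PETSC_FALSE);CHKERRQ(ierr); /* same effect as -pc_fieldsplit_dm_splits 0 */
   ierr = KSPSetUp(ksp);CHKERRQ(ierr);                           /* the sub-KSPs only exist after setup */
   ierr = PCFieldSplitGetSubKSP(pc,&nsplits,&subksp);CHKERRQ(ierr);
   ierr = KSPSetNullSpace(subksp[1],nullSpacePressure);CHKERRQ(ierr); /* split 1 = Schur-complement solve */
   ierr = PetscFree(subksp);CHKERRQ(ierr);

The user-provided matrix for preconditioning the Schur complement can be assembled on a dof=1 DMDA that shares the grid of the dof=4 system DMDA, which gives the matching parallel layout discussed above. A sketch, with daP as that DMDA and etaAt() a hypothetical cell-centre viscosity lookup supplied by the application:

   Mat            Sp;
   Vec            invEta;
   PetscScalar    ***a;
   PetscInt       i,j,k,xs,ys,zs,xm,ym,zm;
   PetscErrorCode ierr;

   ierr = DMCreateMatrix(daP,MATAIJ,&Sp);CHKERRQ(ierr);          /* dof=1 DMDA, same grid as the system DMDA */
   ierr = DMCreateGlobalVector(daP,&invEta);CHKERRQ(ierr);
   ierr = DMDAGetCorners(daP,&xs,&ys,&zs,&xm,&ym,&zm);CHKERRQ(ierr);
   ierr = DMDAVecGetArray(daP,invEta,&a);CHKERRQ(ierr);
   for (k=zs; k<zs+zm; k++)
     for (j=ys; j<ys+ym; j++)
       for (i=xs; i<xs+xm; i++)
         a[k][j][i] = 1.0/etaAt(i,j,k);                          /* reciprocal cell-centre viscosity */
   ierr = DMDAVecRestoreArray(daP,invEta,&a);CHKERRQ(ierr);
   ierr = MatDiagonalSet(Sp,invEta,INSERT_VALUES);CHKERRQ(ierr); /* diagonal approximation of inv(S) */
   ierr = VecDestroy(&invEta);CHKERRQ(ierr);

The assembled Sp is then what PCFieldSplitSchurPrecondition(pc, PC_FIELDSPLIT_SCHUR_PRE_USER, Sp) expects, as in the snippets above.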
From bisheshkh at gmail.com Thu Aug 8 06:32:21 2013
From: bisheshkh at gmail.com (Bishesh Khanal)
Date: Thu, 8 Aug 2013 13:32:21 +0200
Subject: [petsc-users] creating a global vector of one particular field from a global vector created from multiple dof dmda
Message-ID:

Hi all,
Let's say I have two DMDAs with identical size but with different dofs. E.g. da1 with dof=4; da2 with dof=1;
I have global vectors associated with each one of them, say, gv1 and gv2 respectively.
How can I copy/scatter values of one particular field from gv1 to gv2 ?
Looking at the manual it seems I should be able to use something like the following:
VecScatterCreate(Vec gv1, IS iGv1, Vec gv2, IS iGv2, VecScatter *ctx)
VecScatterBegin(VecScatter ctx, Vec gv1, Vec gv2, INSERT_VALUES, SCATTER_FORWARD);
But I do not know how to get the index sets iGv1 and iGv2 in this case.

From knepley at gmail.com Thu Aug 8 06:43:20 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 8 Aug 2013 06:43:20 -0500
Subject: [petsc-users] creating a global vector of one particular field from a global vector created from multiple dof dmda
In-Reply-To: References: Message-ID:

On Thu, Aug 8, 2013 at 6:32 AM, Bishesh Khanal wrote:

> Hi all,
> Let's say I have two DMDAs with identical size but with different dofs. E.g. da1 with dof=4; da2 with dof=1;
> I have global vectors associated with each one of them, say, gv1 and gv2 respectively.
> How can I copy/scatter values of one particular field from gv1 to gv2 ?
> Looking at the manual it seems I should be able to use something like the following:
> VecScatterCreate(Vec gv1, IS iGv1, Vec gv2, IS iGv2, VecScatter *ctx)
> VecScatterBegin(VecScatter ctx, Vec gv1, Vec gv2, INSERT_VALUES, SCATTER_FORWARD);
> But I do not know how to get the index sets iGv1 and iGv2 in this case.

http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Vec/VecStrideGather.html

   Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

From pengxwang at hotmail.com Thu Aug 8 13:29:03 2013
From: pengxwang at hotmail.com (Roc Wang)
Date: Thu, 8 Aug 2013 13:29:03 -0500
Subject: [petsc-users] implementation of multi-level grid in petsc
Message-ID:

Hi,

   I am working on a multi-level grid for the Poisson equation. I need to refine some sub-regions of the computational domain. To do this, I plan to build some boxes (patches) based on the coarsest level. I am using DM to manage the data. I found there is a new function DMPatchCreate() in version 3.4. Is this function the right one I should use for the refined region? If it is not, which ones should I use?

   My proposed approach is to start with the code dm/impls/patch/examples/tests/ex1.c and then follow the code /dm/examples/tutorials/ex65dm.c. Is this approach the right way to my goal?

   In addition, I need to use not only the nodes but also the cells containing the nodes. Should I use DMMesh to create the cells? I noticed DMMesh is mainly for unstructured grids, but I didn't find any other class that implements structured cells. Can anybody give me some suggestions on multi-level grids, or let me know which examples I should start with? Thanks.
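Picking up the VecStrideGather() pointer from the field-extraction question earlier in this digest, a one-line sketch using the names from that question (da1/gv1 with dof=4 and da2/gv2 with dof=1 on the same grid; field 3 is chosen here just as an example):

   PetscErrorCode ierr;

   ierr = VecStrideGather(gv1,3,gv2,INSERT_VALUES);CHKERRQ(ierr);  /* gv2 <- field 3 of gv1 */
   ierr = VecStrideScatter(gv2,3,gv1,INSERT_VALUES);CHKERRQ(ierr); /* and the reverse direction */

No VecScatter or index sets are needed; the stride routines work directly off the block size (= dof) that a DMDA global vector carries.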
From knepley at gmail.com Thu Aug 8 14:03:53 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 8 Aug 2013 14:03:53 -0500
Subject: [petsc-users] implementation of multi-level grid in petsc
In-Reply-To: References: Message-ID:

On Thu, Aug 8, 2013 at 1:29 PM, Roc Wang wrote:

> Hi,
>
>    I am working on a multi-level grid for the Poisson equation. I need to refine some sub-regions of the computational domain. To do this, I plan to build some boxes (patches) based on the coarsest level. I am using DM to manage the data. I found there is a new function DMPatchCreate() in version 3.4. Is this function the right one I should use for the refined region? If it is not, which ones should I use?

That is an experiment and does not work.

>    My proposed approach is to start with the code dm/impls/patch/examples/tests/ex1.c and then follow the code /dm/examples/tutorials/ex65dm.c. Is this approach the right way to my goal?
>
>    In addition, I need to use not only the nodes but also the cells containing the nodes. Should I use DMMesh to create the cells? I noticed DMMesh is mainly for unstructured grids, but I didn't find any other class that implements structured cells. Can anybody give me some suggestions on multi-level grids, or let me know which examples I should start with? Thanks.

No, that is not appropriate.

It sounds like you want structured AMR. PETSc does not do this, and there are packages that do it:

  a) Chombo

  b) SAMRAI

which are both patch-based AMR. If you want octree-style AMR you could use p4est, but it would mean a lot of coding along the lines of http://arxiv.org/abs/1308.1472, or Deal.II which is a complete package. I think Deal is the closest to using PETSc solvers.

   Thanks,

      Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

From pengxwang at hotmail.com Thu Aug 8 14:32:49 2013
From: pengxwang at hotmail.com (Roc Wang)
Date: Thu, 8 Aug 2013 14:32:49 -0500
Subject: [petsc-users] implementation of multi-level grid in petsc
In-Reply-To: References: Message-ID:

Thanks Matt,

   I tried Chombo for implementing AMR but have not tried SAMRAI yet. Chombo can do AMR, but its data structures seem quite complicated to customize. What I want to do with PETSc is to compose a simple, "home-made" blocked multi-level grid, even though it is not automatically adaptive. However, I don't have much experience with PETSc yet. As of now, I plan to use DM to manage the data for the big domain and all the small sub-domains. I am not sure whether that is a good idea, so any suggestions are appreciated very much. Thanks again.

Best,

Date: Thu, 8 Aug 2013 14:03:53 -0500
Subject: Re: [petsc-users] implementation of multi-level grid in petsc
From: knepley at gmail.com
To: pengxwang at hotmail.com
CC: petsc-users at mcs.anl.gov

On Thu, Aug 8, 2013 at 1:29 PM, Roc Wang wrote:

Hi,

   I am working on a multi-level grid for the Poisson equation. I need to refine some sub-regions of the computational domain. To do this, I plan to build some boxes (patches) based on the coarsest level. I am using DM to manage the data. I found there is a new function DMPatchCreate() in version 3.4. Is this function the right one I should use for the refined region? If it is not, which ones should I use?

That is an experiment and does not work.
   My proposed approach is to start with the code dm/impls/patch/examples/tests/ex1.c and then follow the code /dm/examples/tutorials/ex65dm.c. Is this approach the right way to my goal?

   In addition, I need to use not only the nodes but also the cells containing the nodes. Should I use DMMesh to create the cells? I noticed DMMesh is mainly for unstructured grids, but I didn't find any other class that implements structured cells. Can anybody give me some suggestions on multi-level grids, or let me know which examples I should start with? Thanks.

No, that is not appropriate.

It sounds like you want structured AMR. PETSc does not do this, and there are packages that do it:

  a) Chombo

  b) SAMRAI

which are both patch-based AMR. If you want octree-style AMR you could use p4est, but it would mean a lot of coding along the lines of http://arxiv.org/abs/1308.1472, or Deal.II which is a complete package. I think Deal is the closest to using PETSc solvers.

   Thanks,

      Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

From mirzadeh at gmail.com Thu Aug 8 14:56:46 2013
From: mirzadeh at gmail.com (Mohammad Mirzadeh)
Date: Thu, 8 Aug 2013 12:56:46 -0700
Subject: [petsc-users] VecGhost memory layout
Message-ID:

Hi guys,

I'm running into a bug that has made me question my understanding of the memory layout in VecGhost.

First, I remember reading somewhere (in the manual or on the mailing list; I cannot find it now) that they are internally organized as all local values followed by all ghost values. In other words, ghost values are ALWAYS appended to the end of the array. Is this correct?

Second, when I want to access both local and ghosted values, I do the following,

VecGhostGetLocalForm(F, &F_loc);
VecGetArray(F_loc, &F_loc_ptr);
// do computation on F_loc_ptr
VecRestoreArray(F_loc, &F_loc_ptr);
VecGhostRestoreLocalForm(F, &F_loc);

here I assume that in accessing F_loc_ptr, all indices in [0, numLocal) are local values and the rest are ghost values. Once I'm done, I need every processor to update its ghost region, and I call

VecGhostUpdateBegin(F, INSERT_VALUES, SCATTER_FORWARD);
VecGhostUpdateEnd(F, INSERT_VALUES, SCATTER_FORWARD);

Is there any flaw in what I'm doing? Also, as a side question, if I call VecGetArray directly on F (and not F_loc) do I get junk values?

Thanks,
M

From mfadams at lbl.gov Thu Aug 8 15:28:43 2013
From: mfadams at lbl.gov (Mark F. Adams)
Date: Thu, 8 Aug 2013 16:28:43 -0400
Subject: [petsc-users] implementation of multi-level grid in petsc
In-Reply-To: References: Message-ID:

On Aug 8, 2013, at 3:32 PM, Roc Wang wrote:

> Thanks Matt,
>
>    I tried Chombo for implementing AMR but have not tried SAMRAI yet. Chombo can do AMR, but its data structures seem quite complicated to customize. What I want to do with PETSc is to compose a simple, "home-made" blocked multi-level grid, even though it is not automatically adaptive. However, I don't have much experience with PETSc yet. As of now, I plan to use DM to manage the data for the big domain and all the small sub-domains. I am not sure whether that is a good idea, so any suggestions are appreciated very much. Thanks again.

As Matt said, this is not what you want to do, most likely. Building AMR on DM/DA is a lot of work unless you have a simple application and have a clear idea of how to do it.
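For the VecGhost question above, a self-contained sketch of the same pattern from creation through the ghost update (the local size and ghost indices are made up purely for illustration):

   Vec            F, F_loc;
   PetscScalar    *f;
   PetscInt       i, nlocal = 2, nghost = 1, ghosts[1] = {0};    /* made-up layout */
   PetscErrorCode ierr;

   ierr = VecCreateGhost(PETSC_COMM_WORLD,nlocal,PETSC_DECIDE,nghost,ghosts,&F);CHKERRQ(ierr);
   ierr = VecGhostGetLocalForm(F,&F_loc);CHKERRQ(ierr);
   ierr = VecGetArray(F_loc,&f);CHKERRQ(ierr);
   for (i=0; i<nlocal; i++) f[i] = 42.0;                         /* [0,nlocal) are owned; ghosts follow */
   ierr = VecRestoreArray(F_loc,&f);CHKERRQ(ierr);
   ierr = VecGhostRestoreLocalForm(F,&F_loc);CHKERRQ(ierr);
   ierr = VecGhostUpdateBegin(F,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr); /* owners -> ghosts */
   ierr = VecGhostUpdateEnd(F,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
   ierr = VecDestroy(&F);CHKERRQ(ierr);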
Chombo is flexible but it is complex and takes time to get started. I'm not familiar with SAMRAI but I would guess it is like Chombo. Deal.II might be worth looking into; I'm not familiar with it.

> Best,
>
> Date: Thu, 8 Aug 2013 14:03:53 -0500
> Subject: Re: [petsc-users] implementation of multi-level grid in petsc
> From: knepley at gmail.com
> To: pengxwang at hotmail.com
> CC: petsc-users at mcs.anl.gov
>
> On Thu, Aug 8, 2013 at 1:29 PM, Roc Wang wrote:
> Hi,
>
>    I am working on a multi-level grid for the Poisson equation. I need to refine some sub-regions of the computational domain. To do this, I plan to build some boxes (patches) based on the coarsest level. I am using DM to manage the data. I found there is a new function DMPatchCreate() in version 3.4. Is this function the right one I should use for the refined region? If it is not, which ones should I use?
>
> That is an experiment and does not work.
>
>    My proposed approach is to start with the code dm/impls/patch/examples/tests/ex1.c and then follow the code /dm/examples/tutorials/ex65dm.c. Is this approach the right way to my goal?
>
>    In addition, I need to use not only the nodes but also the cells containing the nodes. Should I use DMMesh to create the cells? I noticed DMMesh is mainly for unstructured grids, but I didn't find any other class that implements structured cells. Can anybody give me some suggestions on multi-level grids, or let me know which examples I should start with? Thanks.
>
> No, that is not appropriate.
>
> It sounds like you want structured AMR. PETSc does not do this, and there are packages that do it:
>
> a) Chombo
>
> b) SAMRAI
>
> which are both patch-based AMR. If you want octree-style AMR you could use p4est, but it would mean a lot of coding along the lines of http://arxiv.org/abs/1308.1472, or Deal.II which is a complete package. I think Deal is the closest to using PETSc solvers.
>
> Thanks,
>
> Matt
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener

From bsmith at mcs.anl.gov Thu Aug 8 15:42:32 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 8 Aug 2013 15:42:32 -0500
Subject: [petsc-users] VecGhost memory layout
In-Reply-To: References: Message-ID:

On Aug 8, 2013, at 2:56 PM, Mohammad Mirzadeh wrote:

> Hi guys,
>
> I'm running into a bug that has made me question my understanding of the memory layout in VecGhost.
>
> First, I remember reading somewhere (in the manual or on the mailing list; I cannot find it now) that they are internally organized as all local values followed by all ghost values. In other words, ghost values are ALWAYS appended to the end of the array. Is this correct?

   Yes

> Second, when I want to access both local and ghosted values, I do the following,
>
> VecGhostGetLocalForm(F, &F_loc);
> VecGetArray(F_loc, &F_loc_ptr);
> // do computation on F_loc_ptr
> VecRestoreArray(F_loc, &F_loc_ptr);
> VecGhostRestoreLocalForm(F, &F_loc);
>
> here I assume that in accessing F_loc_ptr, all indices in [0, numLocal) are local values and the rest are ghost values. Once I'm done, I need every processor to update its ghost region, and I call

   Good

> VecGhostUpdateBegin(F, INSERT_VALUES, SCATTER_FORWARD);
> VecGhostUpdateEnd(F, INSERT_VALUES, SCATTER_FORWARD);
>
> Is there any flaw in what I'm doing?
No > Also, as a side question, if I > call VecGetArray directly on F (and not F_loc) do I get junk values? No, they actually share the same array so you will get the same values. Barry > > Thanks, > M From mirzadeh at gmail.com Thu Aug 8 15:44:25 2013 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 8 Aug 2013 13:44:25 -0700 Subject: [petsc-users] implementation of multi-level grid in petsc In-Reply-To: References: Message-ID: How big of an application are you looking into? If you are thinking in the range of couple of 10M grid points on couple of hundred processors, then I'd say the simplest approach is to create grid in serial and then use PETSc's interface to ParMetis to handle partitioning. I did this with my quadtree code and could easily scale quadtrees on the order of 16.5M grid points upto 75% on 256 processors for a Poisson equation test. If you are thinking way larger problem (think couple of 100M grids and order several thousands processors), I could recommend p4est if you want to do tree-based grids. In that case using deal.II interface will be really beneficial as p4est alone is really a bare bone package. I do not have enough experience with block-structured AMR so I cannot comment on that. On Thu, Aug 8, 2013 at 1:28 PM, Mark F. Adams wrote: > > On Aug 8, 2013, at 3:32 PM, Roc Wang wrote: > > Thanks Mat, > > I tried Chombo for implementing AMR but not tried SAMRAI yet. Chombo can > do AMR, but it seems the data structure is quite complicated for customizing > usage. What I want to do with petsc is to compose a simple "home-made" like > blocked multi-level grid, though it is not automatically adaptive. However, > I don't have too much experiences on petsc. As of now, I suppose to use DM > to manage the data for the big domain and all small sub-domains. I am not > sure whether it is a good idea. So, any suggestions are appreciated very > much. Thanks again. > > > As Matt said, this is not what you want to do, most likely. Building AMR on > DM/DA is a lot of work unless you have a simple application and have a clear > idea of how to do it. Chombo is flexible but it is complex and takes time > to get started. I'm not familiar wit SAMARI but I would guess it is like > Chombo. Deall.II might be worth looking into. I'm not familiar. > > Best, > > > > > ________________________________ > Date: Thu, 8 Aug 2013 14:03:53 -0500 > Subject: Re: [petsc-users] implementation of multi-level grid in petsc > From: knepley at gmail.com > To: pengxwang at hotmail.com > CC: petsc-users at mcs.anl.gov > > On Thu, Aug 8, 2013 at 1:29 PM, Roc Wang wrote: > > Hi, > > I am working on multi-level grid for Poisson equation. I need to refine > some sub-region in the computational domain. To this, I plan to build some > boxes (patches) based on the coarsest level. I am using DM to manage the > data. I found there is a new function DMPatachCreate() in the version 3.4. > Is this function the right one I should use for the refined region? If it > is not, which ones I should use? > > > That is an experiment and does not work. > > > My proposed approach is to start with code > dm/impls/patch/examples/tests/ex1.c. And then follow the code > /dm/examples/tutorials/ex65dm.c. Is this approach the right way to my goal? > > In addition, I need to use not only the nodes but also the cells > including nodes. Should I use DMMesh to create the cells? I noticed DMMesh > is mainly for unstructured grid, but I didn't find other class that > implements structured cells. 
Can anybody give me some suggestions on > multi-level grid or let me know which examples I should start with? Thanks. > > > No, that is not appropriate. > > It sounds like you want structured AMR. PETSc does not do this, and there > are packages that do it.: > > a) Chombo > > b) SAMRAI > > which are both patch-based AMR. If you want octree-style AMR you could use > p4est, but it would mean > a lot of coding along the lines of http://arxiv.org/abs/1308.1472, or > Deal.II which is a complete package. > I think Deal is the closest to using PETSc solvers. > > Thanks, > > Matt > > -- > What most experimenters take for granted before they begin their experiments > is infinitely more interesting than any results to which their experiments > lead. > -- Norbert Wiener > > From mirzadeh at gmail.com Thu Aug 8 15:50:16 2013 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Thu, 8 Aug 2013 13:50:16 -0700 Subject: [petsc-users] VecGhost memory layout In-Reply-To: References: Message-ID: Awesome. Thanks Barry for the quick response. On Thu, Aug 8, 2013 at 1:42 PM, Barry Smith wrote: > > On Aug 8, 2013, at 2:56 PM, Mohammad Mirzadeh wrote: > >> Hi guys, >> >> I'm running into a bug that has made me question my understanding of >> memory layout in VecGhost. >> >> First, I remember reading somewhere before (in the manual or mailing >> list which I cannot find now) that the way they are internally >> organized is all local values followed by all ghost values. In other >> words ghost values are ALWAYS padded to the end of the array. Is this >> correct? >> > > Yes > >> Second, when I want to access both local and ghosted values, I do the following, >> >> VecGhostGetLocalForm(F, &F_loc); >> VecGetArray(F_loc, &F_loc_ptr); >> // do computation on F_loc_ptr >> VecRestoreArray(F_loc, &F_loc_ptr); >> VecGhostRestoreLocalForm(F, &F_loc); >> >> here I assume that in accessing F_loc_ptr, all indecies from [0, >> numLocal) are local values and the rest are ghost values. Once I'm >> done, I need every processor to update its ghost region and I call >> > Good > >> VecGhostUpdateBegin(F, INSERT_VALUES, SCATTER_FORWARD); >> VecGhostUpdateEnd(F, INSERT_VALUES, SCATTER_FORWARD); >> >> Is there any flaw in what I'm doing? > > No > >> Also, as a side question, if I >> call VecGetArray directly on F (and not F_loc) do I get junk values? > > No, they actually share the same array so you will get the same values. > > Barry > >> >> Thanks, >> M > From zhenglun.wei at gmail.com Thu Aug 8 18:16:26 2013 From: zhenglun.wei at gmail.com (Alan) Date: Thu, 08 Aug 2013 18:16:26 -0500 Subject: [petsc-users] KSP solver for single process In-Reply-To: <0317F74B-3790-47EA-85C0-A1F891ED782D@mcs.anl.gov> References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov> <520154F3.2010203@gmail.com> <4917D537-34AC-48B2-BDF5-CDEB79165DA1@mcs.anl.gov> <5202843B.5050900@gmail.com> <0317F74B-3790-47EA-85C0-A1F891ED782D@mcs.anl.gov> Message-ID: <520426CA.2000307@gmail.com> Dear Dr. Smith, I sincerely appreciate your valuable answers. My KSP Poisson solver has been significantly speed up with your help. Here, I wonder what should I do extra to employ geometric MG for non-uniform Cartesian mesh. I suppose the DMDA won't automatically generate the coarse grid for non-uniform Cartesian mesh. have a great evening, :) Alan > On Aug 7, 2013, at 12:30 PM, Alan wrote: > >> Hi Dr. Smith, >> Thank you so much for your reply. 
I tried to use the geometric multigrid to speed up the KSP solver with setup: >> mpiexec -np 1 ./ex29 -ksp_type cg -pc_type mg -da_refine 2 -ksp_rtol 1.0e-7 >> It do have almost the same computational rate compared with mudpack right now. Whereas, I have few questions here: >> 1. After KSPCreate(), I used a DMDACreate2d() and DMDASetUniformCoordinates() to build a uniform Cartesian mesh from PETSc. If I input imax and jmax to DMDACreate2d() as the global dimension of the grid, the real number of grid of x- and y-direction are imax and jmax, respectively, for the code with PC = GAMG; while they are (imax-1)*4 and (jmax-1)*4, respectively, for the code with PC = MG with -da_refine = 2. Is this normal? Does this indicate that the imax, jmax I inputed for the code with PC = MG is the global dimension for the coarsest level in the multi-grid? > The -da_refine n causes the DA you provided to be refined n times giving you the final grid. Each refinement is a factor of 2 in each direction so yes the fine grid would be (imax-1)*4 and (jmax-1)*4,. > > You do not need to use -da_refine n you can just set the imax and jmax and use -pc_mg_levels p where p is the number of multigrid levels you wish to use. Note that imax and jmax must be large enough to be coarsened p times and must have appropriate integer values that can be coarsened. >> 2, Is there any command of PETSc that I can used in my code to detect what is the type of my preconditioner? > PCGetType() >> 3, Is there any command of PETSc that I can used to know what is the value of -da_refine if the MG is used? > PCGetMGLevels() tells you how many levels of multigrid it is using. > > -ksp_view shows full details on the solver being used. > >> 4, What is 'PCMGType'? Should I just keep it as default? In the original makefile for /src/ksp/ksp/tutorial/example/ex29.c, pc_mg_type was 'full'. I tried it; it is slightly slower than the default setting. > You can use whatever is faster. > >> 5, What other settings I can play with to further speed up the computational rate? > There are many options for multigrid. In particular how many levels you use and what smoother you use on each level. What results in the fastest solver depends on the machine and exact problem you are solving and how many processes you are using. The defaults in PETSc 3.4 should be reasonably good. > > Barry > >> thanks, >> Alan >> >>> Alan, >>> >>> If you can use MUDPACK then you can also use PETSc's geometric multigrid, both sequential and parallel and its performance should be fairly close to mudpack on one process. >>> >>> Barry >>> >>> On Aug 6, 2013, at 2:56 PM, Alan wrote: >>> >>>> Thanks for replies. Here I attached the log_summary for the large and small problems. The DoFs for the large problem is 4 times of that for the small problem. Few observations are listed here: >>>> 1, the total number of iterations does not change much from the small problem to the large one; >>>> 2, the time elapsed for KSPSolve() for the large problem is less than 4 times of that for the small problem; >>>> 3, the time elapsed for PCSet() for the large problem is more than 10 times of that for the small problem; >>>> 4, the time elapsed for PCGAMGProl_AGG for the large problem is more than 20 times of that for the small problem; >>>> In my code, I have solved the Poisson equation for twice with difference RHS; however, the observation above is almost consistent for these two times. 
>>>> Do these observation indicate that I should switch my PC from GAMG to MG for solving Poisson equation in a single process? >>>> >>>> best, >>>> Alan >>>> >>>>> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp wrote: >>>>> Hi Alan, >>>>> >>>>> please use -log_summary to get profiling information on the run. What is >>>>> the bottleneck? Is it the number of solver iterations increasing >>>>> significantly? If so, consider changing the preconditioner options (more >>>>> levels!). I don't expect a direct solver to be any faster in the 180k >>>>> case for a Poisson problem. >>>>> >>>>> Mudpack is geometric multigrid: http://www2.cisl.ucar.edu/resources/legacy/mudpack >>>>> This should be faster. >>>>> >>>>> Matt >>>>> Best regards, >>>>> Karli >>>>> >>>>> >>>>> On 08/06/2013 02:22 PM, Alan wrote: >>>>>> Dear all, >>>>>> I hope you're having a nice day. >>>>>> I have a quick question on solving Poisson equation with KSP solvers >>>>>> (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this solver with: >>>>>> -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it >>>>>> 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7 >>>>>> It performs very well in parallel computation and scalability is fine. >>>>>> However, if I run it with a single process, the KSP solver is much >>>>>> slower than direct ones, i.e. Mudpack. Briefly, the speed difference >>>>>> between the KSP solver and the direct solver is negligible on dealing >>>>>> with small problems (i.e.36k DoFs ) but becomes very huge for moderate >>>>>> large problems (i.e. 180k DoFs). Although the direct solver inherently >>>>>> has better performance for moderate large problems in the single >>>>>> process, I wonder if any setup or approach can improve the performance >>>>>> of this KSP Poisson solver with the single process? or even make it >>>>>> obtain competitive speed (a little bit slower is fine) against direct >>>>>> solvers. >>>>>> >>>>>> thanks in advance, >>>>>> Alan >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >>>>> -- Norbert Wiener >>>> From bsmith at mcs.anl.gov Thu Aug 8 18:22:52 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 8 Aug 2013 18:22:52 -0500 Subject: [petsc-users] KSP solver for single process In-Reply-To: <520426CA.2000307@gmail.com> References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov> <520154F3.2010203@gmail.com> <4917D537-34AC-48B2-BDF5-CDEB79165DA1@mcs.anl.gov> <5202843B.5050900@gmail.com> <0317F74B-3790-47EA-85C0-A1F891ED782D@mcs.anl.gov> <520426CA.2000307@gmail.com> Message-ID: <418ADD63-7C8C-444E-97C7-71ADDEF1F102@mcs.anl.gov> On Aug 8, 2013, at 6:16 PM, Alan wrote: > Dear Dr. Smith, > I sincerely appreciate your valuable answers. My KSP Poisson solver has been significantly speed up with your help. Here, I wonder what should I do extra to employ geometric MG for non-uniform Cartesian mesh. I suppose the DMDA won't automatically generate the coarse grid for non-uniform Cartesian mesh. If the mesh is not too non-uniform then it should "just work". Barry > > have a great evening, :) > Alan > >> On Aug 7, 2013, at 12:30 PM, Alan wrote: >> >>> Hi Dr. Smith, >>> Thank you so much for your reply. 
I tried to use the geometric multigrid to speed up the KSP solver with setup: >>> mpiexec -np 1 ./ex29 -ksp_type cg -pc_type mg -da_refine 2 -ksp_rtol 1.0e-7 >>> It do have almost the same computational rate compared with mudpack right now. Whereas, I have few questions here: >>> 1. After KSPCreate(), I used a DMDACreate2d() and DMDASetUniformCoordinates() to build a uniform Cartesian mesh from PETSc. If I input imax and jmax to DMDACreate2d() as the global dimension of the grid, the real number of grid of x- and y-direction are imax and jmax, respectively, for the code with PC = GAMG; while they are (imax-1)*4 and (jmax-1)*4, respectively, for the code with PC = MG with -da_refine = 2. Is this normal? Does this indicate that the imax, jmax I inputed for the code with PC = MG is the global dimension for the coarsest level in the multi-grid? >> The -da_refine n causes the DA you provided to be refined n times giving you the final grid. Each refinement is a factor of 2 in each direction so yes the fine grid would be (imax-1)*4 and (jmax-1)*4,. >> >> You do not need to use -da_refine n you can just set the imax and jmax and use -pc_mg_levels p where p is the number of multigrid levels you wish to use. Note that imax and jmax must be large enough to be coarsened p times and must have appropriate integer values that can be coarsened. >>> 2, Is there any command of PETSc that I can used in my code to detect what is the type of my preconditioner? >> PCGetType() >>> 3, Is there any command of PETSc that I can used to know what is the value of -da_refine if the MG is used? >> PCGetMGLevels() tells you how many levels of multigrid it is using. >> >> -ksp_view shows full details on the solver being used. >> >>> 4, What is 'PCMGType'? Should I just keep it as default? In the original makefile for /src/ksp/ksp/tutorial/example/ex29.c, pc_mg_type was 'full'. I tried it; it is slightly slower than the default setting. >> You can use whatever is faster. >> >>> 5, What other settings I can play with to further speed up the computational rate? >> There are many options for multigrid. In particular how many levels you use and what smoother you use on each level. What results in the fastest solver depends on the machine and exact problem you are solving and how many processes you are using. The defaults in PETSc 3.4 should be reasonably good. >> >> Barry >> >>> thanks, >>> Alan >>> >>>> Alan, >>>> >>>> If you can use MUDPACK then you can also use PETSc's geometric multigrid, both sequential and parallel and its performance should be fairly close to mudpack on one process. >>>> >>>> Barry >>>> >>>> On Aug 6, 2013, at 2:56 PM, Alan wrote: >>>> >>>>> Thanks for replies. Here I attached the log_summary for the large and small problems. The DoFs for the large problem is 4 times of that for the small problem. Few observations are listed here: >>>>> 1, the total number of iterations does not change much from the small problem to the large one; >>>>> 2, the time elapsed for KSPSolve() for the large problem is less than 4 times of that for the small problem; >>>>> 3, the time elapsed for PCSet() for the large problem is more than 10 times of that for the small problem; >>>>> 4, the time elapsed for PCGAMGProl_AGG for the large problem is more than 20 times of that for the small problem; >>>>> In my code, I have solved the Poisson equation for twice with difference RHS; however, the observation above is almost consistent for these two times. 
>>>>> Do these observation indicate that I should switch my PC from GAMG to MG for solving Poisson equation in a single process? >>>>> >>>>> best, >>>>> Alan >>>>> >>>>>> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp wrote: >>>>>> Hi Alan, >>>>>> >>>>>> please use -log_summary to get profiling information on the run. What is >>>>>> the bottleneck? Is it the number of solver iterations increasing >>>>>> significantly? If so, consider changing the preconditioner options (more >>>>>> levels!). I don't expect a direct solver to be any faster in the 180k >>>>>> case for a Poisson problem. >>>>>> >>>>>> Mudpack is geometric multigrid: http://www2.cisl.ucar.edu/resources/legacy/mudpack >>>>>> This should be faster. >>>>>> >>>>>> Matt >>>>>> Best regards, >>>>>> Karli >>>>>> >>>>>> >>>>>> On 08/06/2013 02:22 PM, Alan wrote: >>>>>>> Dear all, >>>>>>> I hope you're having a nice day. >>>>>>> I have a quick question on solving Poisson equation with KSP solvers >>>>>>> (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this solver with: >>>>>>> -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it >>>>>>> 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7 >>>>>>> It performs very well in parallel computation and scalability is fine. >>>>>>> However, if I run it with a single process, the KSP solver is much >>>>>>> slower than direct ones, i.e. Mudpack. Briefly, the speed difference >>>>>>> between the KSP solver and the direct solver is negligible on dealing >>>>>>> with small problems (i.e.36k DoFs ) but becomes very huge for moderate >>>>>>> large problems (i.e. 180k DoFs). Although the direct solver inherently >>>>>>> has better performance for moderate large problems in the single >>>>>>> process, I wonder if any setup or approach can improve the performance >>>>>>> of this KSP Poisson solver with the single process? or even make it >>>>>>> obtain competitive speed (a little bit slower is fine) against direct >>>>>>> solvers. >>>>>>> >>>>>>> thanks in advance, >>>>>>> Alan >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >>>>>> -- Norbert Wiener >>>>> > From zhenglun.wei at gmail.com Fri Aug 9 09:27:31 2013 From: zhenglun.wei at gmail.com (Alan) Date: Fri, 09 Aug 2013 09:27:31 -0500 Subject: [petsc-users] KSP solver for single process In-Reply-To: <418ADD63-7C8C-444E-97C7-71ADDEF1F102@mcs.anl.gov> References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov> <520154F3.2010203@gmail.com> <4917D537-34AC-48B2-BDF5-CDEB79165DA1@mcs.anl.gov> <5202843B.5050900@gmail.com> <0317F74B-3790-47EA-85C0-A1F891ED782D@mcs.anl.gov> <520426CA.2000307@gmail.com> <418ADD63-7C8C-444E-97C7-71ADDEF1F102@mcs.anl.gov> Message-ID: <5204FC53.2020601@gmail.com> Morning, Dr. Smith, I attached a picture shows the coarse mesh of my case. Is this fine? thanks, Alan > On Aug 8, 2013, at 6:16 PM, Alan wrote: > >> Dear Dr. Smith, >> I sincerely appreciate your valuable answers. My KSP Poisson solver has been significantly speed up with your help. Here, I wonder what should I do extra to employ geometric MG for non-uniform Cartesian mesh. I suppose the DMDA won't automatically generate the coarse grid for non-uniform Cartesian mesh. > If the mesh is not too non-uniform then it should "just work". > > Barry > >> have a great evening, :) >> Alan >> >>> On Aug 7, 2013, at 12:30 PM, Alan wrote: >>> >>>> Hi Dr. Smith, >>>> Thank you so much for your reply. 
I tried to use the geometric multigrid to speed up the KSP solver with setup: >>>> mpiexec -np 1 ./ex29 -ksp_type cg -pc_type mg -da_refine 2 -ksp_rtol 1.0e-7 >>>> It do have almost the same computational rate compared with mudpack right now. Whereas, I have few questions here: >>>> 1. After KSPCreate(), I used a DMDACreate2d() and DMDASetUniformCoordinates() to build a uniform Cartesian mesh from PETSc. If I input imax and jmax to DMDACreate2d() as the global dimension of the grid, the real number of grid of x- and y-direction are imax and jmax, respectively, for the code with PC = GAMG; while they are (imax-1)*4 and (jmax-1)*4, respectively, for the code with PC = MG with -da_refine = 2. Is this normal? Does this indicate that the imax, jmax I inputed for the code with PC = MG is the global dimension for the coarsest level in the multi-grid? >>> The -da_refine n causes the DA you provided to be refined n times giving you the final grid. Each refinement is a factor of 2 in each direction so yes the fine grid would be (imax-1)*4 and (jmax-1)*4,. >>> >>> You do not need to use -da_refine n you can just set the imax and jmax and use -pc_mg_levels p where p is the number of multigrid levels you wish to use. Note that imax and jmax must be large enough to be coarsened p times and must have appropriate integer values that can be coarsened. >>>> 2, Is there any command of PETSc that I can used in my code to detect what is the type of my preconditioner? >>> PCGetType() >>>> 3, Is there any command of PETSc that I can used to know what is the value of -da_refine if the MG is used? >>> PCGetMGLevels() tells you how many levels of multigrid it is using. >>> >>> -ksp_view shows full details on the solver being used. >>> >>>> 4, What is 'PCMGType'? Should I just keep it as default? In the original makefile for /src/ksp/ksp/tutorial/example/ex29.c, pc_mg_type was 'full'. I tried it; it is slightly slower than the default setting. >>> You can use whatever is faster. >>> >>>> 5, What other settings I can play with to further speed up the computational rate? >>> There are many options for multigrid. In particular how many levels you use and what smoother you use on each level. What results in the fastest solver depends on the machine and exact problem you are solving and how many processes you are using. The defaults in PETSc 3.4 should be reasonably good. >>> >>> Barry >>> >>>> thanks, >>>> Alan >>>> >>>>> Alan, >>>>> >>>>> If you can use MUDPACK then you can also use PETSc's geometric multigrid, both sequential and parallel and its performance should be fairly close to mudpack on one process. >>>>> >>>>> Barry >>>>> >>>>> On Aug 6, 2013, at 2:56 PM, Alan wrote: >>>>> >>>>>> Thanks for replies. Here I attached the log_summary for the large and small problems. The DoFs for the large problem is 4 times of that for the small problem. Few observations are listed here: >>>>>> 1, the total number of iterations does not change much from the small problem to the large one; >>>>>> 2, the time elapsed for KSPSolve() for the large problem is less than 4 times of that for the small problem; >>>>>> 3, the time elapsed for PCSet() for the large problem is more than 10 times of that for the small problem; >>>>>> 4, the time elapsed for PCGAMGProl_AGG for the large problem is more than 20 times of that for the small problem; >>>>>> In my code, I have solved the Poisson equation for twice with difference RHS; however, the observation above is almost consistent for these two times. 
>>>>>> Do these observation indicate that I should switch my PC from GAMG to MG for solving Poisson equation in a single process? >>>>>> >>>>>> best, >>>>>> Alan >>>>>> >>>>>>> On Tue, Aug 6, 2013 at 2:31 PM, Karl Rupp wrote: >>>>>>> Hi Alan, >>>>>>> >>>>>>> please use -log_summary to get profiling information on the run. What is >>>>>>> the bottleneck? Is it the number of solver iterations increasing >>>>>>> significantly? If so, consider changing the preconditioner options (more >>>>>>> levels!). I don't expect a direct solver to be any faster in the 180k >>>>>>> case for a Poisson problem. >>>>>>> >>>>>>> Mudpack is geometric multigrid: http://www2.cisl.ucar.edu/resources/legacy/mudpack >>>>>>> This should be faster. >>>>>>> >>>>>>> Matt >>>>>>> Best regards, >>>>>>> Karli >>>>>>> >>>>>>> >>>>>>> On 08/06/2013 02:22 PM, Alan wrote: >>>>>>>> Dear all, >>>>>>>> I hope you're having a nice day. >>>>>>>> I have a quick question on solving Poisson equation with KSP solvers >>>>>>>> (/src/ksp/ksp/example/tutorial/ex29.c). Currently, I run this solver with: >>>>>>>> -pc_type gamg -ksp_type cg -pc_gamg_agg_nsmooths 1 -mg_levels_ksp_max_it >>>>>>>> 1 -mg_levels_ksp_type richardson -ksp_rtol 1.0e-7 >>>>>>>> It performs very well in parallel computation and scalability is fine. >>>>>>>> However, if I run it with a single process, the KSP solver is much >>>>>>>> slower than direct ones, i.e. Mudpack. Briefly, the speed difference >>>>>>>> between the KSP solver and the direct solver is negligible on dealing >>>>>>>> with small problems (i.e.36k DoFs ) but becomes very huge for moderate >>>>>>>> large problems (i.e. 180k DoFs). Although the direct solver inherently >>>>>>>> has better performance for moderate large problems in the single >>>>>>>> process, I wonder if any setup or approach can improve the performance >>>>>>>> of this KSP Poisson solver with the single process? or even make it >>>>>>>> obtain competitive speed (a little bit slower is fine) against direct >>>>>>>> solvers. >>>>>>>> >>>>>>>> thanks in advance, >>>>>>>> Alan >>>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. >>>>>>> -- Norbert Wiener >>>>>> -------------- next part -------------- A non-text attachment was scrubbed... Name: Non-Uniform Cartesian Mesh.jpg Type: image/jpeg Size: 186752 bytes Desc: not available URL: From jedbrown at mcs.anl.gov Fri Aug 9 09:32:57 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 09 Aug 2013 09:32:57 -0500 Subject: [petsc-users] KSP solver for single process In-Reply-To: <5204FC53.2020601@gmail.com> References: <52014D03.6000302@gmail.com> <52014F1D.90400@mcs.anl.gov> <520154F3.2010203@gmail.com> <4917D537-34AC-48B2-BDF5-CDEB79165DA1@mcs.anl.gov> <5202843B.5050900@gmail.com> <0317F74B-3790-47EA-85C0-A1F891ED782D@mcs.anl.gov> <520426CA.2000307@gmail.com> <418ADD63-7C8C-444E-97C7-71ADDEF1F102@mcs.anl.gov> <5204FC53.2020601@gmail.com> Message-ID: <87vc3fvsrq.fsf@mcs.anl.gov> Alan writes: > Morning, Dr. Smith, > I attached a picture shows the coarse mesh of my case. Is this fine? That should be okay. -------------- next part -------------- A non-text attachment was scrubbed... 
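Rounding off the multigrid thread above, a minimal sketch of the DMDA-driven geometric multigrid setup being discussed, modelled on the ex29-style driver; computeRHS and computeMatrix stand for the application's own callbacks, and the 17x17 coarse grid is just an example that coarsens cleanly:

   KSP            ksp;
   DM             da;
   PetscErrorCode ierr;

   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
   ierr = DMDACreate2d(PETSC_COMM_WORLD,DMDA_BOUNDARY_NONE,DMDA_BOUNDARY_NONE,
                       DMDA_STENCIL_STAR,17,17,PETSC_DECIDE,PETSC_DECIDE,
                       1,1,NULL,NULL,&da);CHKERRQ(ierr);       /* coarse grid; -da_refine adds finer levels */
   ierr = KSPSetDM(ksp,da);CHKERRQ(ierr);
   ierr = KSPSetComputeRHS(ksp,computeRHS,NULL);CHKERRQ(ierr);
   ierr = KSPSetComputeOperators(ksp,computeMatrix,NULL);CHKERRQ(ierr);
   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);                /* run with -ksp_type cg -pc_type mg -da_refine 2 */
   ierr = KSPSolve(ksp,NULL,NULL);CHKERRQ(ierr);
   ierr = DMDestroy(&da);CHKERRQ(ierr);
   ierr = KSPDestroy(&ksp);CHKERRQ(ierr);

With coordinates set on the DMDA (stretched coordinates for the non-uniform case), the same setup applies; per the remark above, a mildly non-uniform Cartesian mesh should just work.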
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From bisheshkh at gmail.com Fri Aug 9 11:52:11 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Fri, 9 Aug 2013 18:52:11 +0200 Subject: [petsc-users] problem (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster Message-ID: Dear all, I was experimenting with my stokes problem in 3D staggered grid with high viscosity jump using -pc_fieldsplit of type schur complement. Using hypre pilut preconditioner for the ksp for A00 block seemed to be giving nice results for smaller size. Using the following options in my laptop, or in the cluster I'm using with ONE node multiple cores WORKS fine: -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut But when I try to submit jobs with multiple nodes, the process never seem to end! When using gamg instead of hypre, the same program works with multiple nodes in the same cluster. But gamg gave much slower convergence than the hypre, so I wanted to use the hypre. When I kill the job and look at the error file, the error it reports: [8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [8]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[8]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [8]PETSC ERROR: likely location of problem given in stack below [8]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [8]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [8]PETSC ERROR: INSTEAD the line number of the start of the function [8]PETSC ERROR: is given. [8]PETSC ERROR: [8] HYPRE_SetupXXX line 130 /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c [8]PETSC ERROR: [8] PCSetUp_HYPRE line 94 /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c [8]PETSC ERROR: [8] PCSetUp line 868 /tmp/petsc-3.4.1/src/ksp/pc/interface/precon.c [8]PETSC ERROR: [8] KSPSetUp line 192 /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c [8]PETSC ERROR: [8] KSPSolve line 356 /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c [8]PETSC ERROR: [8] MatMult_SchurComplement line 75 /tmp/petsc-3.4.1/src/ksp/ksp/utils/schurm.c [8]PETSC ERROR: [8] MatNullSpaceTest line 408 /tmp/petsc-3.4.1/src/mat/interface/matnull.c [8]PETSC ERROR: [8] solveModel line 133 "unknowndirectory/"/epi/asclepios2/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx [8]PETSC ERROR: --------------------- Error Message ------------------------------------ [8]PETSC ERROR: Signal received! [8]PETSC ERROR: ------------------------------------------------------------------------ [8]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013 [8]PETSC ERROR: See docs/changes/index.html for recent updates. [8]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [8]PETSC ERROR: See docs/index.html for manual pages. 
[8]PETSC ERROR: ------------------------------------------------------------------------ [8]PETSC ERROR: /epi/asclepios2/bkhanal/works/AdLemModel/build/src/AdLemMain on a arch-linux2-cxx-debug named nef002 by bkhanal Fri Aug 9 18:00:22 2013 [8]PETSC ERROR: Libraries linked from /home/bkhanal/petsc/lib [8]PETSC ERROR: Configure run at Mon Jul 1 13:44:30 2013 [8]PETSC ERROR: Configure options --with-mpi-dir=/opt/openmpi-gcc/current/ --with-shared-libraries --prefix=/home/bkhanal/petsc -download-f-blas-lapack=1 --download-hypre --with-clanguage=cxx [8]PETSC ERROR: ------------------------------------------------------------------------ [8]PETSC ERROR: User provided function() line 0 in unknown directory unknown file -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 9 11:58:44 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 9 Aug 2013 11:58:44 -0500 Subject: [petsc-users] problem (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster In-Reply-To: References: Message-ID: On Fri, Aug 9, 2013 at 11:52 AM, Bishesh Khanal wrote: > Dear all, > I was experimenting with my stokes problem in 3D staggered grid with high > viscosity jump using -pc_fieldsplit of type schur complement. Using hypre > pilut preconditioner for the ksp for A00 block seemed to be giving nice > results for smaller size. Using the following options in my laptop, or in > the cluster I'm using with ONE node multiple cores WORKS fine: > -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 > -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view > -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 > -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut > > But when I try to submit jobs with multiple nodes, the process never seem > to end! When using gamg instead of hypre, the same program works with > multiple nodes in the same cluster. > But gamg gave much slower convergence than the hypre, so I wanted to use > the hypre. > Did you give a near nullspace to GAMG (probably the 3 translational and 3 rotational modes for this problem)? Without these, convergence can be quite slow. > When I kill the job and look at the error file, the error it reports: > It looks like PILUT is just slow. Matt > [8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > [8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [8]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[8]PETSCERROR: or try > http://valgrind.org on GNU/linux and Apple Mac OS X to find memory > corruption errors > [8]PETSC ERROR: likely location of problem given in stack below > [8]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > [8]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > [8]PETSC ERROR: INSTEAD the line number of the start of the function > [8]PETSC ERROR: is given. 
> [8]PETSC ERROR: [8] HYPRE_SetupXXX line 130 > /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c > [8]PETSC ERROR: [8] PCSetUp_HYPRE line 94 > /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c > [8]PETSC ERROR: [8] PCSetUp line 868 > /tmp/petsc-3.4.1/src/ksp/pc/interface/precon.c > [8]PETSC ERROR: [8] KSPSetUp line 192 > /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c > [8]PETSC ERROR: [8] KSPSolve line 356 > /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c > [8]PETSC ERROR: [8] MatMult_SchurComplement line 75 > /tmp/petsc-3.4.1/src/ksp/ksp/utils/schurm.c > [8]PETSC ERROR: [8] MatNullSpaceTest line 408 > /tmp/petsc-3.4.1/src/mat/interface/matnull.c > [8]PETSC ERROR: [8] solveModel line 133 > "unknowndirectory/"/epi/asclepios2/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx > [8]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [8]PETSC ERROR: Signal received! > [8]PETSC ERROR: > ------------------------------------------------------------------------ > [8]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013 > [8]PETSC ERROR: See docs/changes/index.html for recent updates. > [8]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [8]PETSC ERROR: See docs/index.html for manual pages. > [8]PETSC ERROR: > ------------------------------------------------------------------------ > [8]PETSC ERROR: > /epi/asclepios2/bkhanal/works/AdLemModel/build/src/AdLemMain on a > arch-linux2-cxx-debug named nef002 by bkhanal Fri Aug 9 18:00:22 2013 > [8]PETSC ERROR: Libraries linked from /home/bkhanal/petsc/lib > [8]PETSC ERROR: Configure run at Mon Jul 1 13:44:30 2013 > [8]PETSC ERROR: Configure options --with-mpi-dir=/opt/openmpi-gcc/current/ > --with-shared-libraries --prefix=/home/bkhanal/petsc > -download-f-blas-lapack=1 --download-hypre --with-clanguage=cxx > [8]PETSC ERROR: > ------------------------------------------------------------------------ > [8]PETSC ERROR: User provided function() line 0 in unknown directory > unknown file > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bisheshkh at gmail.com Fri Aug 9 12:08:34 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Fri, 9 Aug 2013 19:08:34 +0200 Subject: [petsc-users] problem (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster In-Reply-To: References: Message-ID: On Fri, Aug 9, 2013 at 6:58 PM, Matthew Knepley wrote: > On Fri, Aug 9, 2013 at 11:52 AM, Bishesh Khanal wrote: > >> Dear all, >> I was experimenting with my stokes problem in 3D staggered grid with high >> viscosity jump using -pc_fieldsplit of type schur complement. Using hypre >> pilut preconditioner for the ksp for A00 block seemed to be giving nice >> results for smaller size. Using the following options in my laptop, or in >> the cluster I'm using with ONE node multiple cores WORKS fine: >> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >> -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 >> -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut >> >> But when I try to submit jobs with multiple nodes, the process never seem >> to end! 
When using gamg instead of hypre, the same program works with >> multiple nodes in the same cluster. >> But gamg gave much slower convergence than the hypre, so I wanted to use >> the hypre. >> > > Did you give a near nullspace to GAMG (probably the 3 translational and 3 > rotational modes for this problem)? Without these, > convergence can be quite slow. > I have enforced the zero velocity on the boundary, i.e. on all the faces of the cube, by changing the corresponding rows of the system matrix. With this I think the nullspace would just correspond to the constant pressure. For this I set the nullspace using MatNullSpace, for both the outer level ksp and ksp object for Schur complement. Using MatNullSpaceTest seemed to return good (true) value for both of these set nullspaces. Please correct me if I'm doing sth wrong or sth not preferred! > > >> When I kill the job and look at the error file, the error it reports: >> > > It looks like PILUT is just slow. > But using PILUT gives me results for smaller sizes with a single node. And the failure with multiple nodes is still for the same problem size!! So if it were slower it wouldn't probably have given results in my laptop too right ? > Matt > > >> [8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >> probably memory access out of range >> [8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger >> [8]PETSC ERROR: or see >> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[8]PETSCERROR: or try >> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >> corruption errors >> [8]PETSC ERROR: likely location of problem given in stack below >> [8]PETSC ERROR: --------------------- Stack Frames >> ------------------------------------ >> [8]PETSC ERROR: Note: The EXACT line numbers in the stack are not >> available, >> [8]PETSC ERROR: INSTEAD the line number of the start of the function >> [8]PETSC ERROR: is given. >> [8]PETSC ERROR: [8] HYPRE_SetupXXX line 130 >> /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c >> [8]PETSC ERROR: [8] PCSetUp_HYPRE line 94 >> /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c >> [8]PETSC ERROR: [8] PCSetUp line 868 >> /tmp/petsc-3.4.1/src/ksp/pc/interface/precon.c >> [8]PETSC ERROR: [8] KSPSetUp line 192 >> /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c >> [8]PETSC ERROR: [8] KSPSolve line 356 >> /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c >> [8]PETSC ERROR: [8] MatMult_SchurComplement line 75 >> /tmp/petsc-3.4.1/src/ksp/ksp/utils/schurm.c >> [8]PETSC ERROR: [8] MatNullSpaceTest line 408 >> /tmp/petsc-3.4.1/src/mat/interface/matnull.c >> [8]PETSC ERROR: [8] solveModel line 133 >> "unknowndirectory/"/epi/asclepios2/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx >> [8]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [8]PETSC ERROR: Signal received! >> [8]PETSC ERROR: >> ------------------------------------------------------------------------ >> [8]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013 >> [8]PETSC ERROR: See docs/changes/index.html for recent updates. >> [8]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [8]PETSC ERROR: See docs/index.html for manual pages. 
>> [8]PETSC ERROR: >> ------------------------------------------------------------------------ >> [8]PETSC ERROR: >> /epi/asclepios2/bkhanal/works/AdLemModel/build/src/AdLemMain on a >> arch-linux2-cxx-debug named nef002 by bkhanal Fri Aug 9 18:00:22 2013 >> [8]PETSC ERROR: Libraries linked from /home/bkhanal/petsc/lib >> [8]PETSC ERROR: Configure run at Mon Jul 1 13:44:30 2013 >> [8]PETSC ERROR: Configure options >> --with-mpi-dir=/opt/openmpi-gcc/current/ --with-shared-libraries >> --prefix=/home/bkhanal/petsc -download-f-blas-lapack=1 --download-hypre >> --with-clanguage=cxx >> [8]PETSC ERROR: >> ------------------------------------------------------------------------ >> [8]PETSC ERROR: User provided function() line 0 in unknown directory >> unknown file >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 9 12:21:05 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 9 Aug 2013 12:21:05 -0500 Subject: [petsc-users] problem (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster In-Reply-To: References: Message-ID: On Fri, Aug 9, 2013 at 12:08 PM, Bishesh Khanal wrote: > > > > On Fri, Aug 9, 2013 at 6:58 PM, Matthew Knepley wrote: > >> On Fri, Aug 9, 2013 at 11:52 AM, Bishesh Khanal wrote: >> >>> Dear all, >>> I was experimenting with my stokes problem in 3D staggered grid with >>> high viscosity jump using -pc_fieldsplit of type schur complement. Using >>> hypre pilut preconditioner for the ksp for A00 block seemed to be giving >>> nice results for smaller size. Using the following options in my laptop, or >>> in the cluster I'm using with ONE node multiple cores WORKS fine: >>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 >>> -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut >>> >>> But when I try to submit jobs with multiple nodes, the process never >>> seem to end! When using gamg instead of hypre, the same program works with >>> multiple nodes in the same cluster. >>> But gamg gave much slower convergence than the hypre, so I wanted to use >>> the hypre. >>> >> >> Did you give a near nullspace to GAMG (probably the 3 translational and 3 >> rotational modes for this problem)? Without these, >> convergence can be quite slow. >> > I have enforced the zero velocity on the boundary, i.e. on all the faces > of the cube, by changing the corresponding rows of the system matrix. With > this I think the nullspace would just correspond to the constant pressure. > For this I set the nullspace using MatNullSpace, for both the outer level > ksp and ksp object for Schur complement. Using MatNullSpaceTest seemed to > return good (true) value for both of these set nullspaces. > Please correct me if I'm doing sth wrong or sth not preferred! > You misunderstand the term "near nullspace". A nullspace is used to filter out modes for which A u = 0. A near nullspace is used to build a basis on the coarse grid for AMG. They are "near" nullspace modes since A u = \lambda u where \lambda << 1, which is what we expect for low modes on the coarse grid. 
Without guidance, GAMG will only use the constant function for this basis which results in slow convergence for more complex problems. > >> >>> When I kill the job and look at the error file, the error it reports: >>> >> >> It looks like PILUT is just slow. >> > > But using PILUT gives me results for smaller sizes with a single node. > And the failure with multiple nodes is still for the same problem size!! So > if it were slower it wouldn't probably have given results in my laptop too > right ? > I have no idea how slow their parallel implementation is. Matt > >> Matt >> >> >>> [8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>> probably memory access out of range >>> [8]PETSC ERROR: Try option -start_in_debugger or >>> -on_error_attach_debugger >>> [8]PETSC ERROR: or see >>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[8]PETSCERROR: or try >>> http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >>> corruption errors >>> [8]PETSC ERROR: likely location of problem given in stack below >>> [8]PETSC ERROR: --------------------- Stack Frames >>> ------------------------------------ >>> [8]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>> available, >>> [8]PETSC ERROR: INSTEAD the line number of the start of the >>> function >>> [8]PETSC ERROR: is given. >>> [8]PETSC ERROR: [8] HYPRE_SetupXXX line 130 >>> /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c >>> [8]PETSC ERROR: [8] PCSetUp_HYPRE line 94 >>> /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c >>> [8]PETSC ERROR: [8] PCSetUp line 868 >>> /tmp/petsc-3.4.1/src/ksp/pc/interface/precon.c >>> [8]PETSC ERROR: [8] KSPSetUp line 192 >>> /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c >>> [8]PETSC ERROR: [8] KSPSolve line 356 >>> /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c >>> [8]PETSC ERROR: [8] MatMult_SchurComplement line 75 >>> /tmp/petsc-3.4.1/src/ksp/ksp/utils/schurm.c >>> [8]PETSC ERROR: [8] MatNullSpaceTest line 408 >>> /tmp/petsc-3.4.1/src/mat/interface/matnull.c >>> [8]PETSC ERROR: [8] solveModel line 133 >>> "unknowndirectory/"/epi/asclepios2/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>> [8]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [8]PETSC ERROR: Signal received! >>> [8]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [8]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013 >>> [8]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [8]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [8]PETSC ERROR: See docs/index.html for manual pages. 
>>> [8]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [8]PETSC ERROR: >>> /epi/asclepios2/bkhanal/works/AdLemModel/build/src/AdLemMain on a >>> arch-linux2-cxx-debug named nef002 by bkhanal Fri Aug 9 18:00:22 2013 >>> [8]PETSC ERROR: Libraries linked from /home/bkhanal/petsc/lib >>> [8]PETSC ERROR: Configure run at Mon Jul 1 13:44:30 2013 >>> [8]PETSC ERROR: Configure options >>> --with-mpi-dir=/opt/openmpi-gcc/current/ --with-shared-libraries >>> --prefix=/home/bkhanal/petsc -download-f-blas-lapack=1 --download-hypre >>> --with-clanguage=cxx >>> [8]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [8]PETSC ERROR: User provided function() line 0 in unknown directory >>> unknown file >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Fri Aug 9 12:22:44 2013 From: mfadams at lbl.gov (Mark F. Adams) Date: Fri, 9 Aug 2013 13:22:44 -0400 Subject: [petsc-users] problem (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster In-Reply-To: References: Message-ID: On Aug 9, 2013, at 1:08 PM, Bishesh Khanal wrote: > > > > On Fri, Aug 9, 2013 at 6:58 PM, Matthew Knepley wrote: > On Fri, Aug 9, 2013 at 11:52 AM, Bishesh Khanal wrote: > Dear all, > I was experimenting with my stokes problem in 3D staggered grid with high viscosity jump using -pc_fieldsplit of type schur complement. Using hypre pilut preconditioner for the ksp for A00 block seemed to be giving nice results for smaller size. Using the following options in my laptop, or in the cluster I'm using with ONE node multiple cores WORKS fine: > -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view -fieldsplit_0_ksp_type gcr -fieldsplit_0_ksp_rtol 1.0e-5 -fieldsplit_0_pc_type hypre -fieldsplit_0_pc_hypre_type pilut > > But when I try to submit jobs with multiple nodes, the process never seem to end! When using gamg instead of hypre, the same program works with multiple nodes in the same cluster. > But gamg gave much slower convergence than the hypre, so I wanted to use the hypre. > > Did you give a near nullspace to GAMG (probably the 3 translational and 3 rotational modes for this problem)? Without these, > convergence can be quite slow. > I have enforced the zero velocity on the boundary, i.e. on all the faces of the cube, by changing the corresponding rows of the system matrix. With this I think the nullspace would just correspond to the constant pressure. We are talking about the 00 block solver (velocity) and there are 6 (near) null space vectors. GAMG does want to be told you have 3 dof/node. Should DM do this!!! If you tell GAMG there is only one (the (1,1,1) vector) then that will mess it up. If you set the blocks size on the matrix (to 3), which should be done by PETSc, and do not set a null space, GAMG will construct the default null space: three constant functions. This should work OK. 
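As a rough sketch of what supplying that information looks like (illustrative only, not code from this thread; A is assumed to be the assembled velocity/00-block matrix and coords a Vec holding the nodal coordinates):

  MatNullSpace nearnull;
  ierr = MatSetBlockSize(A, 3);CHKERRQ(ierr);                          /* 3 velocity dof per node */
  ierr = MatNullSpaceCreateRigidBody(coords, &nearnull);CHKERRQ(ierr); /* the 3 translations plus 3 rotations in 3D */
  ierr = MatSetNearNullSpace(A, nearnull);CHKERRQ(ierr);               /* GAMG forms its coarse basis from these */
  ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);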
As Matt pointed out you want to give it all six (the 3 constant vectors and 3 rotational null space vectors) to get an optimal solver. GAMG is an AMG method like hypre boomeramg. Boomeramg is very robust and I would recommend using it to start. GAMG might be better for vector values problems but if hypre is probably pretty good (for low order discretizations especially). Mark > For this I set the nullspace using MatNullSpace, for both the outer level ksp and ksp object for Schur complement. Using MatNullSpaceTest seemed to return good (true) value for both of these set nullspaces. > Please correct me if I'm doing sth wrong or sth not preferred! > > > When I kill the job and look at the error file, the error it reports: > > It looks like PILUT is just slow. > > But using PILUT gives me results for smaller sizes with a single node. And the failure with multiple nodes is still for the same problem size!! So if it were slower it wouldn't probably have given results in my laptop too right ? > > > Matt > > [8]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > [8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [8]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[8]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > [8]PETSC ERROR: likely location of problem given in stack below > [8]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > [8]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > [8]PETSC ERROR: INSTEAD the line number of the start of the function > [8]PETSC ERROR: is given. > [8]PETSC ERROR: [8] HYPRE_SetupXXX line 130 /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c > [8]PETSC ERROR: [8] PCSetUp_HYPRE line 94 /tmp/petsc-3.4.1/src/ksp/pc/impls/hypre/hypre.c > [8]PETSC ERROR: [8] PCSetUp line 868 /tmp/petsc-3.4.1/src/ksp/pc/interface/precon.c > [8]PETSC ERROR: [8] KSPSetUp line 192 /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c > [8]PETSC ERROR: [8] KSPSolve line 356 /tmp/petsc-3.4.1/src/ksp/ksp/interface/itfunc.c > [8]PETSC ERROR: [8] MatMult_SchurComplement line 75 /tmp/petsc-3.4.1/src/ksp/ksp/utils/schurm.c > [8]PETSC ERROR: [8] MatNullSpaceTest line 408 /tmp/petsc-3.4.1/src/mat/interface/matnull.c > [8]PETSC ERROR: [8] solveModel line 133 "unknowndirectory/"/epi/asclepios2/bkhanal/works/AdLemModel/src/PetscAdLemTaras3D.cxx > [8]PETSC ERROR: --------------------- Error Message ------------------------------------ > [8]PETSC ERROR: Signal received! > [8]PETSC ERROR: ------------------------------------------------------------------------ > [8]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013 > [8]PETSC ERROR: See docs/changes/index.html for recent updates. > [8]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [8]PETSC ERROR: See docs/index.html for manual pages. 
> [8]PETSC ERROR: ------------------------------------------------------------------------ > [8]PETSC ERROR: /epi/asclepios2/bkhanal/works/AdLemModel/build/src/AdLemMain on a arch-linux2-cxx-debug named nef002 by bkhanal Fri Aug 9 18:00:22 2013 > [8]PETSC ERROR: Libraries linked from /home/bkhanal/petsc/lib > [8]PETSC ERROR: Configure run at Mon Jul 1 13:44:30 2013 > [8]PETSC ERROR: Configure options --with-mpi-dir=/opt/openmpi-gcc/current/ --with-shared-libraries --prefix=/home/bkhanal/petsc -download-f-blas-lapack=1 --download-hypre --with-clanguage=cxx > [8]PETSC ERROR: ------------------------------------------------------------------------ > [8]PETSC ERROR: User provided function() line 0 in unknown directory unknown file > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From danyang.su at gmail.com Fri Aug 9 16:44:12 2013 From: danyang.su at gmail.com (Danyang Su) Date: Fri, 09 Aug 2013 14:44:12 -0700 Subject: [petsc-users] How to exclude -C compiler option (VecGetArray in Fortran) Message-ID: <520562AC.10708@gmail.com> Hi All, I get some problem in VecGetArray in Fortran, which I guess is due to the default -C compiler option. Current compiler option are: $ make ksp_inhm /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -MT -Z7 -fpp -I/cygdrive/c/cygwin/packages/petsc-3.4.2/in clude -I/cygdrive/c/cygwin/packages/petsc-3.4.2/arch-mswin-c-debug/include -I/cygdrive/c/cygwin/packages/parmetis-4.0.3/include -I/cygdrive/c/cygwin/packages/metis-5.1.0/include -I/cygdrive/c/Program\ Files/MPICH2/include -o ksp_inhm.o ksp_inhm.F90 /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -MT -wd4996 -Z7 -o ksp_inhm_d ksp_inhm.o -L/cygdrive/c/cygwin/p ackages/petsc-3.4.2/arch-mswin-c-debug/lib -lpetsc -lflapack -lfblas /cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmet is/Release/parmetis.lib /cygdrive/c/cygwin/packages/metis-5.1.0/build/libmetis/Release/metis.lib /cygdrive/c/Program\ Files/MPIC H2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Gdi32.lib Us er32.lib Advapi32.lib Kernel32.lib Ws2_32.lib Relevant codes are as follows: #define xx_a(ib) xx_v(xx_i + (ib)) PetscScalar :: xx_v(1) PetscOffset :: xx_i ... call VecGetArray(x, xx_v, xx_i, ierr) write(*, 90) istart+1, xx_a(istart+1), iend, xx_a(iend) !istart and iend are the ownership range 90 format ('x(', i6, ') = ',e11.4, ' x(', i6, ') = ',e11.4) call VecRestoreArray(x, xx_v, xx_i, ierr) The output (xx_a(istart+1), xx_a(iend)) is CORRECT for the processor 0, but INCORRECT for the other processors. I found in the manual that for the fortran user, the compiler option -C should not be used. How I can set this option? Thanks and regards, Danyang From balay at mcs.anl.gov Fri Aug 9 16:52:46 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 9 Aug 2013 16:52:46 -0500 (CDT) Subject: [petsc-users] How to exclude -C compiler option (VecGetArray in Fortran) In-Reply-To: <520562AC.10708@gmail.com> References: <520562AC.10708@gmail.com> Message-ID: On Fri, 9 Aug 2013, Danyang Su wrote: > Hi All, > > I get some problem in VecGetArray in Fortran, which I guess is due to the > default -C compiler option. 
> > Current compiler option are: > $ make ksp_inhm > /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -MT -Z7 > -fpp -I/cygdrive/c/cygwin/packages/petsc-3.4.2/in > clude -I/cygdrive/c/cygwin/packages/petsc-3.4.2/arch-mswin-c-debug/include > -I/cygdrive/c/cygwin/packages/parmetis-4.0.3/include > -I/cygdrive/c/cygwin/packages/metis-5.1.0/include -I/cygdrive/c/Program\ > Files/MPICH2/include -o ksp_inhm.o ksp_inhm.F90 > /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -MT -wd4996 -Z7 > -o ksp_inhm_d ksp_inhm.o -L/cygdrive/c/cygwin/p > ackages/petsc-3.4.2/arch-mswin-c-debug/lib -lpetsc -lflapack -lfblas > /cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmet > is/Release/parmetis.lib > /cygdrive/c/cygwin/packages/metis-5.1.0/build/libmetis/Release/metis.lib > /cygdrive/c/Program\ Files/MPIC > H2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib > /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Gdi32.lib Us > er32.lib Advapi32.lib Kernel32.lib Ws2_32.lib > > > Relevant codes are as follows: > > #define xx_a(ib) xx_v(xx_i + (ib)) > PetscScalar :: xx_v(1) > PetscOffset :: xx_i > ... > call VecGetArray(x, xx_v, xx_i, ierr) > write(*, 90) istart+1, xx_a(istart+1), iend, xx_a(iend) !istart and iend > are the ownership range > 90 format ('x(', i6, ') = ',e11.4, ' x(', i6, ') = ',e11.4) > call VecRestoreArray(x, xx_v, xx_i, ierr) > > The output (xx_a(istart+1), xx_a(iend)) is CORRECT for the processor 0, but > INCORRECT for the other processors. The code is buggy. You should use xx_a(1),xx_a(n) - where 'n' is the local size. You can use VecGetLocalSize() to get the local size. > > I found in the manual that for the fortran user, the compiler option -C should > not be used. How I can set this option? This issuse is not -C related. If it were - you would have to use VecGetArrayF90() - [which should be the prefered function anyway.] Satish > > Thanks and regards, > > Danyang > From danyang.su at gmail.com Fri Aug 9 17:02:18 2013 From: danyang.su at gmail.com (Danyang Su) Date: Fri, 09 Aug 2013 15:02:18 -0700 Subject: [petsc-users] How to exclude -C compiler option (VecGetArray in Fortran) In-Reply-To: References: <520562AC.10708@gmail.com> Message-ID: <520566EA.80808@gmail.com> Hi Satish, Thanks. It works now. Danyang On 09/08/2013 2:52 PM, Satish Balay wrote: > On Fri, 9 Aug 2013, Danyang Su wrote: > >> Hi All, >> >> I get some problem in VecGetArray in Fortran, which I guess is due to the >> default -C compiler option. 
>> >> Current compiler option are: >> $ make ksp_inhm >> /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -MT -Z7 >> -fpp -I/cygdrive/c/cygwin/packages/petsc-3.4.2/in >> clude -I/cygdrive/c/cygwin/packages/petsc-3.4.2/arch-mswin-c-debug/include >> -I/cygdrive/c/cygwin/packages/parmetis-4.0.3/include >> -I/cygdrive/c/cygwin/packages/metis-5.1.0/include -I/cygdrive/c/Program\ >> Files/MPICH2/include -o ksp_inhm.o ksp_inhm.F90 >> /cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -MT -wd4996 -Z7 >> -o ksp_inhm_d ksp_inhm.o -L/cygdrive/c/cygwin/p >> ackages/petsc-3.4.2/arch-mswin-c-debug/lib -lpetsc -lflapack -lfblas >> /cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmet >> is/Release/parmetis.lib >> /cygdrive/c/cygwin/packages/metis-5.1.0/build/libmetis/Release/metis.lib >> /cygdrive/c/Program\ Files/MPIC >> H2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib >> /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Gdi32.lib Us >> er32.lib Advapi32.lib Kernel32.lib Ws2_32.lib >> >> >> Relevant codes are as follows: >> >> #define xx_a(ib) xx_v(xx_i + (ib)) >> PetscScalar :: xx_v(1) >> PetscOffset :: xx_i >> ... >> call VecGetArray(x, xx_v, xx_i, ierr) >> write(*, 90) istart+1, xx_a(istart+1), iend, xx_a(iend) !istart and iend >> are the ownership range >> 90 format ('x(', i6, ') = ',e11.4, ' x(', i6, ') = ',e11.4) >> call VecRestoreArray(x, xx_v, xx_i, ierr) >> >> The output (xx_a(istart+1), xx_a(iend)) is CORRECT for the processor 0, but >> INCORRECT for the other processors. > The code is buggy. You should use xx_a(1),xx_a(n) - where 'n' is the local size. > You can use VecGetLocalSize() to get the local size. > >> I found in the manual that for the fortran user, the compiler option -C should >> not be used. How I can set this option? > This issuse is not -C related. If it were - you would have to use > VecGetArrayF90() - [which should be the prefered function anyway.] > > Satish > >> Thanks and regards, >> >> Danyang >> From luqiyue at gmail.com Sun Aug 11 14:21:48 2013 From: luqiyue at gmail.com (Lu Qiyue) Date: Sun, 11 Aug 2013 14:21:48 -0500 Subject: [petsc-users] About bcgs and Viewer Message-ID: Dear All: I am using /petsc-3.3-p6/src/ksp/ksp/examples/tutorials/ex10.c.html as the driver to calculate a Ax=b system. In the job submit line, I am using: ./ex10 -f input.petsc.bin -ksp_type bcgs -ksp_rtol 1.e-5 -ksp_max_it 40000 -ksp_monitor -table Then I got the error message: [0]PETSC ERROR: No support for this operation for this object type! [0]PETSC ERROR: Viewer type string not supported for KSP cg! Looks bcgs is not compatible with -table option. Because in ex10.c mentioned above, -table has viewer operations. And I used cg and cr, both work. ./ex10 -f input.petsc.bin -ksp_type cg -ksp_rtol 1.e-5 -ksp_max_it 40000 -ksp_monitor -table ./ex10 -f input.petsc.bin -ksp_type cr -ksp_rtol 1.e-5 -ksp_max_it 40000 -ksp_monitor -table So, How could I get the iteration time and other information for bcgs option? Thanks Qiyue Lu -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Aug 11 15:00:18 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 11 Aug 2013 15:00:18 -0500 Subject: [petsc-users] About bcgs and Viewer In-Reply-To: References: Message-ID: This issue has been fixed with the 3.4 release of PETSc. Barry On Aug 11, 2013, at 2:21 PM, Lu Qiyue wrote: > Dear All: > I am using > /petsc-3.3-p6/src/ksp/ksp/examples/tutorials/ex10.c.html > as the driver to calculate a Ax=b system. 
> In the job submit line, I am using: > ./ex10 -f input.petsc.bin -ksp_type bcgs -ksp_rtol 1.e-5 -ksp_max_it 40000 -ksp_monitor -table > > Then I got the error message: > > [0]PETSC ERROR: No support for this operation for this object type! > [0]PETSC ERROR: Viewer type string not supported for KSP cg! > > Looks bcgs is not compatible with -table option. Because in ex10.c mentioned above, -table has viewer operations. > > And I used cg and cr, both work. > ./ex10 -f input.petsc.bin -ksp_type cg -ksp_rtol 1.e-5 -ksp_max_it 40000 -ksp_monitor -table > > ./ex10 -f input.petsc.bin -ksp_type cr -ksp_rtol 1.e-5 -ksp_max_it 40000 -ksp_monitor -table > > So, How could I get the iteration time and other information for bcgs option? > > Thanks > > Qiyue Lu From ztdepyahoo at 163.com Sun Aug 11 19:22:04 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Mon, 12 Aug 2013 08:22:04 +0800 (CST) Subject: [petsc-users] Does normal PC support --with-precision=__float128 Message-ID: <440ae62f.14903.1406fe6b250.Coremail.ztdepyahoo@163.com> -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Aug 11 19:24:25 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 11 Aug 2013 19:24:25 -0500 Subject: [petsc-users] Does normal PC support --with-precision=__float128 In-Reply-To: <440ae62f.14903.1406fe6b250.Coremail.ztdepyahoo@163.com> References: <440ae62f.14903.1406fe6b250.Coremail.ztdepyahoo@163.com> Message-ID: <798F1006-6552-4773-A50D-7265B0192DC5@mcs.anl.gov> It supports all the standard built-in preconditioners in PETSc. It only does not support the external solvers which are not written to work with __float128. What solver do you want to use? Barry On Aug 11, 2013, at 7:22 PM, ??? wrote: > > > From ztdepyahoo at 163.com Sun Aug 11 19:29:29 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Mon, 12 Aug 2013 08:29:29 +0800 (CST) Subject: [petsc-users] Does normal PC support --with-precision=__float128 In-Reply-To: <798F1006-6552-4773-A50D-7265B0192DC5@mcs.anl.gov> References: <440ae62f.14903.1406fe6b250.Coremail.ztdepyahoo@163.com> <798F1006-6552-4773-A50D-7265B0192DC5@mcs.anl.gov> Message-ID: <674ad1e3.14d52.1406fed7d3c.Coremail.ztdepyahoo@163.com> I use the bicgstab solver. I think float128 means a 128-bit floating point number; it is more accurate than a normal double. At 2013-08-12 08:24:25,"Barry Smith" wrote: > > It supports all the standard built-in preconditioners in PETSc. It only does not support the external solvers which are not written to work with __float128. What solver do you want to use? > > Barry > >On Aug 11, 2013, at 7:22 PM, ??? wrote: > >> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Sun Aug 11 20:22:01 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sun, 11 Aug 2013 20:22:01 -0500 Subject: [petsc-users] Does normal PC support --with-precision=__float128 In-Reply-To: <674ad1e3.14d52.1406fed7d3c.Coremail.ztdepyahoo@163.com> References: <440ae62f.14903.1406fe6b250.Coremail.ztdepyahoo@163.com> <798F1006-6552-4773-A50D-7265B0192DC5@mcs.anl.gov> <674ad1e3.14d52.1406fed7d3c.Coremail.ztdepyahoo@163.com> Message-ID: <87txivu2iu.fsf@mcs.anl.gov> ??? writes: > I use the bicgstab solver. I think float128 means a 128-bit floating > point number; it is more accurate than a normal double. Yes, this works as long as you use a PETSc preconditioner rather than an external package.
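For reference, a quadruple-precision build along these lines would be configured and run roughly as follows (a sketch; --download-f2cblaslapack is an assumption here, since the BLAS/LAPACK used must also support __float128, and ./myapp is a placeholder for your executable):

  ./configure --with-precision=__float128 --download-f2cblaslapack
  ./myapp -ksp_type bcgs -pc_type ilu

That is, bicgstab together with any built-in preconditioner (ilu, bjacobi, sor, mg, ...), but not with external packages such as hypre or MUMPS.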
-------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From jefonseca at gmail.com Mon Aug 12 12:24:42 2013 From: jefonseca at gmail.com (Jim Fonseca) Date: Mon, 12 Aug 2013 12:24:42 -0500 Subject: [petsc-users] mixed precision Message-ID: Hi, We are curious about the mixed-precision capabilities in NEMO5. I see that there is a newish configure option to allow single precision for linear solve. Other than that, I found this old post: https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2012-August/014842.html Is there any other information about to see if we can take advantage of this capability? Thanks, Jim -- Jim Fonseca, PhD Research Scientist Network for Computational Nanotechnology Purdue University 765-496-6495 www.jimfonseca.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Aug 12 12:32:56 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 12 Aug 2013 12:32:56 -0500 Subject: [petsc-users] mixed precision In-Reply-To: References: Message-ID: On Mon, Aug 12, 2013 at 12:24 PM, Jim Fonseca wrote: > Hi, > We are curious about the mixed-precision capabilities in NEMO5. I see that > there is a newish configure option to allow single precision for linear > solve. Other than that, I found this old post: > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2012-August/014842.html > > Is there any other information about to see if we can take advantage of > this capability? > Mixed-precision is hard, and especially hard in PETSc because the C type system is limited. However, it also needs to be embedded in an algorithm that can take advantage of it. I would always start out with a clear motivation: - What would mixed precision accomplish in your code? - What is the most possible benefit you would see? and decide if that is worth a large time investment. > Thanks, > Jim > > -- > Jim Fonseca, PhD > Research Scientist > Network for Computational Nanotechnology > Purdue University > 765-496-6495 > www.jimfonseca.com > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From rupp at mcs.anl.gov Mon Aug 12 13:38:34 2013 From: rupp at mcs.anl.gov (Karl Rupp) Date: Mon, 12 Aug 2013 13:38:34 -0500 Subject: [petsc-users] mixed precision In-Reply-To: References: Message-ID: <52092BAA.5030205@mcs.anl.gov> Hi Jim, in addition to what Matt already said, keep in mind is that you usually won't see a two-fold performance gain in iterative solvers anyway, as the various integers used for storing the nonzeros in the sparse matrix don't change their size. I once played with an implementation of an non-preconditioned mixed-precision CG solver, and I only obtained about a 40 percent overall performance gain for well-conditioned systems. For less well-conditioned systems you may not get any better overall performance at all (or worse, fail to converge). Best regards, Karli On 08/12/2013 12:32 PM, Matthew Knepley wrote: > On Mon, Aug 12, 2013 at 12:24 PM, Jim Fonseca > wrote: > > Hi, > We are curious about the mixed-precision capabilities in NEMO5. I > see that there is a newish configure option to allow single > precision for linear solve. 
Other than that, I found this old post: > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2012-August/014842.html > > Is there any other information about to see if we can take advantage > of this capability? > > > Mixed-precision is hard, and especially hard in PETSc because the C type > system is limited. > However, it also needs to be embedded in an algorithm that can take > advantage of it. I would > always start out with a clear motivation: > > - What would mixed precision accomplish in your code? > > - What is the most possible benefit you would see? > > and decide if that is worth a large time investment. > > Thanks, > Jim > > -- > Jim Fonseca, PhD > Research Scientist > Network for Computational Nanotechnology > Purdue University > 765-496-6495 > www.jimfonseca.com > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener From hsahasra at purdue.edu Mon Aug 12 16:05:51 2013 From: hsahasra at purdue.edu (Harshad Sahasrabudhe) Date: Mon, 12 Aug 2013 17:05:51 -0400 (EDT) Subject: [petsc-users] Extracting data from a Petsc matrix In-Reply-To: <87ehb2fm1v.fsf@mcs.anl.gov> Message-ID: <1406721381.8338.1376341551601.JavaMail.root@mailhub027.itcs.purdue.edu> Hi Jed, I am now working to add library support for LU decomposition using MAGMA. I need your help with the following: 1) How do I add the options --download-magma, --with-magma, etc. to the configure script for building with MAGMA? 2) I have a fair idea how the PETSc code is structured and how to add source code to the impls/ directory. How does PETSc get to know that there is an additional implementation (in this case MAGMA) in this directory? Is there a config file of some sort? Thanks, Harshad ----- Original Message ----- From: "Jed Brown" To: hsahasra at purdue.edu, petsc-users at mcs.anl.gov Sent: Saturday, July 13, 2013 12:43:08 PM Subject: Re: [petsc-users] Extracting data from a Petsc matrix "hsahasra at purdue.edu" writes: > Hi, > > I am working on solving a system of linear equations with square > matrix. I'm first factoring the matrix using LU decomposition. I assume you're solving a dense problem because that is all MAGMA does. > I want to do the LU decomposition step using MAGMA on GPUs. MAGMA > library implements LAPACK functions on a CPU+GPU based system. > > So my question is, how do I extract the data from a Petsc Mat so that > it can be sent to the dgetrf routine in MAGMA. MatDenseGetArray > Is there any need for duplicating the data for this step? You're on your own for storage of factors. Alternatively, you could add library support so that you could use PCLU and '-pc_factor_mat_solver_package magma' (or PCFactorSetMatSolverPackage). Doing this is not a priority for us, but we can provide guidance if you want to tackle it. From bsmith at mcs.anl.gov Mon Aug 12 16:25:53 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 12 Aug 2013 16:25:53 -0500 Subject: [petsc-users] Extracting data from a Petsc matrix In-Reply-To: <1406721381.8338.1376341551601.JavaMail.root@mailhub027.itcs.purdue.edu> References: <1406721381.8338.1376341551601.JavaMail.root@mailhub027.itcs.purdue.edu> Message-ID: <8D451A13-10C3-4442-9522-C49CBB929F03@mcs.anl.gov> On Aug 12, 2013, at 4:05 PM, Harshad Sahasrabudhe wrote: > Hi Jed, > > I am now working to add library support for LU decomposition using MAGMA. 
I need your help with the following: > > 1) How do I add the options --download-magma, --with-magma, etc. to the configure script for building with MAGMA? Add a new file in config/PETSc/packages (copy one that is already there and modify for magma). > > 2) I have a fair idea how the PETSc code is structured and how to add source code to the impls/ directory. How does PETSc get to know that there is an additional implementation (in this case MAGMA) in this directory? Is there a config file of some sort? Add the new directory name to list of directories in the makefile in that directory and add in MatRegisterAll(). Barry > > Thanks, > Harshad > > ----- Original Message ----- > From: "Jed Brown" > To: hsahasra at purdue.edu, petsc-users at mcs.anl.gov > Sent: Saturday, July 13, 2013 12:43:08 PM > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > "hsahasra at purdue.edu" writes: > >> Hi, >> >> I am working on solving a system of linear equations with square >> matrix. I'm first factoring the matrix using LU decomposition. > > I assume you're solving a dense problem because that is all MAGMA does. > >> I want to do the LU decomposition step using MAGMA on GPUs. MAGMA >> library implements LAPACK functions on a CPU+GPU based system. >> >> So my question is, how do I extract the data from a Petsc Mat so that >> it can be sent to the dgetrf routine in MAGMA. > > MatDenseGetArray > >> Is there any need for duplicating the data for this step? > > You're on your own for storage of factors. Alternatively, you could add > library support so that you could use PCLU and > '-pc_factor_mat_solver_package magma' (or PCFactorSetMatSolverPackage). > Doing this is not a priority for us, but we can provide guidance if you > want to tackle it. From Shuangshuang.Jin at pnnl.gov Mon Aug 12 20:03:21 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Mon, 12 Aug 2013 18:03:21 -0700 Subject: [petsc-users] Performance of PETSc TS solver Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5C2B@EMAIL04.pnl.gov> Hello, PETSc developers, I have a question regarding the performance of PETSc TS solver expecially the TSTHETA. I used it to solve my DAE equations. There're altogether 1152 functions in my IFunction(), and the Jacobian matrix in my IJacobian is also of dimension 1152*1152. 
The main TS code is set up as following: ierr = TSCreate(PETSC_COMM_WORLD, &ts); CHKERRQ(ierr); ierr = TSSetType(ts, TSTHETA); CHKERRQ(ierr); ierr = TSThetaSetTheta(ts, 0.5); CHKERRQ(ierr); ierr = TSSetIFunction(ts, NULL, (TSIFunction) IFunction, &user); CHKERRQ(ierr); ierr = MatCreate(PETSC_COMM_WORLD, &J); CHKERRQ(ierr); // J: Jacobian matrix ierr = MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 4*ngen, 4*ngen); CHKERRQ(ierr); ierr = MatSetFromOptions(J); CHKERRQ(ierr); ierr = MatSetUp(J); CHKERRQ(ierr); ierr = TSSetIJacobian(ts, J, J, (TSIJacobian) IJacobian, &user); CHKERRQ(ierr); ierr = TSSetDM(ts, da); CHKERRQ(ierr); ierr = formInitialSolution(ts, x, &user, t_step, t_width); CHKERRQ(ierr); ftime = t_step[0] * t_width[0]; ierr = TSSetDuration(ts, PETSC_DEFAULT, ftime); CHKERRQ(ierr); ierr = TSSetSolution(ts, x); CHKERRQ(ierr); ierr = TSSetInitialTimeStep(ts, 0.0, t_width[0]); CHKERRQ(ierr); ierr = TSSetFromOptions(ts);CHKERRQ(ierr); ierr = TSSolve(ts,x);CHKERRQ(ierr); ierr = TSGetSolveTime(ts,&ftime);CHKERRQ(ierr); ierr = TSGetTimeStepNumber(ts,&steps);CHKERRQ(ierr); ierr = PetscPrintf(PETSC_COMM_WORLD,"%D steps, ftime %G\n",steps,ftime);CHKERRQ(ierr); I have recorded the solution times when different numbers of processors are used: 2 processors: 1021 seconds, 4 processors: 587.244 seconds, 8 processors: 421.565 seconds, 16 processors: 355.594 seconds, 32 processors: 322.28 seconds, 64 processors: 382.967 seconds. It seems like with 32 processors, it reaches the best performance. However, 322.28 seconds to solve such DAE equations is too slow than I expected. I have the following questions based on the above results: 1. Is this the usual DAE solving time in PETSc to for the problem with this dimension? 2. I was told that in TS, by default, ksp uses GMRES, and the preconditioner is ILU(0), is there any other alterative ksp solver or options I should use in the command line to solve the problem much faster? 3. Do you have any other suggestion for me to speed up the DAE computation in PETSc? Thanks a lot! Shuangshuang -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Mon Aug 12 20:14:04 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 12 Aug 2013 20:14:04 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5C2B@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5C2B@EMAIL04.pnl.gov> Message-ID: <8761vaqtnn.fsf@mcs.anl.gov> "Jin, Shuangshuang" writes: > Hello, PETSc developers, > I have a question regarding the performance of PETSc TS solver > expecially the TSTHETA. I used it to solve my DAE equations. TSTHETA is not L-stable and not stiffly accurate, so it's not normally something that you'd want to use for a DAE. Make sure you're getting meaningful results and try switching to something like an ARKIMEX or ROSW since those are likely better for your problem. > I have recorded the solution times when different numbers of processors are used: > > 2 processors: 1021 seconds, > 4 processors: 587.244 seconds, > 8 processors: 421.565 seconds, > 16 processors: 355.594 seconds, > 32 processors: 322.28 seconds, > 64 processors: 382.967 seconds. > > It seems like with 32 processors, it reaches the best > performance. However, 322.28 seconds to solve such DAE equations is > too slow than I expected. The number of equations (1152) is quite small, so I'm not surprised there is no further speedup. Can you explain more about your equations? 
> > I have the following questions based on the above results: > 1. Is this the usual DAE solving time in PETSc to for the problem with this dimension? That depends what your function is. > 2. I was told that in TS, by default, ksp uses GMRES, and the > preconditioner is ILU(0), is there any other alterative ksp solver or > options I should use in the command line to solve the problem much > faster? I would use -ksp_type preonly -pc_type lu for such small problems. Is the system dense? > 3. Do you have any other suggestion for me to speed up the DAE computation in PETSc? Can you describe what sort of problem you're dealing with, what causes the stiffness in your equations, what accuracy you want, etc. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From bsmith at mcs.anl.gov Mon Aug 12 20:39:00 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 12 Aug 2013 20:39:00 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <8761vaqtnn.fsf@mcs.anl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5C2B@EMAIL04.pnl.gov> <8761vaqtnn.fsf@mcs.anl.gov> Message-ID: <6BC6B555-1881-45D4-BD5A-8350A9443826@mcs.anl.gov> Also always send the output from running with -log_summary whenever you ask performance questions so we know what kind of performance it is getting. Barry On Aug 12, 2013, at 8:14 PM, Jed Brown wrote: > "Jin, Shuangshuang" writes: > >> Hello, PETSc developers, >> I have a question regarding the performance of PETSc TS solver >> expecially the TSTHETA. I used it to solve my DAE equations. > > TSTHETA is not L-stable and not stiffly accurate, so it's not normally > something that you'd want to use for a DAE. Make sure you're getting > meaningful results and try switching to something like an ARKIMEX or > ROSW since those are likely better for your problem. > >> I have recorded the solution times when different numbers of processors are used: >> >> 2 processors: 1021 seconds, >> 4 processors: 587.244 seconds, >> 8 processors: 421.565 seconds, >> 16 processors: 355.594 seconds, >> 32 processors: 322.28 seconds, >> 64 processors: 382.967 seconds. >> >> It seems like with 32 processors, it reaches the best >> performance. However, 322.28 seconds to solve such DAE equations is >> too slow than I expected. > > The number of equations (1152) is quite small, so I'm not surprised > there is no further speedup. Can you explain more about your equations? > >> >> I have the following questions based on the above results: >> 1. Is this the usual DAE solving time in PETSc to for the problem with this dimension? > > That depends what your function is. > >> 2. I was told that in TS, by default, ksp uses GMRES, and the >> preconditioner is ILU(0), is there any other alterative ksp solver or >> options I should use in the command line to solve the problem much >> faster? > > I would use -ksp_type preonly -pc_type lu for such small problems. Is > the system dense? > >> 3. Do you have any other suggestion for me to speed up the DAE computation in PETSc? > > Can you describe what sort of problem you're dealing with, what causes > the stiffness in your equations, what accuracy you want, etc. 
From pengxwang at hotmail.com Tue Aug 13 10:31:21 2013 From: pengxwang at hotmail.com (Roc Wang) Date: Tue, 13 Aug 2013 10:31:21 -0500 Subject: [petsc-users] questions about DMDAGetNeighbors() Message-ID: Hi, I am confused about calling PetscErrorCode DMDAGetNeighbors(DM da,const PetscMPIInt *ranks[]), I am calling the function like this: PetscMPIInt ** neighbors; DMDAGetNeighbors(da, neighbors); but the error shows src/solver.cpp:152: error: invalid conversion from ?PetscMPIInt**? to ?const PetscMPIInt**? src/solver.cpp:152: error: initializing argument 2 of ?PetscErrorCode DMDAGetNeighbors(_p_DM*, const PetscMPIInt**)? How should I define the neighbors before calling the function? Sorry it's not a pure Petsc question, but I am new in c++. Thanks for any comments. -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Aug 13 10:35:35 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 13 Aug 2013 10:35:35 -0500 Subject: [petsc-users] questions about DMDAGetNeighbors() In-Reply-To: References: Message-ID: On Tue, Aug 13, 2013 at 10:31 AM, Roc Wang wrote: > Hi, > > I am confused about calling > > PetscErrorCode DMDAGetNeighbors(DM da,const PetscMPIInt *ranks[]), > > I am calling the function like this: > > PetscMPIInt ** neighbors; > DMDAGetNeighbors(da, neighbors); > > but the error shows > > src/solver.cpp:152: error: invalid conversion from ?PetscMPIInt**? to ?const PetscMPIInt**? > src/solver.cpp:152: error: initializing argument 2 of ?PetscErrorCode DMDAGetNeighbors(_p_DM*, const PetscMPIInt**)? > > How should I define the neighbors before calling the function? Sorry it's not a pure Petsc question, but I am new in c++. Thanks for any comments. > > const PetscMPIInt *neighbors; DMDAGetNeighbors(dm, &neighbors); Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From Shuangshuang.Jin at pnnl.gov Tue Aug 13 12:34:46 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Tue, 13 Aug 2013 10:34:46 -0700 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6BC6B555-1881-45D4-BD5A-8350A9443826@mcs.anl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5C2B@EMAIL04.pnl.gov> <8761vaqtnn.fsf@mcs.anl.gov> <6BC6B555-1881-45D4-BD5A-8350A9443826@mcs.anl.gov> Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5CF7@EMAIL04.pnl.gov> Hello, Jed and Barry, thanks for your reply. We are solving a power system dynamic simulation problem. We set up the DAE equations and its Jacobian matrix, and would like to use the Trapezoid method to solve it. That's also the reason why we chose TSTHETA. From the PETSc manual, we read that: "-ts_type theta -ts_theta_theta 0.5 -ts_theta_endpoint corresponds to Crank-Nicholson (TSCN). This method can be applied to DAE. For the default Theta=0.5, this is the trapezoid rule (also known as Crank-Nicolson, see TSCN)." I haven't heard of ARKIMEX or ROSW before. Are they some external packages or DAE solvers that implement the Trapezoid method? I have also tried the -ksp_type preonly -pc_type lu option you indicated but failed. The PETSC ERROR messages are: No support for this operation for this object type! Matrix format mpiaij does not have a built-in PETSc LU! Attached please see the log_summary for running the TSTHETA with -ts_theta_theta 0.5 and its default ksp solver. 
Please help me evaluate the performance and see where the bottleneck in the computation is.

Thanks a lot!

Shuangshuang

-----Original Message-----
From: Barry Smith [mailto:bsmith at mcs.anl.gov]
Sent: Monday, August 12, 2013 6:39 PM
To: Jed Brown
Cc: Jin, Shuangshuang; petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Performance of PETSc TS solver

   Also, always send the output from running with -log_summary whenever you ask performance questions, so we know what kind of performance the code is getting.

   Barry

On Aug 12, 2013, at 8:14 PM, Jed Brown wrote:

> "Jin, Shuangshuang" writes:
>
>> Hello, PETSc developers,
>> I have a question regarding the performance of the PETSc TS solver,
>> especially TSTHETA. I used it to solve my DAE equations.
>
> TSTHETA is not L-stable and not stiffly accurate, so it's not normally
> something that you'd want to use for a DAE. Make sure you're getting
> meaningful results and try switching to something like an ARKIMEX or
> ROSW since those are likely better for your problem.
>
>> I have recorded the solution times when different numbers of processors are used:
>>
>> 2 processors: 1021 seconds,
>> 4 processors: 587.244 seconds,
>> 8 processors: 421.565 seconds,
>> 16 processors: 355.594 seconds,
>> 32 processors: 322.28 seconds,
>> 64 processors: 382.967 seconds.
>>
>> It seems like with 32 processors, it reaches the best performance.
>> However, 322.28 seconds to solve such DAE equations is slower than
>> I expected.
>
> The number of equations (1152) is quite small, so I'm not surprised
> there is no further speedup. Can you explain more about your equations?
>
>>
>> I have the following questions based on the above results:
>> 1. Is this the usual DAE solving time in PETSc for a problem of this dimension?
>
> That depends what your function is.
>
>> 2. I was told that in TS, by default, KSP uses GMRES, and the
>> preconditioner is ILU(0); is there any alternative KSP solver or
>> options I should use on the command line to solve the problem much
>> faster?
>
> I would use -ksp_type preonly -pc_type lu for such small problems. Is
> the system dense?
>
>> 3. Do you have any other suggestions for me to speed up the DAE computation in PETSc?
>
> Can you describe what sort of problem you're dealing with, what causes
> the stiffness in your equations, what accuracy you want, etc.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: job.out.summary
Type: application/octet-stream
Size: 19555 bytes
Desc: job.out.summary
URL:

From abhyshr at mcs.anl.gov Tue Aug 13 13:00:41 2013
From: abhyshr at mcs.anl.gov (Shri)
Date: Tue, 13 Aug 2013 13:00:41 -0500
Subject: Re: [petsc-users] Performance of PETSc TS solver
In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5CF7@EMAIL04.pnl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5C2B@EMAIL04.pnl.gov> <8761vaqtnn.fsf@mcs.anl.gov> <6BC6B555-1881-45D4-BD5A-8350A9443826@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB5B5CF7@EMAIL04.pnl.gov>
Message-ID: <1754384B-D23B-4257-8262-45FDD668EF95@mcs.anl.gov>

90% of the time is spent in your Jacobian evaluation routine, which is clearly the bottleneck.

On Aug 13, 2013, at 12:34 PM, Jin, Shuangshuang wrote:

> Hello, Jed and Barry, thanks for your reply.
>
> We are solving a power system dynamic simulation problem. We set up the DAE equations and their Jacobian matrix, and would like to use the trapezoid method to solve them.
>
> That's also the reason why we chose TSTHETA.
> From the PETSc manual, we read that:
>
> "-ts_type theta -ts_theta_theta 0.5 -ts_theta_endpoint corresponds to Crank-Nicholson (TSCN). This method can be applied to DAE.
> For the default Theta=0.5, this is the trapezoid rule (also known as Crank-Nicolson, see TSCN)."
>
> I haven't heard of ARKIMEX or ROSW before. Are they external packages, or DAE solvers that implement the trapezoid method?
>
> I have also tried the -ksp_type preonly -pc_type lu option you indicated, but it failed. The PETSC ERROR messages are: No support for this operation for this object type! Matrix format mpiaij does not have a built-in PETSc LU!

PETSc does not have a native parallel direct solver. You can use MUMPS (--download-mumps) or superlu_dist (--download-superlu_dist).

> Attached please find the log_summary from running TSTHETA with -ts_theta_theta 0.5 and its default KSP solver. Please help me evaluate the performance and see where the bottleneck in the computation is.
>
> Thanks a lot!
>
> Shuangshuang
>
> -----Original Message-----
> From: Barry Smith [mailto:bsmith at mcs.anl.gov]
> Sent: Monday, August 12, 2013 6:39 PM
> To: Jed Brown
> Cc: Jin, Shuangshuang; petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Performance of PETSc TS solver
>
> Also, always send the output from running with -log_summary whenever you ask performance questions, so we know what kind of performance the code is getting.
>
> Barry
>
> On Aug 12, 2013, at 8:14 PM, Jed Brown wrote:
>
>> "Jin, Shuangshuang" writes:
>>
>>> Hello, PETSc developers,
>>> I have a question regarding the performance of the PETSc TS solver,
>>> especially TSTHETA. I used it to solve my DAE equations.
>>
>> TSTHETA is not L-stable and not stiffly accurate, so it's not normally
>> something that you'd want to use for a DAE. Make sure you're getting
>> meaningful results and try switching to something like an ARKIMEX or
>> ROSW since those are likely better for your problem.
>>
>>> I have recorded the solution times when different numbers of processors are used:
>>>
>>> 2 processors: 1021 seconds,
>>> 4 processors: 587.244 seconds,
>>> 8 processors: 421.565 seconds,
>>> 16 processors: 355.594 seconds,
>>> 32 processors: 322.28 seconds,
>>> 64 processors: 382.967 seconds.
>>>
>>> It seems like with 32 processors, it reaches the best performance.
>>> However, 322.28 seconds to solve such DAE equations is slower than
>>> I expected.
>>
>> The number of equations (1152) is quite small, so I'm not surprised
>> there is no further speedup. Can you explain more about your equations?
>>
>>> I have the following questions based on the above results:
>>> 1. Is this the usual DAE solving time in PETSc for a problem of this dimension?
>>
>> That depends what your function is.
>>
>>> 2. I was told that in TS, by default, KSP uses GMRES, and the
>>> preconditioner is ILU(0); is there any alternative KSP solver or
>>> options I should use on the command line to solve the problem much
>>> faster?
>>
>> I would use -ksp_type preonly -pc_type lu for such small problems. Is
>> the system dense?
>>
>>> 3. Do you have any other suggestions for me to speed up the DAE computation in PETSc?
>>
>> Can you describe what sort of problem you're dealing with, what causes
>> the stiffness in your equations, what accuracy you want, etc.
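ARKIMEX (additive Runge-Kutta IMEX) and ROSW (Rosenbrock-W) are native PETSc TS implementations, not external packages, so trying them requires no code changes, only different command-line options. As a rough sketch of how the suggestions in this thread combine (the executable name, process count, and the particular ARKIMEX variant below are placeholders rather than values taken from this thread, and MUMPS is assumed to have been installed with --download-mumps):

  mpiexec -n 32 ./mysim -ts_type arkimex -ts_arkimex_type 2e \
          -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps \
          -ts_monitor -log_summary

  mpiexec -n 32 ./mysim -ts_type rosw \
          -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps \
          -ts_monitor -log_summary

The -pc_factor_mat_solver_package mumps option is what makes the earlier -ksp_type preonly -pc_type lu suggestion usable in parallel: it replaces PETSc's built-in (sequential-only) LU with MUMPS's distributed factorization, avoiding the "Matrix format mpiaij does not have a built-in PETSc LU" error quoted above.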
From danyang.su at gmail.com Tue Aug 13 13:01:28 2013
From: danyang.su at gmail.com (Danyang Su)
Date: Tue, 13 Aug 2013 11:01:28 -0700
Subject: [petsc-users] Questions on setting value through VecGetArrayF90
Message-ID: <520A7478.5030405@gmail.com>

Hi All,

I have the following code; it compiles, but it always throws an error at runtime. I also tried the example ex44f.F90, and it throws a similar error.

call DMDAGetInfo(da,PETSC_NULL_INTEGER,mx,PETSC_NULL_INTEGER, &
     PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
     PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
     PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
     PETSC_NULL_INTEGER,ierr)
call DMDAGetCorners(da,xs,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
     xm,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,ierr)
call VecGetArrayF90(b,vecpointer,ierr)
do i = xs, xs+xm-1
   vecpointer(i)=b_in(i+1)   !!! Error here
end do
call VecRestoreArrayF90(b,vecpointer,ierr)

The compile command and output are:

$ make ksp_inhm_d
/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe ifort -c -MT -Z7 -fpp -I/cygdrive/c/cygwin/packages/petsc-3.4.2/include -I/cygdrive/c/cygwin/packages/petsc-3.4.2/arch-mswin-c-debug/include -I/cygdrive/c/cygwin/packages/parmetis-4.0.3/include -I/cygdrive/c/cygwin/packages/metis-5.1.0/include -I/cygdrive/c/Program\ Files/MPICH2/include -o ksp_inhm.o ksp_inhm.F90
/cygdrive/c/cygwin/packages/petsc-3.4.2/bin/win32fe/win32fe cl -MT -wd4996 -Z7 -o ksp_inhm_d ksp_inhm.o -L/cygdrive/c/cygwin/packages/petsc-3.4.2/arch-mswin-c-debug/lib -lpetsc -lflapack -lfblas /cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmetis/Release/parmetis.lib /cygdrive/c/cygwin/packages/metis-5.1.0/build/libmetis/Release/metis.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2.lib /cygdrive/c/Program\ Files/MPICH2/lib/fmpich2g.lib /cygdrive/c/Program\ Files/MPICH2/lib/mpi.lib Gdi32.lib User32.lib Advapi32.lib Kernel32.lib Ws2_32.lib
/usr/bin/rm -f ksp_inhm.o ksp_inhm.mod

And the error messages are:

[1]PETSC ERROR: PetscTrFreeDefault() called from VecDestroy_MPI() line 20 in src/vec/vec/impls/mpi/C:\cygwin\packages\PETSC-~1.2\src\vec\vec\impls\mpi\pdvec.c
[1]PETSC ERROR: Block [id=0(4048)] at address 000000000279DBE0 is corrupted (probably write past end of array)
[1]PETSC ERROR: Block allocated in VecCreate_MPI_Private() line 197 in src/vec/vec/impls/mpi/C:\cygwin\packages\PETSC-~1.2\src\vec\vec\impls\mpi\pbvec.c
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Memory corruption!
[1]PETSC ERROR: Corrupted memory!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: D:\dsu\ResearchAtUBC\Dropbox\ParallelDevelop\AuxiliaryPrograms\Petsc-Solver-Test\petsc_unsym_f\ksp_inhm_d.exe on a arch-mswin-c-debug named NWMOP by dsu Tue Aug 13 18:46:55 2013
[1]PETSC ERROR: Libraries linked from /cygdrive/c/cygwin/packages/petsc-3.4.2/arch-mswin-c-debug/lib
[1]PETSC ERROR: Configure run at Tue Aug 6 10:46:18 2013
[1]PETSC ERROR: Configure options --with-cc="win32fe cl" --with-fc="win32fe ifort" --with-cxx="win32fe cl" --with-parmetis-include=/cygdrive/c/cygwin/packages/parmetis-4.0.3/include --with-parmetis-lib=/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmetis/Release/parmetis.lib --with-metis-include=/cygdrive/c/cygwin/packages/metis-5.1.0/include --with-metis-lib=/cygdrive/c/cygwin/packages/metis-5.1.0/build/libmetis/Release/metis.lib --download-f-blas-lapack --useThreads=0
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: PetscTrFreeDefault() line 301 in src/sys/memory/C:\cygwin\packages\PETSC-~1.2\src\sys\memory\mtr.c
[1]PETSC ERROR: VecDestroy_MPI() line 20 in src/vec/vec/impls/mpi/C:\cygwin\packages\PETSC-~1.2\src\vec\vec\impls\mpi\pdvec.c
[1]PETSC ERROR: VecDestroy() line 546 in src/vec/vec/interface/C:\cygwin\packages\PETSC-~1.2\src\vec\vec\INTERF~1\vector.c
[1]PETSC ERROR: PetscTrFreeDefault() called from VecDestroy_MPI() line 30 in src/vec/vec/impls/mpi/C:\cygwin\packages\PETSC-~1.2\src\vec\vec\impls\mpi\pdvec.c
[1]PETSC ERROR: Block at address 000000000279FDD0 is corrupted; cannot free; may be block not allocated with PetscMalloc()
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Memory corruption!
[1]PETSC ERROR: Bad location or corrupted memory!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: D:\dsu\ResearchAtUBC\Dropbox\ParallelDevelop\AuxiliaryPrograms\Petsc-Solver-Test\petsc_unsym_f\ksp_inhm_d.exe on a arch-mswin-c-debug named NWMOP by dsu Tue Aug 13 18:46:55 2013
[1]PETSC ERROR: Libraries linked from /cygdrive/c/cygwin/packages/petsc-3.4.2/arch-mswin-c-debug/lib
[1]PETSC ERROR: Configure run at Tue Aug 6 10:46:18 2013
[1]PETSC ERROR: Configure options --with-cc="win32fe cl" --with-fc="win32fe ifort" --with-cxx="win32fe cl" --with-parmetis-include=/cygdrive/c/cygwin/packages/parmetis-4.0.3/include --with-parmetis-lib=/cygdrive/c/cygwin/packages/parmetis-4.0.3/build/libparmetis/Release/parmetis.lib --with-metis-include=/cygdrive/c/cygwin/packages/metis-5.1.0/include --with-metis-lib=/cygdrive/c/cygwin/packages/metis-5.1.0/build/libmetis/Release/metis.lib --download-f-blas-lapack --useThreads=0
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: PetscTrFreeDefault() line 283 in src/sys/memory/C:\cygwin\packages\PETSC-~1.2\src\sys\memory\mtr.c
[1]PETSC ERROR: VecDestroy_MPI() line 30 in src/vec/vec/impls/mpi/C:\cygwin\packages\PETSC-~1.2\src\vec\vec\impls\mpi\pdvec.c
[1]PETSC ERROR: VecDestroy() line 546 in src/vec/vec/interface/C:\cygwin\packages\PETSC-~1.2\src\vec\vec\INTERF~1\vector.c

Thanks and regards,

Danyang
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jedbrown at mcs.anl.gov Tue Aug 13 13:14:45 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 13 Aug 2013 13:14:45 -0500
Subject: [petsc-users] Questions on setting value through VecGetArrayF90
In-Reply-To: <520A7478.5030405@gmail.com>
References: <520A7478.5030405@gmail.com>
Message-ID: <877gfpo3u2.fsf@mcs.anl.gov>

Danyang Su writes:

> Hi All,
>
> I have the following code; it compiles, but it always throws an error
> at runtime. I also tried the example ex44f.F90, and it throws a
> similar error.
>
> call DMDAGetInfo(da,PETSC_NULL_INTEGER,mx,PETSC_NULL_INTEGER, &
>      PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
>      PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
>      PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
>      PETSC_NULL_INTEGER,ierr)
> call DMDAGetCorners(da,xs,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER, &
>      xm,PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,ierr)
> call VecGetArrayF90(b,vecpointer,ierr)
> do i = xs, xs+xm-1
>    vecpointer(i)=b_in(i+1)   !!! Error here

If you want to use global indices like this, you need
DMDAVecGetArrayF90. See src/dm/examples/tutorials/ex11f90.F and
src/snes/examples/tutorials/ex5f90.F for two ways to do this.
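To spell out the cause of the corruption: VecGetArrayF90 returns a pointer indexed by local position, running from 1 to xm in Fortran, while the loop above indexes it with the global index i = xs, ..., xs+xm-1. Any process with xs > 0 therefore writes past the end of its local array, which is exactly the "write past end of array" reported in the traceback. Below is a minimal sketch of the two working variants, reusing the names b, b_in, da, xs, xm, and vecpointer from the original post; it assumes b is a global vector of the 1D DMDA da with one degree of freedom, so that vecpointer can be a rank-1 pointer (PetscScalar, pointer :: vecpointer(:)).

  ! Variant 1: VecGetArrayF90 yields a 1-based array of the local
  ! entries, so the global index i must be shifted into 1..xm.
  call VecGetArrayF90(b,vecpointer,ierr)
  do i = xs, xs+xm-1
     vecpointer(i-xs+1) = b_in(i+1)
  end do
  call VecRestoreArrayF90(b,vecpointer,ierr)

  ! Variant 2: DMDAVecGetArrayF90 yields an array addressed by the
  ! DMDA's global (0-based) indices, so i can be used directly.
  call DMDAVecGetArrayF90(da,b,vecpointer,ierr)
  do i = xs, xs+xm-1
     vecpointer(i) = b_in(i+1)
  end do
  call DMDAVecRestoreArrayF90(da,b,vecpointer,ierr)

In higher dimensions the pointer from DMDAVecGetArrayF90 gains one dimension per grid direction, which is the pattern the ex11f90.F and ex5f90.F examples cited above demonstrate.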
From mrosso at uci.edu Tue Aug 13 14:28:54 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Tue, 13 Aug 2013 12:28:54 -0700
Subject: [petsc-users] GAMG speed
In-Reply-To: <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov>
Message-ID: <520A88F6.9070603@uci.edu>

Hi Barry,

I was finally able to try multigrid with a singular system and a finer grid.
GAMG works perfectly and has no problem handling the singular system.
On the other hand, MG is giving me problems:

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Argument out of range!
[0]PETSC ERROR: Partition in x direction is too fine! 32 64!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./hit on a arch-cray-xt5-pkgs-opt named nid01332 by Unknown Tue Aug 13 15:06:21 2013
[0]PETSC ERROR: Libraries linked from /nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib
[0]PETSC ERROR: Configure run at Wed Jul 31 22:48:06 2013

The input I used is:
-ksp_monitor -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 4 -options_left

I am simulating a 256^3 grid with 256 processors. Since I am using a 2D domain decomposition, each sub-domain contains 256x64x4 grid points.
To be consistent with my code indexing, I had to initialize the DMDA with reverse ordering, that is z,y,x, so when the error message says "x is too fine" it actually means "z is too fine".
I was wondering what the minimum number of nodes per direction is that would avoid this problem, and how the number of levels is related to the minimum grid size required.
Thank you!

Michele

On 08/02/2013 03:11 PM, Barry Smith wrote:
> On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote:
>
>> Barry,
>>
>> thank you very much for your help. I was trying to debug the error with no success!
>> Now it works like a charm for me too!
>> I have still two questions for you:
>>
>> 1) How did you choose the number of levels to use: trial and error?
> I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels.
>
>> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter?
> I don't know of anything.
>
> But there is a possible problem with -pc_mg_galerkin: PETSc does not transfer the null space information from the fine mesh to the other meshes, and technically we really want the multigrid to remove the null space on all the levels, but usually it will work without worrying about that.
>
> Barry
>
>> Again, thank you very much!
>>
>> Michele
>>
>> On 08/02/2013 02:38 PM, Barry Smith wrote:
>>> Finally got it. My failing memory.
I had to add the line >>> >>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >>> >>> immediately after KSPSetDM() and >>> >>> change >>> >>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>> >>> to >>> >>> call DMCreateMatrix(da,MATAIJ,A,ierr) >>> >>> so it will work in both parallel and sequential then >>> >>> ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 >>> >>> works great with 2 levels. >>> >>> Barry >>> >>> >>> >>> >>> On Aug 1, 2013, at 6:29 PM, Michele Rosso >>> >>> wrote: >>> >>> >>>> Barry, >>>> >>>> no problem. I attached the full code in test_poisson_solver.tar.gz. >>>> My test code is a very reduced version of my productive code (incompressible DNS code) thus fftw3 and the library 2decomp&fft are needed to run it. >>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem. >>>> Please, contact me for any question/suggestion. >>>> I the mean time I will try to debug it. >>>> >>>> Michele >>>> >>>> >>>> >>>> >>>> On 08/01/2013 04:19 PM, Barry Smith wrote: >>>> >>>>> Run on one process until this is debugged. You can try the option >>>>> >>>>> -start_in_debugger noxterm >>>>> >>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? >>>>> >>>>> Barry >>>>> >>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>>>> >>>>> >>>>> >>>>> wrote: >>>>> >>>>> >>>>> >>>>>> Barry, >>>>>> >>>>>> I checked the matrix: the element (0,0) is not zero, nor any other diagonal element is. >>>>>> The matrix is symmetric positive define (i.e. the standard Poisson matrix). >>>>>> Also, -da_refine is never used (see previous output). >>>>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left >>>>>> >>>>>> and now the error is different: >>>>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>> [1]PETSC ERROR: Floating point exception! >>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>> [2]PETSC ERROR: Floating point exception! >>>>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>> [3]PETSC ERROR: Floating point exception! >>>>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! 
>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>> [3]PETSC ERROR: Configure options >>>>>> Configure run at Thu Aug 1 12:01:44 2013 >>>>>> [1]PETSC ERROR: Configure options >>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>> Configure options >>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>> [1]PETSC ERROR: PCApply_MG() line 330 
in src/ksp/pc/impls/mg/mg.c >>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> --------------------- Error Message ------------------------------------ >>>>>> [0]PETSC ERROR: Floating point exception! >>>>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>> [0]PETSC ERROR: Configure options >>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>> >>>>>> #PETSc Option Table entries: >>>>>> -ksp_view >>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>> -mg_levels_ksp_type chebyshev >>>>>> -mg_levels_pc_type jacobi >>>>>> -options_left >>>>>> -pc_mg_galerkin >>>>>> -pc_type mg >>>>>> #End of PETSc Option Table entries >>>>>> There are no unused options. >>>>>> >>>>>> Michele >>>>>> >>>>>> >>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>>> >>>>>> >>>>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot us Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebychev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>> >>>>>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> wrote: >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>>> Barry, >>>>>>>> >>>>>>>> here it is the fraction of code where I set the rhs term and the matrix. >>>>>>>> >>>>>>>> ! Create matrix >>>>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>> >>>>>>>> ! Create rhs term >>>>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>>>> >>>>>>>> ! Solve system >>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>> call KSPSetUp(ksp,ierr) >>>>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>>>> >>>>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>>>>> >>>>>>>> >>>>>>>> Michele >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? 
The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> >>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Barry, >>>>>>>>>> >>>>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>>>>> >>>>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>>>> >>>>>>>>>> The output is: >>>>>>>>>> >>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>> [2]PETSC ERROR: Configure options >>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> Arguments are incompatible! >>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> Zero diagonal on row 0! 
>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> See docs/index.html for manual pages. >>>>>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: ./test on a 
linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>> -da_refine 4 >>>>>>>>>> -ksp_view >>>>>>>>>> -options_left >>>>>>>>>> -pc_mg_galerkin >>>>>>>>>> -pc_type mg >>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>> There is one unused database option. It is: >>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>> >>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>>> ! Create Global Vectors >>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>> ! Create matrix >>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>> ! Create solver >>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! 
Real residual >>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>> >>>>>>>>>> ! To allow using option from command line >>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Michele >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>>>> >>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Barry, >>>>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>>>>>>> Thank you, >>>>>>>>>>>> Michele >>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>> >>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? 
>>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>>> >>>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>> type: mg >>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>> There is one unused database option. 
It is: >>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>> >>>>>>>>>>>>>> Michele >>>>>>>>>>>>>> >>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> linear system matrix = precond matrix: 
>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>> 
type: seqaij >>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>> #PETSc Option Table entries: 
>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>> <2decomp_fft-1.5.847-modified.tar.gz> >>>> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rupp at mcs.anl.gov Tue Aug 13 15:33:25 2013 From: rupp at mcs.anl.gov (Karl Rupp) Date: Tue, 13 Aug 2013 15:33:25 -0500 Subject: [petsc-users] GAMG speed In-Reply-To: <520A88F6.9070603@uci.edu> References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> Message-ID: <520A9815.7030400@mcs.anl.gov> Hi Michele, I suggest you try a different decomposition of your grid. With k levels, you should have at least 2^{k-1} grid nodes per coordinate direction in order to be able to correctly build a coarser mesh. In your case, you should have at least 8 nodes (leading to coarser levels of size 4, 2, and 1) in z direction. Best regards, Karli On 08/13/2013 02:28 PM, Michele Rosso wrote: > Hi Barry, > > I was finally able to try multigrid with a singular system and a finer grid. > GAMG works perfectly and has no problem in handling the singular system. > On the other hand, MG is giving me problem: > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Argument out of range! > [0]PETSC ERROR: Partition in x direction is too fine! 32 64! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./hit on a arch-cray-xt5-pkgs-opt named nid01332 by > Unknown Tue Aug 13 15:06:21 2013 > [0]PETSC ERROR: Libraries linked from > /nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib > [0]PETSC ERROR: Configure run at Wed Jul 31 22:48:06 2013 > > The input I used is: > -ksp_monitor -ksp_converged_reason -pc_type mg -pc_mg_galerkin > -pc_mg_levels 4 -options_left > > I am simulating a 256^3 grid with 256 processors. Since I am using a 2D > domain decomposition, each sub-domain contains 256x64x4 grid points. > To be consistent with my code indexing, I had to initialize DMDA with > reverse ordering, that is z,y,x, so when the error message says "x is > too fine" it actually means "z is too fine". > I was wondering what is the minimum number of nodes per direction that > would avoid this problem and how the number of levels is related to > minimum grid size required. > Thank you! > > Michele > > > On 08/02/2013 03:11 PM, Barry Smith wrote: >> On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote: >> >>> Barry, >>> >>> thank you very much for your help. I was trying to debug the error with no success! 
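(A quick check of the rule Karl states above, worked out for this case: with factor-2 coarsening and k multigrid levels, a direction with n points is coarsened to n/2, n/4, ..., n/2^(k-1), so n must be at least 2^(k-1) points, and on a periodic DMDA typically divisible by it as well. The 256x64x4 subdomains quoted above have only 4 points in z, while -pc_mg_levels 4 needs 2^3 = 8, which is exactly what the "too fine" message reports. A minimal Fortran sketch of the same arithmetic, purely illustrative:

      integer :: npts, lev
      npts = 8            ! Karl's minimum for 4 levels
      do lev = 1, 4
         print *, 'level', lev, 'has', npts, 'points'
         npts = npts / 2  ! prints 8, 4, 2, 1
      end do

Starting instead from npts = 4, the sequence is 4, 2, 1, 0, so a fourth level cannot be built.)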
>>> Now it works like a charm for me too! >>> I still have two questions for you: >>> >>> 1) How did you choose the number of levels to use: trial and error? >> I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels. >> >>> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter? >> I don't know of anything. >> >> But there is a possible problem with -pc_mg_galerkin, PETSc does not transfer the null space information from the fine mesh to the other meshes and technically we really want the multigrid to remove the null space on all the levels but usually it will work without worrying about that. >> >> Barry >> >>> Again, thank you very much! >>> >>> Michele >>> >>> On 08/02/2013 02:38 PM, Barry Smith wrote: >>>> Finally got it. My failing memory. I had to add the line >>>> >>>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >>>> >>>> immediately after KSPSetDM() and >>>> >>>> change >>>> >>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>> >>>> to >>>> >>>> call DMCreateMatrix(da,MATAIJ,A,ierr) >>>> >>>> so it will work in both parallel and sequential then >>>> >>>> -ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 >>>> >>>> works great with 2 levels. >>>> >>>> Barry >>>> >>>> >>>> >>>> >>>> On Aug 1, 2013, at 6:29 PM, Michele Rosso >>>> >>>> wrote: >>>> >>>> >>>>> Barry, >>>>> >>>>> no problem. I attached the full code in test_poisson_solver.tar.gz. >>>>> My test code is a very reduced version of my production code (incompressible DNS code) thus fftw3 and the library 2decomp&fft are needed to run it. >>>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem. >>>>> Please, contact me for any question/suggestion. >>>>> In the meantime I will try to debug it. >>>>> >>>>> Michele >>>>> >>>>> >>>>> >>>>> >>>>> On 08/01/2013 04:19 PM, Barry Smith wrote: >>>>> >>>>>> Run on one process until this is debugged. You can try the option >>>>>> >>>>>> -start_in_debugger noxterm >>>>>> >>>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? >>>>>> >>>>>> Barry >>>>>> >>>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>>>>> >>>>>> >>>>>> wrote: >>>>>> >>>>>> >>>>>> >>>>>>> Barry, >>>>>>> >>>>>>> I checked the matrix: the element (0,0) is not zero, nor any other diagonal element is. >>>>>>> The matrix is symmetric positive definite (i.e. the standard Poisson matrix). >>>>>>> Also, -da_refine is never used (see previous output). >>>>>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left >>>>>>> >>>>>>> and now the error is different: >>>>>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> [1]PETSC ERROR: Floating point exception! >>>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> [2]PETSC ERROR: Floating point exception! >>>>>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>> [3]PETSC ERROR: Floating point exception! >>>>>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [3]PETSC ERROR: Configure options >>>>>>> Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [1]PETSC ERROR: Configure options >>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>> Configure options >>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [3]PETSC ERROR: KSPSolve_CG() 
line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> --------------------- Error Message ------------------------------------ >>>>>>> [0]PETSC ERROR: Floating point exception! >>>>>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>> [0]PETSC ERROR: Configure options >>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>> >>>>>>> #PETSc Option Table entries: >>>>>>> -ksp_view >>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>> -mg_levels_ksp_type chebyshev >>>>>>> -mg_levels_pc_type jacobi >>>>>>> -options_left >>>>>>> -pc_mg_galerkin >>>>>>> -pc_type mg >>>>>>> #End of PETSc Option Table entries >>>>>>> There are no unused options. >>>>>>> >>>>>>> Michele >>>>>>> >>>>>>> >>>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>>>> >>>>>>> >>>>>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot use Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>> >>>>>>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order.
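(For reference, the jacobi/chebyshev smoother combination Barry suggests above can also be fixed in source instead of on the command line; a minimal sketch, assuming ksp is the solver being configured:

      call PetscOptionsSetValue('-mg_levels_pc_type', 'jacobi', ierr)
      call PetscOptionsSetValue('-mg_levels_ksp_type', 'chebyshev', ierr)
      call KSPSetFromOptions(ksp, ierr)

The options take effect at the KSPSetFromOptions() call, exactly as if they had been passed on the command line.)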
>>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Barry, >>>>>>>>> >>>>>>>>> here it is the fraction of code where I set the rhs term and the matrix. >>>>>>>>> >>>>>>>>> ! Create matrix >>>>>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>> >>>>>>>>> ! Create rhs term >>>>>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>>>>> >>>>>>>>> ! Solve system >>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>> call KSPSetUp(ksp,ierr) >>>>>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>>>>> >>>>>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>>>>>> >>>>>>>>> >>>>>>>>> Michele >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Barry, >>>>>>>>>>> >>>>>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>>>>>> >>>>>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>>>>> >>>>>>>>>>> The output is: >>>>>>>>>>> >>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [2]PETSC ERROR: Configure options >>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> Arguments are incompatible! >>>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> Zero diagonal on row 0! >>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> See docs/index.html for manual pages. 
>>>>>>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in 
src/ksp/pc/impls/sor/sor.c >>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>> -da_refine 4 >>>>>>>>>>> -ksp_view >>>>>>>>>>> -options_left >>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>> -pc_type mg >>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>> There is one 
unused database option. It is: >>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>>> >>>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>>>> ! Create Global Vectors >>>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>>> ! Create matrix >>>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>>> ! Create solver >>>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>>> >>>>>>>>>>> ! To allow using option from command line >>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Michele >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>>>>> >>>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Barry, >>>>>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>>>>>>>> Thank you, >>>>>>>>>>>>> Michele >>>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? 
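(In matrix terms, the Galerkin construction Barry mentions above forms each coarse-level operator as the triple product A_c = P^T A P from the fine operator A and the interpolation P between consecutive levels, which is what -pc_mg_galerkin automates. A hand-rolled sketch of the same product, assuming A and P already exist and Ac is a Mat variable:

      call MatPtAP(A, P, MAT_INITIAL_MATRIX, 2.0d0, Ac, ierr)

This is why no coarse-grid discretization routine needs to be supplied by the application.)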
See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>>>>> >>>>>>>>>>>>>> Barry >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >>>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>> type: mg >>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=262144, cols=262144 
>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was advised to do in a previous thread. >>>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>>>>> if there are any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot?
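(On the periodic question just above: with fully periodic boundaries the Poisson operator is singular with a constant null space, and the usual remedy, the "nullspace removal" discussed elsewhere in this thread, is to declare that null space to the solver. A minimal sketch, assuming ksp is the solver from the code posted later in the thread and nullsp is a MatNullSpace variable:

      call MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, PETSC_NULL_OBJECT, nullsp, ierr)
      call KSPSetNullSpace(ksp, nullsp, ierr)
      call MatNullSpaceDestroy(nullsp, ierr)

This keeps the Krylov iterates orthogonal to the constants so that CG can still converge on the singular system.)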
>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>> type: 
seqaij >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using 
I-node routines >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=262144, cols=262144 
>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>> <2decomp_fft-1.5.847-modified.tar.gz> >>>>> >> > From mrosso at uci.edu Tue Aug 13 18:09:02 2013 From: mrosso at uci.edu (Michele Rosso) Date: Tue, 13 Aug 2013 16:09:02 -0700 Subject: [petsc-users] GAMG speed In-Reply-To: <520A9815.7030400@mcs.anl.gov> References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> Message-ID: <520ABC8E.3040204@uci.edu> Hi Karli, thank you for your hint: now it works. Now I would like to speed up the solution: I was counting on increasing the number of levels/the number of processors used, but now I see I cannot do that. Do you have any hint to achieve better speed? Thanks! Best, Michele On 08/13/2013 01:33 PM, Karl Rupp wrote: > Hi Michele, > > I suggest you try a different decomposition of your grid. With k > levels, you should have at least 2^{k-1} grid nodes per coordinate > direction in order to be able to correctly build a coarser mesh. In > your case, you should have at least 8 nodes (leading to coarser levels > of size 4, 2, and 1) in z direction. > > Best regards, > Karli > > > On 08/13/2013 02:28 PM, Michele Rosso wrote: >> Hi Barry, >> >> I was finally able to try multigrid with a singular system and a >> finer grid. >> GAMG works perfectly and has no problem in handling the singular system. >> On the other hand, MG is giving me problem: >> >> [0]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: Argument out of range! >> [0]PETSC ERROR: Partition in x direction is too fine! 32 64! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./hit on a arch-cray-xt5-pkgs-opt named nid01332 by >> Unknown Tue Aug 13 15:06:21 2013 >> [0]PETSC ERROR: Libraries linked from >> /nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib >> [0]PETSC ERROR: Configure run at Wed Jul 31 22:48:06 2013 >> >> The input I used is: >> -ksp_monitor -ksp_converged_reason -pc_type mg -pc_mg_galerkin >> -pc_mg_levels 4 -options_left >> >> I am simulating a 256^3 grid with 256 processors. Since I am using a 2D >> domain decomposition, each sub-domain contains 256x64x4 grid points. 
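(One generic lever for the speed question above, sketched under the assumption that the matrix does not change between solves: ask KSP to keep the preconditioner it has already built,

      call KSPSetOperators(ksp, A, A, SAME_PRECONDITIONER, ierr)

so the multigrid hierarchy is set up once rather than at every KSPSolve(). Whether this is admissible depends on how often A really changes.)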
>> To be consistent with my code indexing, I had to initialize DMDA with >> reverse ordering, that is z,y,x, so when the error message says "x is >> too fine" it actually means "z is too fine". >> I was wondering what is the minimum number of nodes per direction that >> would avoid this problem and how the number of levels is related to >> minimum grid size required. >> Thank you! >> >> Michele >> >> >> On 08/02/2013 03:11 PM, Barry Smith wrote: >>> On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote: >>> >>>> Barry, >>>> >>>> thank you very much for your help. I was trying to debug the error >>>> with no success! >>>> Now it works like a charm for me too! >>>> I still have two questions for you: >>>> >>>> 1) How did you choose the number of levels to use: trial and error? >>> I just used 2 because it is more than one level :-). When you >>> use a finer mesh you can use more levels. >>> >>>> 2) For a singular system (periodic), besides the nullspace removal, >>>> should I change any parameter? >>> I don't know of anything. >>> >>> But there is a possible problem with -pc_mg_galerkin, PETSc does >>> not transfer the null space information from the fine mesh to the >>> other meshes and technically we really want the multigrid to remove >>> the null space on all the levels but usually it will work without >>> worrying about that. >>> >>> Barry >>> >>>> Again, thank you very much! >>>> >>>> Michele >>>> >>>> On 08/02/2013 02:38 PM, Barry Smith wrote: >>>>> Finally got it. My failing memory. I had to add the line >>>>> >>>>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >>>>> >>>>> immediately after KSPSetDM() and >>>>> >>>>> change >>>>> >>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>> >>>>> to >>>>> >>>>> call DMCreateMatrix(da,MATAIJ,A,ierr) >>>>> >>>>> so it will work in both parallel and sequential then >>>>> >>>>> -ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view >>>>> -pc_mg_galerkin -pc_mg_levels 2 >>>>> >>>>> works great with 2 levels. >>>>> >>>>> Barry >>>>> >>>>> >>>>> >>>>> >>>>> On Aug 1, 2013, at 6:29 PM, Michele Rosso >>>>> >>>>> wrote: >>>>> >>>>> >>>>>> Barry, >>>>>> >>>>>> no problem. I attached the full code in test_poisson_solver.tar.gz. >>>>>> My test code is a very reduced version of my production code >>>>>> (incompressible DNS code) thus fftw3 and the library 2decomp&fft >>>>>> are needed to run it. >>>>>> I attached the 2decomp&fft version I used: it is a matter of >>>>>> minutes to install it, so you should not have any problem. >>>>>> Please, contact me for any question/suggestion. >>>>>> In the meantime I will try to debug it. >>>>>> >>>>>> Michele >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> On 08/01/2013 04:19 PM, Barry Smith wrote: >>>>>> >>>>>>> Run on one process until this is debugged. You can try the >>>>>>> option >>>>>>> >>>>>>> -start_in_debugger noxterm >>>>>>> >>>>>>> and then call VecView(vec,0) in the debugger when it gives the >>>>>>> error below. It seems like some objects are not getting their >>>>>>> initial values set properly. Are you able to email the code so >>>>>>> we can run it and figure out what is going on? >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>>>>>> >>>>>>> >>>>>>> wrote: >>>>>>> >>>>>>> >>>>>>> >>>>>>>> Barry, >>>>>>>> >>>>>>>> I checked the matrix: the element (0,0) is not zero, nor any >>>>>>>> other diagonal element is. >>>>>>>> The matrix is symmetric positive definite (i.e. the standard >>>>>>>> Poisson matrix). >>>>>>>> Also, -da_refine is never used (see previous output).
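(A quick way to make the diagonal check discussed just above without viewing the whole matrix; a sketch, assuming d is a work vector obtained with DMCreateGlobalVector() and dmin is a PetscReal:

      call MatGetDiagonal(A, d, ierr)
      call VecMin(d, PETSC_NULL_INTEGER, dmin, ierr)

For this Poisson matrix every diagonal entry should be strictly positive, so a zero or negative dmin flags a bad row immediately.)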
>>>>>>>> I tried to run with -pc_type mg -pc_mg_galerkin >>>>>>>> -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev >>>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view >>>>>>>> -options_left >>>>>>>> >>>>>>>> and now the error is different: >>>>>>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error >>>>>>>> Message ------------------------------------ >>>>>>>> [1]PETSC ERROR: Floating point exception! >>>>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number >>>>>>>> or infinite at beginning of function: Parameter number 2! >>>>>>>> [1]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>> shooting. >>>>>>>> [2]PETSC ERROR: --------------------- Error Message >>>>>>>> ------------------------------------ >>>>>>>> [2]PETSC ERROR: Floating point exception! >>>>>>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number >>>>>>>> or infinite at beginning of function: Parameter number 2! >>>>>>>> [2]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>> shooting. >>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error >>>>>>>> Message ------------------------------------ >>>>>>>> [3]PETSC ERROR: Floating point exception! >>>>>>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number >>>>>>>> or infinite at beginning of function: Parameter number 2! >>>>>>>> [3]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>> shooting. >>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>> [3]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>> [1]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by >>>>>>>> mic Thu Aug 1 15:43:16 2013 >>>>>>>> [1]PETSC ERROR: Libraries linked from >>>>>>>> /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>> [2]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by >>>>>>>> mic Thu Aug 1 15:43:16 2013 >>>>>>>> [2]PETSC ERROR: Libraries linked from >>>>>>>> /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named >>>>>>>> enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>> [3]PETSC ERROR: Libraries linked from >>>>>>>> /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>> Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>> [1]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [1]PETSC ERROR: VecValidValues() line 28 in >>>>>>>> src/vec/vec/interface/rvector.c >>>>>>>> Configure options >>>>>>>> [2]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [2]PETSC ERROR: VecValidValues() line 28 in >>>>>>>> src/vec/vec/interface/rvector.c >>>>>>>> [3]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [3]PETSC ERROR: VecValidValues() line 28 in >>>>>>>> src/vec/vec/interface/rvector.c >>>>>>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in >>>>>>>> src/mat/interface/matrix.c >>>>>>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in >>>>>>>> src/mat/interface/matrix.c >>>>>>>> [2]PETSC ERROR: KSP_MatMult() line 204 in >>>>>>>> src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in >>>>>>>> src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in >>>>>>>> src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in >>>>>>>> src/ksp/ksp/impls/cheby/cheby.c >>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in >>>>>>>> src/ksp/ksp/impls/cheby/cheby.c >>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in >>>>>>>> src/ksp/ksp/impls/cheby/cheby.c >>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in >>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in >>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in >>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in >>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in >>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [2]PETSC ERROR: PCApply() line 442 in >>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [3]PETSC ERROR: PCApply() line 442 in >>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>> [1]PETSC ERROR: PCApply() line 442 in >>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>> 
src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>> src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>> src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in >>>>>>>> src/ksp/ksp/impls/cg/cg.c >>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in >>>>>>>> src/ksp/ksp/impls/cg/cg.c >>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in >>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in >>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in >>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>> --------------------- Error Message >>>>>>>> ------------------------------------ >>>>>>>> [0]PETSC ERROR: Floating point exception! >>>>>>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number >>>>>>>> or infinite at beginning of function: Parameter number 2! >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>> shooting. >>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by >>>>>>>> mic Thu Aug 1 15:43:16 2013 >>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>> /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>> [0]PETSC ERROR: >>>>>>>> ------------------------------------------------------------------------ >>>>>>>> >>>>>>>> [0]PETSC ERROR: VecValidValues() line 28 in >>>>>>>> src/vec/vec/interface/rvector.c >>>>>>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>> [0]PETSC ERROR: KSP_MatMult() line 204 in >>>>>>>> src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in >>>>>>>> src/ksp/ksp/impls/cheby/cheby.c >>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in >>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in >>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>> [0]PETSC ERROR: PCApply() line 442 in >>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>> src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in >>>>>>>> src/ksp/ksp/impls/cg/cg.c >>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in >>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>> >>>>>>>> #PETSc Option Table entries: >>>>>>>> -ksp_view >>>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>> -mg_levels_ksp_type chebyshev >>>>>>>> -mg_levels_pc_type jacobi >>>>>>>> -options_left >>>>>>>> -pc_mg_galerkin >>>>>>>> -pc_type mg >>>>>>>> #End of PETSc Option Table entries >>>>>>>> There are no unused options. 
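A cheap sanity check for the not-a-number failure above, as a hedged sketch: b and nrm are placeholder names, and the test assumes the right-hand side is the suspect (PetscIsInfOrNanReal is the same helper that VecValidValues, seen in the traceback, uses internally):

    PetscReal nrm;
    ierr = VecNorm(b,NORM_2,&nrm);CHKERRQ(ierr);
    if (PetscIsInfOrNanReal(nrm)) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_FP,"RHS contains NaN/Inf before KSPSolve");

This fails early, before the NaN propagates into the smoother's MatMult as in the traceback above.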
>>>>>>>> >>>>>>>> Michele >>>>>>>> >>>>>>>> >>>>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>>>>> >>>>>>>> >>>>>>>>> Do a MatView() on A before the solve (remove the -da_refine >>>>>>>>> 4) so it is small. Is the 0,0 entry 0? If the matrix has zero >>>>>>>>> on the diagonals you cannot us Gauss-Seidel as the smoother. >>>>>>>>> You can start with -mg_levels_pc_type jacobi >>>>>>>>> -mg_levels_ksp_type chebychev >>>>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>>> >>>>>>>>> Is the matrix a Stokes-like matrix? If so then different >>>>>>>>> preconditioners are in order. >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Barry, >>>>>>>>>> >>>>>>>>>> here it is the fraction of code where I set the rhs term and >>>>>>>>>> the matrix. >>>>>>>>>> >>>>>>>>>> ! Create matrix >>>>>>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>>> >>>>>>>>>> ! Create rhs term >>>>>>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>>>>>> >>>>>>>>>> ! Solve system >>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>> call KSPSetUp(ksp,ierr) >>>>>>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>>>>>> >>>>>>>>>> The subroutine form_matrix returns the Mat object A that is >>>>>>>>>> filled by using MatSetValuesStencil. >>>>>>>>>> qrho, lsf and head are additional arguments that are needed >>>>>>>>>> to compute the matrix value. >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Michele >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Where are you putting the values into the matrix? It >>>>>>>>>>> seems the matrix has no values in it? The code is stopping >>>>>>>>>>> because in the Gauss-Seidel smoothing it has detected zero >>>>>>>>>>> diagonals. >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Barry, >>>>>>>>>>>> >>>>>>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 >>>>>>>>>>>> -ksp_view -options_left >>>>>>>>>>>> >>>>>>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>>>>>> >>>>>>>>>>>> The output is: >>>>>>>>>>>> >>>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>> [2]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>> updates. >>>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>> shooting. >>>>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>> [2]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named >>>>>>>>>>>> enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from >>>>>>>>>>>> /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>> [2]PETSC ERROR: Configure options >>>>>>>>>>>> [2]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in >>>>>>>>>>>> src/mat/impls/aij/seq/aij.c >>>>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in >>>>>>>>>>>> src/mat/impls/aij/seq/aij.c >>>>>>>>>>>> --------------------- Error Message >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in >>>>>>>>>>>> src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in >>>>>>>>>>>> src/mat/interface/matrix.c >>>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in >>>>>>>>>>>> src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in >>>>>>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>>>>>> src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> Arguments are incompatible! >>>>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in >>>>>>>>>>>> src/ksp/ksp/interface/itres.c >>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in >>>>>>>>>>>> src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line >>>>>>>>>>>> 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in >>>>>>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> Zero diagonal on row 0! >>>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in >>>>>>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in >>>>>>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>>>>>> src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in >>>>>>>>>>>> src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>> updates. >>>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints >>>>>>>>>>>> about trouble shooting. >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! 
>>>>>>>>>>>> [3]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>> updates. >>>>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>> shooting. >>>>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>> [3]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> See docs/index.html for manual pages. >>>>>>>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named >>>>>>>>>>>> enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>> [3]PETSC ERROR: Libraries linked from >>>>>>>>>>>> /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 >>>>>>>>>>>> 12:01:44 2013 >>>>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>>>> [3]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [3]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in >>>>>>>>>>>> src/mat/impls/aij/seq/aij.c >>>>>>>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in >>>>>>>>>>>> src/mat/impls/aij/seq/aij.c >>>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 >>>>>>>>>>>> in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>> [1]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>> updates. >>>>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>> shooting. >>>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>> [1]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named >>>>>>>>>>>> enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>> [1]PETSC ERROR: Libraries linked from >>>>>>>>>>>> /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>>>> [1]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in >>>>>>>>>>>> src/mat/impls/aij/seq/aij.c >>>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in >>>>>>>>>>>> src/mat/interface/matrix.c >>>>>>>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in >>>>>>>>>>>> src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in >>>>>>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>>>>>> src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in >>>>>>>>>>>> src/ksp/ksp/interface/itres.c >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in >>>>>>>>>>>> src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in >>>>>>>>>>>> src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in >>>>>>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in >>>>>>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in >>>>>>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>>>>>> src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in >>>>>>>>>>>> src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in >>>>>>>>>>>> src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in >>>>>>>>>>>> src/mat/interface/matrix.c >>>>>>>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in >>>>>>>>>>>> src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in >>>>>>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>>>>>> src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in >>>>>>>>>>>> src/ksp/ksp/interface/itres.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in >>>>>>>>>>>> src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in >>>>>>>>>>>> src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in >>>>>>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in >>>>>>>>>>>> 
src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in >>>>>>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>>>>>> src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in >>>>>>>>>>>> src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named >>>>>>>>>>>> enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>> /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> >>>>>>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in >>>>>>>>>>>> src/mat/impls/aij/seq/aij.c >>>>>>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in >>>>>>>>>>>> src/mat/impls/aij/seq/aij.c >>>>>>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in >>>>>>>>>>>> src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in >>>>>>>>>>>> src/mat/interface/matrix.c >>>>>>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in >>>>>>>>>>>> src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in >>>>>>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>>>>>> src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in >>>>>>>>>>>> src/ksp/ksp/interface/itres.c >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in >>>>>>>>>>>> src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in >>>>>>>>>>>> src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in >>>>>>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in >>>>>>>>>>>> src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in >>>>>>>>>>>> src/ksp/pc/interface/precon.c >>>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in >>>>>>>>>>>> src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in >>>>>>>>>>>> src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in >>>>>>>>>>>> src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>> -ksp_view >>>>>>>>>>>> -options_left >>>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>>> -pc_type mg >>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>> There is one unused database option. 
It is: >>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>>>> >>>>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD >>>>>>>>>>>> , & >>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , >>>>>>>>>>>> DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , >>>>>>>>>>>> DMDA_STENCIL_STAR, & >>>>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , >>>>>>>>>>>> 1_ip, 1_ip , 1_ip , & >>>>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , >>>>>>>>>>>> ierr) >>>>>>>>>>>> ! Create Global Vectors >>>>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>>>> ! Create matrix >>>>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>>>> ! Create solver >>>>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>>>> call >>>>>>>>>>>> KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>>>> call >>>>>>>>>>>> KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real >>>>>>>>>>>> residual >>>>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>>>> call KSPSetTolerances(ksp, tol >>>>>>>>>>>> ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>>>> & >>>>>>>>>>>> PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>>>> >>>>>>>>>>>> ! To allow using option from command line >>>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Michele >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG >>>>>>>>>>>>> will compute the coarser matrices with a sparse matrix >>>>>>>>>>>>> matrix matrix product so you should not need to change >>>>>>>>>>>>> your code to try it out. You still need to use the >>>>>>>>>>>>> KSPSetDM() and -da_refine n to get it working >>>>>>>>>>>>> >>>>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>> you are correct, I did not use it. I think I get now >>>>>>>>>>>>>> where is the problem. Correct me if I am wrong, but for the >>>>>>>>>>>>>> geometric multigrid to work, ksp must be provided with >>>>>>>>>>>>>> subroutines to compute the matrix and the rhs at any >>>>>>>>>>>>>> level through >>>>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>>>>>>> I do not do that, I simply build a rhs vector and a >>>>>>>>>>>>>> matrix and then I solve the system. >>>>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my >>>>>>>>>>>>>> code accordingly and get back to you. >>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>> Michele >>>>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? 
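For reference, the callback-based setup described above would look roughly like this in C; ComputeMatrix, ComputeRHS, and ctx are placeholder names for user-supplied routines and context, not code from this thread:

    ierr = KSPSetDM(ksp,da);CHKERRQ(ierr);
    ierr = KSPSetComputeOperators(ksp,ComputeMatrix,&ctx);CHKERRQ(ierr); /* invoked on every level */
    ierr = KSPSetComputeRHS(ksp,ComputeRHS,&ctx);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,NULL,x);CHKERRQ(ierr); /* NULL rhs: taken from the callback */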
See >>>>>>>>>>>>>>> src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid >>>>>>>>>>>>>>>> and DMDA and so far it has not given me any problem. >>>>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous >>>>>>>>>>>>>>>> thread, I was told an odd number of grid points was >>>>>>>>>>>>>>>> needed for the geometric multigrid, is that correct? >>>>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 >>>>>>>>>>>>>>>> -ksp_view -options_left >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, >>>>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mg >>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>> Coarse grid solver -- level >>>>>>>>>>>>>>>> ------------------------------- >>>>>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = >>>>>>>>>>>>>>>> 0.134543, max = 1.47998 >>>>>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI >>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>>>> GMRES: restart=30, using Classical >>>>>>>>>>>>>>>> (unmodified) Gram-Schmidt Orthogonalization with no >>>>>>>>>>>>>>>> iterative refinement >>>>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = >>>>>>>>>>>>>>>> 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated >>>>>>>>>>>>>>>> nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>> using NONE norm type for convergence test 
>>>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, >>>>>>>>>>>>>>>> local iterations = 1, omega = 1 >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated >>>>>>>>>>>>>>>> nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>>>>>> calls =0 >>>>>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> What kind of mesh are you using? Are you using >>>>>>>>>>>>>>>>> DMDA? If you are using DMDA (and have written your >>>>>>>>>>>>>>>>> code to use it "correctly") then it should be trivial >>>>>>>>>>>>>>>>> to run with geometric multigrid and geometric >>>>>>>>>>>>>>>>> multigrid should be a bit faster. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c >>>>>>>>>>>>>>>>> I run with ./ex19 -pc_type mg -da_refine 4 and it >>>>>>>>>>>>>>>>> refines the original DMDA 4 times and uses geometric >>>>>>>>>>>>>>>>> multigrid with 5 levels. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D >>>>>>>>>>>>>>>>>> Poisson's equation with CG + GAMG as I was suggested >>>>>>>>>>>>>>>>>> to do in a previous thread. >>>>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I >>>>>>>>>>>>>>>>>> would like to know if you have any suggestions to >>>>>>>>>>>>>>>>>> further speed it up, particularly >>>>>>>>>>>>>>>>>> if there is any parameters worth looking into to >>>>>>>>>>>>>>>>>> achieve an even faster solution, for example number >>>>>>>>>>>>>>>>>> of levels and so on. >>>>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, >>>>>>>>>>>>>>>>>> but I will soon have periodic conditions: in this >>>>>>>>>>>>>>>>>> case, does GAMG require particular settings? 
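As the rest of the thread settles on, the all-periodic (singular) case is usually handled by attaching the constant null space to the solver; a minimal C sketch with the PETSc 3.4 interface:

    MatNullSpace nullsp;
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&nullsp);CHKERRQ(ierr); /* constants */
    ierr = KSPSetNullSpace(ksp,nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);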
>>>>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you >>>>>>>>>>>>>>>>>> think it is worth a shot? >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view >>>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, >>>>>>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>>> Coarse grid solver -- level >>>>>>>>>>>>>>>>>> ------------------------------- >>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>>>> Local solve info for each block is in the >>>>>>>>>>>>>>>>>> following KSP and PC objects: >>>>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local >>>>>>>>>>>>>>>>>> block number = 0 >>>>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, >>>>>>>>>>>>>>>>>> absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left >>>>>>>>>>>>>>>>>> preconditioning >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI >>>>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, >>>>>>>>>>>>>>>>>> absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI >>>>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>> using diagonal shift on blocks to >>>>>>>>>>>>>>>>>> prevent zero pivot >>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>> using diagonal shift on blocks to >>>>>>>>>>>>>>>>>> prevent zero pivot >>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>>>> Factored matrix follows: 
>>>>>>>>>>>>>>>>>> Matrix Object: Matrix >>>>>>>>>>>>>>>>>> Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>> package used to perform >>>>>>>>>>>>>>>>>> factorization: petsc >>>>>>>>>>>>>>>>>> total: nonzeros=132379, allocated >>>>>>>>>>>>>>>>>> nonzeros=132379 >>>>>>>>>>>>>>>>>> total number of mallocs used >>>>>>>>>>>>>>>>>> during MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>> package used to perform >>>>>>>>>>>>>>>>>> factorization: petsc >>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated >>>>>>>>>>>>>>>>>> nonzeros=1 >>>>>>>>>>>>>>>>>> total number of mallocs used >>>>>>>>>>>>>>>>>> during MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI >>>>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated >>>>>>>>>>>>>>>>>> nonzeros=32037 >>>>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI >>>>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, >>>>>>>>>>>>>>>>>> absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI >>>>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>> using diagonal shift on blocks to >>>>>>>>>>>>>>>>>> prevent zero pivot >>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI >>>>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>> package used to perform >>>>>>>>>>>>>>>>>> factorization: petsc >>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated >>>>>>>>>>>>>>>>>> nonzeros=1 >>>>>>>>>>>>>>>>>> total number of mallocs used >>>>>>>>>>>>>>>>>> during MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>> tolerances: 
relative=1e-05, >>>>>>>>>>>>>>>>>> absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI >>>>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>> using diagonal shift on blocks to >>>>>>>>>>>>>>>>>> prevent zero pivot >>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI >>>>>>>>>>>>>>>>>> processes >>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>> package used to perform >>>>>>>>>>>>>>>>>> factorization: petsc >>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated >>>>>>>>>>>>>>>>>> nonzeros=1 >>>>>>>>>>>>>>>>>> total number of mallocs used >>>>>>>>>>>>>>>>>> during MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local >>>>>>>>>>>>>>>>>> block number = 1 >>>>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local >>>>>>>>>>>>>>>>>> block number = 2 >>>>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local >>>>>>>>>>>>>>>>>> block number = 3 >>>>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 >>>>>>>>>>>>>>>>>> ------------------------------- >>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = >>>>>>>>>>>>>>>>>> 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>>>> total: nonzeros=818732, allocated >>>>>>>>>>>>>>>>>> nonzeros=818732 >>>>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>> Up 
solver (post-smoother) same as down solver >>>>>>>>>>>>>>>>>> (pre-smoother) >>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 >>>>>>>>>>>>>>>>>> ------------------------------- >>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = >>>>>>>>>>>>>>>>>> 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated >>>>>>>>>>>>>>>>>> nonzeros=1835008 >>>>>>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver >>>>>>>>>>>>>>>>>> (pre-smoother) >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>>>>>>>> calls =0 >>>>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>> <2decomp_fft-1.5.847-modified.tar.gz> >>>>>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Aug 13 18:26:26 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 13 Aug 2013 18:26:26 -0500 Subject: [petsc-users] GAMG speed In-Reply-To: <520ABC8E.3040204@uci.edu> References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> Message-ID: On Tue, Aug 13, 2013 at 6:09 PM, Michele Rosso wrote: > Hi Karli, > > thank you for your hint: now it works. > Now I would like to speed up the solution: I was counting on increasing > the number of levels/the number of processors used, but now I see I cannot > do that. > Do you have any hint to achieve better speed? > Thanks! > "Better speed" is not very helpful for us, and thus we cannot offer much help. You could 1) Send the output of -log_summary -ksp_monitor -ksp_view 2) Describe the operator succintly Matt > Best, > Michele > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Shuangshuang.Jin at pnnl.gov Tue Aug 13 18:30:27 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Tue, 13 Aug 2013 16:30:27 -0700 Subject: [petsc-users] Performance of PETSc TS solver Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> Hi, Shri, >From the log_summary, we can see that the TSJacobianEval/SNESJacobianEval dominates the computation time as you mentioned. Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %f %M %L %R %T %f %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ TSJacobianEval 1782 1.0 3.1640e+02 1.0 0.00e+00 0.0 2.0e+03 7.4e+01 1.4e+04 90 0 0 0 21 90 0 0 0 21 0 SNESJacobianEval 1782 1.0 3.1641e+02 1.0 0.00e+00 0.0 2.0e+03 7.4e+01 1.4e+04 90 0 0 0 21 90 0 0 0 21 0 It takes 316 seconds for the total Jacobian Eval out of the 350 seconds of SNESSolve,which is the total simulation time. So I look into my IJacobian Function. However, all it does is forming a 1152*1152 dimension Jacobian matrix. I don't understand why it takes such a long time. And if it's the reason of poor performance, is there any way to polish it to accelerate the speed in this part? I also noticed that the KSPSolve part in the DAE process only takes 8.56 seconds: KSPSolve 1782 1.0 8.5647e+00 1.0 2.28e+09 1.0 6.5e+06 5.8e+02 1.5e+04 2 79 78 78 22 2 79 78 78 22 8505 Comparing to the 316 seconds for JacobianEval, it's pretty fast. Does that also imply the default KSP solver is good enough to solve the problem? And I should focus on reduce the IJacobian Eval time without worrying about different KSP solver here? Thanks, Shuangshuang -----Original Message----- From: Shri [mailto:abhyshr at mcs.anl.gov] Sent: Tuesday, August 13, 2013 11:01 AM To: Jin, Shuangshuang Cc: Barry Smith; Jed Brown; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Performance of PETSc TS solver 90% of the time is spent in your Jacobian evaluation routine which is clearly a bottleneck. On Aug 13, 2013, at 12:34 PM, Jin, Shuangshuang wrote: > Hello, Jed and Barry, thanks for your reply. > > We are solving a power system dynamic simulation problem. We set up the DAE equations and its Jacobian matrix, and would like to use the Trapezoid method to solve it. > > That's also the reason why we chose TSTHETA. From the PETSc manual, we read that: > > "-ts_type theta -ts_theta_theta 0.5 -ts_theta_endpoint corresponds to Crank-Nicholson (TSCN). This method can be applied to DAE. > For the default Theta=0.5, this is the trapezoid rule (also known as Crank-Nicolson, see TSCN)." > > I haven't heard of ARKIMEX or ROSW before. Are they some external packages or DAE solvers that implement the Trapezoid method? > > I have also tried the -ksp_type preonly -pc_type lu option you indicated but failed. The PETSC ERROR messages are: No support for this operation for this object type! Matrix format mpiaij does not have a built-in PETSc LU! PETSc does not have a native parallel direct solver. You can use MUMPS (--download-mumps) or superlu_dist (--download_superlu_dist) > > Attached please see the log_summary for running the TSTHETA with -ts_theta_theta 0.5 and its default ksp solver. Please help me to evaluate the performance and see what's the bottleneck of the slow computation speed. > > Thanks a lot! 
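For completeness, the run-time request for the parallel direct solve suggested above would look like the following, assuming PETSc was configured with --download-mumps (PETSc 3.4 option names):

    -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps

superlu_dist can be substituted for mumps in the same option if PETSc was built with --download-superlu_dist instead.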
> > Shuangshuang > > > > > > > -----Original Message----- > From: Barry Smith [mailto:bsmith at mcs.anl.gov] > Sent: Monday, August 12, 2013 6:39 PM > To: Jed Brown > Cc: Jin, Shuangshuang; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Performance of PETSc TS solver > > > Also always send the output from running with -log_summary whenever you ask performance questions so we know what kind of performance it is getting. > > Barry > > On Aug 12, 2013, at 8:14 PM, Jed Brown wrote: > >> "Jin, Shuangshuang" writes: >> >>> Hello, PETSc developers, >>> I have a question regarding the performance of PETSc TS solver >>> expecially the TSTHETA. I used it to solve my DAE equations. >> >> TSTHETA is not L-stable and not stiffly accurate, so it's not >> normally something that you'd want to use for a DAE. Make sure >> you're getting meaningful results and try switching to something like >> an ARKIMEX or ROSW since those are likely better for your problem. >> >>> I have recorded the solution times when different numbers of processors are used: >>> >>> 2 processors: 1021 seconds, >>> 4 processors: 587.244 seconds, >>> 8 processors: 421.565 seconds, >>> 16 processors: 355.594 seconds, >>> 32 processors: 322.28 seconds, >>> 64 processors: 382.967 seconds. >>> >>> It seems like with 32 processors, it reaches the best performance. >>> However, 322.28 seconds to solve such DAE equations is too slow than >>> I expected. >> >> The number of equations (1152) is quite small, so I'm not surprised >> there is no further speedup. Can you explain more about your equations? >> >>> >>> I have the following questions based on the above results: >>> 1. Is this the usual DAE solving time in PETSc to for the problem with this dimension? >> >> That depends what your function is. >> >>> 2. I was told that in TS, by default, ksp uses GMRES, and the >>> preconditioner is ILU(0), is there any other alterative ksp solver >>> or options I should use in the command line to solve the problem >>> much faster? >> >> I would use -ksp_type preonly -pc_type lu for such small problems. >> Is the system dense? >> >>> 3. Do you have any other suggestion for me to speed up the DAE computation in PETSc? >> >> Can you describe what sort of problem you're dealing with, what >> causes the stiffness in your equations, what accuracy you want, etc. > > From mrosso at uci.edu Tue Aug 13 19:05:50 2013 From: mrosso at uci.edu (Michele Rosso) Date: Tue, 13 Aug 2013 17:05:50 -0700 Subject: [petsc-users] GAMG speed In-Reply-To: References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> Message-ID: <520AC9DE.1050508@uci.edu> Hi Matt, I attached the output of the commands you suggested. The options I used are: -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -options_left and here are the lines of codes where I setup the solution process: call DMDACreate3d( PETSC_COMM_WORLD , & & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & & NNZ ,NNY , NNX, da , ierr) ! 
Create Global Vectors call DMCreateGlobalVector(da,b,ierr) call VecDuplicate(b,x,ierr) ! Set initial guess for first use of the module to 0 call VecSet(x,0.0_rp,ierr) ! Create matrix call DMCreateMatrix(da,MATAIJ,A,ierr) ! Create solver call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) call KSPSetDM(ksp,da,ierr) call KSPSetDMActive(ksp,PETSC_FALSE,ierr) call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) call KSPSetType(ksp,KSPCG,ierr) call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) ! Nullspace removal call MatNullSpaceCreate( PETSC_COMM_WORLD,PETSC_TRUE,PETSC_NULL_INTEGER,& & PETSC_NULL_INTEGER,nullspace,ierr) call KSPSetNullspace(ksp,nullspace,ierr) call MatNullSpaceDestroy(nullspace,ierr) ! To allow using option from command line call KSPSetFromOptions(ksp,ierr) Hope I did not omit anything useful. Thank you for your time. Best, Michele On 08/13/2013 04:26 PM, Matthew Knepley wrote: > On Tue, Aug 13, 2013 at 6:09 PM, Michele Rosso > wrote: > > Hi Karli, > > thank you for your hint: now it works. > Now I would like to speed up the solution: I was counting on > increasing the number of levels/the number of processors used, but > now I see I cannot do that. > Do you have any hint to achieve better speed? > Thanks! > > > "Better speed" is not very helpful for us, and thus we cannot offer > much help. You could > > 1) Send the output of -log_summary -ksp_monitor -ksp_view > > 2) Describe the operator succintly > > Matt > > Best, > Michele >>>>>>>>>>>>>>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- 0 KSP Residual norm 3.653385002401e-05 1 KSP Residual norm 9.460380827787e-07 2 KSP Residual norm 2.745875833479e-08 3 KSP Residual norm 4.613281252783e-10 Linear solve converged due to CONVERGED_RTOL iterations 3 KSP Object: 8 MPI processes type: cg maximum iterations=10000 tolerances: relative=0.0001, absolute=1e-50, divergence=10000 left preconditioning has attached null space using nonzero initial guess using UNPRECONDITIONED norm type for convergence test PC Object: 8 MPI processes type: mg MG: type is MULTIPLICATIVE, levels=5 cycles=v Cycles per PCApply=1 Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 8 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 8 MPI processes type: redundant Redundant preconditioner: First (color=0) of 8 PCs follows KSP Object: (mg_coarse_redundant_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_redundant_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 8.69546 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=512, cols=512 package used to perform factorization: petsc total: nonzeros=120206, allocated nonzeros=120206 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system 
matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=512, cols=512 total: nonzeros=13824, allocated nonzeros=13824 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=512, cols=512 total: nonzeros=13824, allocated nonzeros=13824 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 32 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 8 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.140194, max = 1.54213 Chebyshev: estimated using: [0 0.1; 0 1.1] KSP Object: (mg_levels_1_est_) 8 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_1_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=4096, cols=4096 total: nonzeros=110592, allocated nonzeros=110592 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_1_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=4096, cols=4096 total: nonzeros=110592, allocated nonzeros=110592 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 8 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.139949, max = 1.53944 Chebyshev: estimated using: [0 0.1; 0 1.1] KSP Object: (mg_levels_2_est_) 8 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_2_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=32768, cols=32768 total: nonzeros=884736, allocated nonzeros=884736 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_2_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=32768, cols=32768 total: nonzeros=884736, allocated 
nonzeros=884736 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 8 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.135788, max = 1.49366 Chebyshev: estimated using: [0 0.1; 0 1.1] KSP Object: (mg_levels_3_est_) 8 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_3_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=262144, cols=262144 total: nonzeros=7077888, allocated nonzeros=7077888 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_3_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=262144, cols=262144 total: nonzeros=7077888, allocated nonzeros=7077888 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 8 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.138904, max = 1.52794 Chebyshev: estimated using: [0 0.1; 0 1.1] KSP Object: (mg_levels_4_est_) 8 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_4_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=2097152, cols=2097152 total: nonzeros=14680064, allocated nonzeros=14680064 total number of mallocs used during MatSetValues calls =0 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_4_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=2097152, cols=2097152 total: nonzeros=14680064, allocated nonzeros=14680064 total number of mallocs used during MatSetValues calls =0 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=2097152, cols=2097152 total: nonzeros=14680064, allocated nonzeros=14680064 total number of mallocs used during 
MatSetValues calls =0 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./hit on a arch-cray-xt5-pkgs-opt named nid14554 with 8 processors, by Unknown Tue Aug 13 19:53:41 2013 Using Petsc Release Version 3.4.2, Jul, 02, 2013 Max Max/Min Avg Total Time (sec): 6.402e+00 1.00011 6.402e+00 Objects: 2.970e+02 1.00000 2.970e+02 Flops: 6.953e+08 1.00000 6.953e+08 5.562e+09 Flops/sec: 1.086e+08 1.00011 1.086e+08 8.688e+08 MPI Messages: 1.170e+03 1.00000 1.170e+03 9.360e+03 MPI Message Lengths: 1.565e+07 1.00000 1.338e+04 1.252e+08 MPI Reductions: 6.260e+02 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 6.4021e+00 100.0% 5.5620e+09 100.0% 9.360e+03 100.0% 1.338e+04 100.0% 6.250e+02 99.8% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %f - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %f %M %L %R %T %f %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecMDot 40 1.0 9.1712e-02 1.0 3.29e+07 1.0 0.0e+00 0.0e+00 4.0e+01 1 5 0 0 6 1 5 0 0 6 2874 VecTDot 10 1.0 2.3873e-02 1.1 5.24e+06 1.0 0.0e+00 0.0e+00 1.0e+01 0 1 0 0 2 0 1 0 0 2 1757 VecNorm 52 1.0 2.3764e-02 1.3 1.08e+07 1.0 0.0e+00 0.0e+00 5.2e+01 0 2 0 0 8 0 2 0 0 8 3630 VecScale 124 1.0 2.6341e-02 1.4 9.29e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2820 VecCopy 27 1.0 1.7691e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 109 1.0 1.7006e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 178 1.0 1.1252e-01 1.2 3.04e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 4 0 0 0 2 4 0 0 0 2162 VecAYPX 164 1.0 1.0078e-01 1.1 1.68e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1334 VecMAXPY 44 1.0 1.1766e-01 1.0 3.89e+07 1.0 0.0e+00 0.0e+00 0.0e+00 2 6 0 0 0 2 6 0 0 0 2647 VecScatterBegin 228 1.0 5.5004e-02 1.1 0.00e+00 0.0 7.4e+03 1.4e+04 0.0e+00 1 0 79 82 0 1 0 79 82 0 0 VecScatterEnd 228 1.0 4.0928e-02 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecNormalize 44 1.0 1.8650e-02 1.3 9.88e+06 1.0 0.0e+00 0.0e+00 4.4e+01 0 1 0 0 7 0 1 0 0 7 4240 MatMult 170 1.0 9.7667e-01 1.0 2.41e+08 1.0 6.0e+03 1.6e+04 0.0e+00 15 35 65 77 0 15 35 65 77 0 1977 MatMultAdd 20 1.0 5.1495e-02 1.1 1.01e+07 1.0 4.8e+02 2.8e+03 0.0e+00 1 1 5 1 0 1 1 5 1 0 1570 MatMultTranspose 24 1.0 6.8663e-02 1.2 1.21e+07 1.0 5.8e+02 2.8e+03 0.0e+00 1 2 6 1 0 1 2 6 1 0 1413 MatSolve 5 1.0 3.2754e-03 1.0 1.20e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 2930 MatSOR 164 1.0 1.7211e+00 1.0 2.26e+08 1.0 0.0e+00 0.0e+00 0.0e+00 27 32 0 0 0 27 32 0 0 0 1050 MatLUFactorSym 1 1.0 3.0711e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 1 1.0 2.4564e-02 1.0 1.95e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 3 0 0 0 0 3 0 0 0 6355 MatAssemblyBegin 20 1.0 8.0438e-03 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 2.2e+01 0 0 0 0 4 0 0 0 0 4 0 MatAssemblyEnd 20 1.0 1.3442e-01 1.0 0.00e+00 0.0 5.6e+02 2.1e+03 7.2e+01 2 0 6 1 12 2 0 6 1 12 0 MatGetRowIJ 1 1.0 1.1206e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 4.0507e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 24 1.2 1.7951e-03 2.4 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+01 0 0 0 0 3 0 0 0 0 3 0 MatPtAP 4 1.0 6.4214e-01 1.0 4.06e+07 1.0 1.1e+03 1.7e+04 1.0e+02 10 6 12 16 16 10 6 12 16 16 506 MatPtAPSymbolic 4 1.0 3.7196e-01 1.0 0.00e+00 0.0 7.2e+02 2.0e+04 6.0e+01 6 0 8 12 10 6 0 8 12 10 0 MatPtAPNumeric 4 1.0 2.7023e-01 1.0 4.06e+07 1.0 4.2e+02 1.2e+04 4.0e+01 4 6 4 4 6 4 6 4 4 6 1201 MatGetRedundant 1 1.0 8.0895e-04 1.1 0.00e+00 0.0 1.7e+02 7.1e+03 4.0e+00 0 0 2 1 1 0 0 2 1 1 0 MatGetLocalMat 4 1.0 4.0415e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 1 0 0 0 1 1 0 0 0 1 0 MatGetBrAoCol 4 1.0 1.7636e-02 1.0 0.00e+00 0.0 4.3e+02 2.7e+04 8.0e+00 0 0 5 9 1 0 0 5 9 1 0 
MatGetSymTrans 8 1.0 1.3187e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPGMRESOrthog 40 1.0 1.8928e-01 1.0 6.59e+07 1.0 0.0e+00 0.0e+00 4.0e+01 3 9 0 0 6 3 9 0 0 6 2785 KSPSetUp 11 1.0 3.2629e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 7.2e+01 0 0 0 0 12 0 0 0 0 12 0 KSPSolve 2 1.0 3.3489e+00 1.0 6.33e+08 1.0 7.3e+03 1.4e+04 2.3e+02 52 91 78 79 36 52 91 78 79 36 1512 PCSetUp 1 1.0 8.6804e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 14 9 21 17 52 14 9 21 17 52 572 PCApply 5 1.0 3.1772e+00 1.0 5.96e+08 1.0 7.1e+03 1.3e+04 2.0e+02 49 86 76 72 33 49 86 76 72 33 1501 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Container 1 1 564 0 Vector 139 139 71560728 0 Vector Scatter 21 21 22092 0 Matrix 37 37 75834272 0 Matrix Null Space 1 1 596 0 Distributed Mesh 5 5 2740736 0 Bipartite Graph 10 10 7920 0 Index Set 50 50 1546832 0 IS L to G Mapping 5 5 1361108 0 Krylov Solver 11 11 129320 0 DMKSP interface 3 3 1944 0 Preconditioner 11 11 9840 0 Viewer 3 2 1456 0 ======================================================================================================================== Average time to get PetscTime(): 9.53674e-08 Average time for MPI_Barrier(): 2.43187e-06 Average time for zero size MPI_Send(): 2.5034e-06 #PETSc Option Table entries: -ksp_converged_reason -ksp_monitor -ksp_view -log_summary -options_left -pc_mg_galerkin -pc_mg_levels 5 -pc_type mg #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure run at: Wed Jul 31 22:48:06 2013 Configure options: --known-level1-dcache-size=65536 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=2 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=0 --known-mpi-c-double-complex=0 --with-cc=cc --with-cxx=CC --with-fc=ftn --with-clib-autodetect=0 --with-cxxlib-autodetect=0 --with-fortranlib-autodetect=0 --with-debugging=0 --COPTFLAGS="-fastsse -Mipa=fast -mp" --CXXOPTFLAGS="-fastsse -Mipa=fast -mp" --FOPTFLAGS="-fastsse -Mipa=fast -mp" --with-blas-lapack-lib="-L/opt/acml/4.4.0/pgi64/lib -lacml -lacml_mv" --with-shared-libraries=0 --with-x=0 --with-batch --known-mpi-shared-libraries=0 PETSC_ARCH=arch-cray-xt5-pkgs-opt ----------------------------------------- Libraries compiled on Wed Jul 31 22:48:06 2013 on krakenpf1 Machine characteristics: Linux-2.6.27.48-0.12.1_1.0301.5943-cray_ss_s-x86_64-with-SuSE-11-x86_64 Using PETSc directory: /nics/c/home/mrosso/LIBS/petsc-3.4.2 Using PETSc arch: arch-cray-xt5-pkgs-opt ----------------------------------------- Using C compiler: cc -fastsse -Mipa=fast -mp ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: ftn -fastsse -Mipa=fast -mp ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/include -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/include -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/include 
-I/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/include -I/opt/cray/portals/2.2.0-1.0301.26633.6.9.ss/include -I/opt/cray/pmi/2.1.4-1.0000.8596.15.1.ss/include -I/opt/cray/mpt/5.3.5/xt/seastar/mpich2-pgi/109/include -I/opt/acml/4.4.0/pgi64/include -I/opt/xt-libsci/11.0.04/pgi/109/istanbul/include -I/opt/fftw/3.3.0.0/x86_64/include -I/usr/include/alps ----------------------------------------- Using C linker: cc Using Fortran linker: ftn Using libraries: -Wl,-rpath,/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib -L/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib -lpetsc -L/opt/acml/4.4.0/pgi64/lib -lacml -lacml_mv -lpthread -ldl ----------------------------------------- #PETSc Option Table entries: -ksp_converged_reason -ksp_monitor -ksp_view -log_summary -options_left -pc_mg_galerkin -pc_mg_levels 5 -pc_type mg #End of PETSc Option Table entries There are no unused options.

From knepley at gmail.com  Tue Aug 13 19:51:23 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 13 Aug 2013 19:51:23 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <520AC9DE.1050508@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu>
Message-ID: 

On Tue, Aug 13, 2013 at 7:05 PM, Michele Rosso wrote:

> Hi Matt,
>
> I attached the output of the commands you suggested.
> The options I used are:
>
> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg
> -pc_mg_galerkin -pc_mg_levels 5 -options_left

The convergence is great. I notice that your coarse solve takes no time. You could probably use fewer levels for this problem. For this problem there are no easy things left, I think. We are currently debating how you can squeeze something extra out of the smoother. Here you could probably get rid of Chebyshev and use only SOR.

   Matt

> and here are the lines of code where I set up the solution process:
>
> call DMDACreate3d( PETSC_COMM_WORLD , &
> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, &
> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, &
> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , &
> & NNZ ,NNY , NNX, da , ierr)
>
> ! To allow using options from the command line
> call KSPSetFromOptions(ksp,ierr)
>
> Hope I did not omit anything useful.
> Thank you for your time.
>
> Best,
> Michele
>
> On 08/13/2013 04:26 PM, Matthew Knepley wrote:
>> On Tue, Aug 13, 2013 at 6:09 PM, Michele Rosso wrote:
>>> Hi Karli,
>>>
>>> thank you for your hint: now it works.
>>> Now I would like to speed up the solution: I was counting on increasing
>>> the number of levels/the number of processors used, but now I see I cannot
>>> do that.
>>> Do you have any hint to achieve better speed?
>>> Thanks!
>>
>> "Better speed" is not very helpful for us, and thus we cannot offer much
>> help. You could
>>
>> 1) Send the output of -log_summary -ksp_monitor -ksp_view
>>
>> 2) Describe the operator succinctly
>>
>>    Matt
>>
>>> Best,
>>> Michele

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
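(In option form, "use fewer levels" is just a smaller -pc_mg_levels value. A sketch of the rerun being suggested here — identical to Michele's option set above except for the level count:

    -ksp_monitor -ksp_view -ksp_converged_reason -log_summary
    -pc_type mg -pc_mg_galerkin -pc_mg_levels 3 -options_left

Everything else, including the Galerkin-computed coarse operators, stays the same; the number of levels is the only change.)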
From mrosso at uci.edu  Tue Aug 13 20:25:56 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Tue, 13 Aug 2013 18:25:56 -0700
Subject: [petsc-users] GAMG speed
In-Reply-To: 
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu>
Message-ID: <520ADCA4.3030902@uci.edu>

Matt,

thank you! I will try to reduce the number of levels and see how it goes.
I asked about the speed since CG + Block Jacobi with ICC in each block runs faster than CG + MG, so I thought I was missing something.
Could you please tell me how to get rid of Chebyshev?

Best,
Michele

On 08/13/2013 05:51 PM, Matthew Knepley wrote:
> On Tue, Aug 13, 2013 at 7:05 PM, Michele Rosso > wrote: > > Hi Matt, > > I attached the output of the commands you suggested. > The options I used are: > > -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type > mg -pc_mg_galerkin -pc_mg_levels 5 -options_left > > > The convergence is great. I notice that your coarse solve takes no > time. You could probably use fewer levels for > this problem. For this problem there are no easy things left, I think. > We are currently debating how you can squeeze > something extra out of the smoother. Here you could probably get rid > of Chebyshev and use only SOR. > > Matt > > and here are the lines of code where I set up the solution process: > > call DMDACreate3d( PETSC_COMM_WORLD , & > & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & > & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & > & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & > & NNZ ,NNY , NNX, da , ierr) > > ! Create Global Vectors > call DMCreateGlobalVector(da,b,ierr) > call VecDuplicate(b,x,ierr) > > ! Set initial guess for first use of the module to 0 > call VecSet(x,0.0_rp,ierr) > > ! Create matrix > call DMCreateMatrix(da,MATAIJ,A,ierr) > > ! Create solver > call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) > call KSPSetDM(ksp,da,ierr) > call KSPSetDMActive(ksp,PETSC_FALSE,ierr) > call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) > call KSPSetType(ksp,KSPCG,ierr) > call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) > call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) > call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& > & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) > > ! Nullspace removal > call MatNullSpaceCreate( PETSC_COMM_WORLD,PETSC_TRUE,PETSC_NULL_INTEGER,& > & PETSC_NULL_INTEGER,nullspace,ierr) > call KSPSetNullspace(ksp,nullspace,ierr) > call MatNullSpaceDestroy(nullspace,ierr) > > ! To allow using options from the command line > call KSPSetFromOptions(ksp,ierr) > > Hope I did not omit anything useful. > Thank you for your time. > > Best, > Michele > > On 08/13/2013 04:26 PM, Matthew Knepley wrote: >> On Tue, Aug 13, 2013 at 6:09 PM, Michele Rosso >> wrote: >> >> Hi Karli, >> >> thank you for your hint: now it works. >> Now I would like to speed up the solution: I was counting on increasing >> the number of levels/the number of processors used, but now I see I cannot >> do that. >> Do you have any hint to achieve better speed? >> Thanks! >> >> >> "Better speed" is not very helpful for us, and thus we cannot offer much >> help. You could >> >> 1) Send the output of -log_summary -ksp_monitor -ksp_view >> >> 2) Describe the operator succinctly >> >> Matt >> >> Best, >> Michele
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
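(The CG + block-Jacobi/ICC combination Michele compares against is not spelled out anywhere in the thread; in standard PETSc option names — an assumption, since his exact flags are not shown — it would read:

    -ksp_type cg -pc_type bjacobi -sub_pc_type icc

i.e., one ICC(0) factorization per process block, applied as the preconditioner inside CG.)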
From knepley at gmail.com  Tue Aug 13 20:34:55 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 13 Aug 2013 20:34:55 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <520ADCA4.3030902@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu>
Message-ID: 

On Tue, Aug 13, 2013 at 8:25 PM, Michele Rosso wrote:

> Matt,
>
> thank you! I will try to reduce the number of levels and see how it goes.
> I asked about the speed since CG + Block Jacobi with ICC in each block
> runs faster than CG + MG, so I thought I was missing something.

What is the operator? If it's the mass matrix, we expect this.

> Could you please tell me how to get rid of Chebyshev?

I believe it's

    -mg_levels_ksp_type richardson

but you can check.

   matt

> Best,
> Michele
>
> On 08/13/2013 05:51 PM, Matthew Knepley wrote: > > On Tue, Aug 13, 2013 at 7:05 PM, Michele Rosso wrote: > >> Hi Matt, >> >> I attached the output of the commands you suggested. >> The options I used are: >> >> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg >> -pc_mg_galerkin -pc_mg_levels 5 -options_left >> > > The convergence is great. I notice that your coarse solve takes no time. You could probably use fewer levels for > this problem. For this problem there are no easy things left, I think. We are currently debating how you can squeeze > something extra out of the smoother. Here you could probably get rid of > Chebyshev and use only SOR. > > Matt > > >> and here are the lines of code where I set up the solution process: >> >> call DMDACreate3d( PETSC_COMM_WORLD , & >> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >> & NNZ ,NNY , NNX, da , ierr) >> >> ! Create Global Vectors >> call DMCreateGlobalVector(da,b,ierr) >> call VecDuplicate(b,x,ierr) >> >> ! Set initial guess for first use of the module to 0 >> call VecSet(x,0.0_rp,ierr) >> >> ! Create matrix >> call DMCreateMatrix(da,MATAIJ,A,ierr) >> >> ! Create solver >> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >> call KSPSetDM(ksp,da,ierr) >> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >> call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >> call KSPSetType(ksp,KSPCG,ierr) >> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) >> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >> >> ! Nullspace removal >> call MatNullSpaceCreate( PETSC_COMM_WORLD,PETSC_TRUE,PETSC_NULL_INTEGER,& >> & PETSC_NULL_INTEGER,nullspace,ierr) >> call KSPSetNullspace(ksp,nullspace,ierr) >> call MatNullSpaceDestroy(nullspace,ierr) >> >> ! To allow using options from the command line >> call KSPSetFromOptions(ksp,ierr) >> >> Hope I did not omit anything useful. >> Thank you for your time. >> >> Best, >> Michele >> >> On 08/13/2013 04:26 PM, Matthew Knepley wrote: >> >> On Tue, Aug 13, 2013 at 6:09 PM, Michele Rosso wrote: >> >>> Hi Karli, >>> >>> thank you for your hint: now it works. >>> Now I would like to speed up the solution: I was counting on increasing >>> the number of levels/the number of processors used, but now I see I cannot >>> do that. >>> Do you have any hint to achieve better speed? >>> Thanks! >>> >> >> "Better speed" is not very helpful for us, and thus we cannot offer >> much help. You could >> >> 1) Send the output of -log_summary -ksp_monitor -ksp_view >> >> 2) Describe the operator succinctly >> >> Matt >> >> >>> Best, >>> Michele

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
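(For reference, what the Richardson/SOR smoother does on each level: with -mg_levels_ksp_type richardson -mg_levels_pc_type sor, every smoothing step applies x_{k+1} = x_k + P^{-1}(b - A x_k), where applying P^{-1} is one local symmetric SOR sweep. Unlike Chebyshev, this needs no eigenvalue estimates, so the (mg_levels_*_est_) GMRES solves visible in the -ksp_view output earlier in the thread go away. A sketch of the combined option set the thread is converging toward, assuming these option names:

    -pc_type mg -pc_mg_galerkin -pc_mg_levels 3
    -mg_levels_ksp_type richardson -mg_levels_pc_type sor)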
From mrosso at uci.edu  Tue Aug 13 20:40:20 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Tue, 13 Aug 2013 18:40:20 -0700
Subject: [petsc-users] GAMG speed
In-Reply-To: 
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu>
Message-ID: <520AE004.1010803@uci.edu>

Matt,

thank you.
The matrix arises from discretization of the Poisson equation in incompressible flow calculations.

Michele

On 08/13/2013 06:34 PM, Matthew Knepley wrote:
> On Tue, Aug 13, 2013 at 8:25 PM, Michele Rosso wrote: > > Matt, > > thank you! I will try to reduce the number of levels and see how > it goes. > I asked about the speed since CG + Block Jacobi with ICC in each > block runs faster than CG + MG, so I thought I was missing something. > > > What is the operator? If it's the mass matrix, we expect this. > > Could you please tell me how to get rid of Chebyshev? > > > I believe it's > > -mg_levels_ksp_type richardson > > but you can check. > > matt > > Best, > Michele > > On 08/13/2013 05:51 PM, Matthew Knepley wrote: >> On Tue, Aug 13, 2013 at 7:05 PM, Michele Rosso >> wrote: >> >> Hi Matt, >> >> I attached the output of the commands you suggested. >> The options I used are: >> >> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason >> -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -options_left >> >> >> The convergence is great. I notice that your coarse solve takes >> no time. You could probably use fewer levels for >> this problem. For this problem there are no easy things left, I >> think. We are currently debating how you can squeeze >> something extra out of the smoother. Here you could probably get >> rid of Chebyshev and use only SOR. >> >> Matt >> >> and here are the lines of code where I set up the solution >> process: >> >> call DMDACreate3d( PETSC_COMM_WORLD , & >> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >> & NNZ ,NNY , NNX, da , ierr) >> >> ! Create Global Vectors >> call DMCreateGlobalVector(da,b,ierr) >> call VecDuplicate(b,x,ierr) >> >> ! Set initial guess for first use of the module to 0 >> call VecSet(x,0.0_rp,ierr) >> >> ! Create matrix >> call DMCreateMatrix(da,MATAIJ,A,ierr) >> >> ! Create solver >> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >> call KSPSetDM(ksp,da,ierr) >> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >> call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >> call KSPSetType(ksp,KSPCG,ierr) >> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) >> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >> >> ! Nullspace removal >> call MatNullSpaceCreate( PETSC_COMM_WORLD,PETSC_TRUE,PETSC_NULL_INTEGER,& >> & PETSC_NULL_INTEGER,nullspace,ierr) >> call KSPSetNullspace(ksp,nullspace,ierr) >> call MatNullSpaceDestroy(nullspace,ierr) >> >> ! To allow using options from the command line >> call KSPSetFromOptions(ksp,ierr) >> >> Hope I did not omit anything useful. >> Thank you for your time. >> >> Best, >> Michele >> >> On 08/13/2013 04:26 PM, Matthew Knepley wrote: >>> On Tue, Aug 13, 2013 at 6:09 PM, Michele Rosso >>> wrote: >>> >>> Hi Karli, >>> >>> thank you for your hint: now it works. >>> Now I would like to speed up the solution: I was >>> counting on increasing the number of levels/the number >>> of processors used, but now I see I cannot do that. >>> Do you have any hint to achieve better speed? >>> Thanks! >>> >>> >>> "Better speed" is not very helpful for us, and thus we >>> cannot offer much help. You could >>> >>> 1) Send the output of -log_summary -ksp_monitor -ksp_view >>> >>> 2) Describe the operator succinctly >>> >>> Matt >>> >>> Best, >>> Michele
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov  Tue Aug 13 20:43:36 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Tue, 13 Aug 2013 20:43:36 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <520ABC8E.3040204@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu>
Message-ID: <5B150477-B0C5-4D36-BDF0-9D731ED28F1D@mcs.anl.gov>

Michele,

   Why do you want to use only a 2d parallel decomposition? For big problems a 3d decomposition is better. You should be able to use geometric multigrid with several levels and it should be very fast.

   Barry

On Aug 13, 2013, at 6:09 PM, Michele Rosso wrote:

> Hi Karli, > > thank you for your hint: now it works. > Now I would like to speed up the solution: I was counting on increasing the number of levels/the number of processors used, but now I see I cannot do that. > Do you have any hint to achieve better speed? > Thanks! > > Best, > Michele > > On 08/13/2013 01:33 PM, Karl Rupp wrote: >> Hi Michele, >> >> I suggest you try a different decomposition of your grid. With k levels, you should have at least 2^{k-1} grid nodes per coordinate direction in order to be able to correctly build a coarser mesh. In your case, you should have at least 8 nodes (leading to coarser levels of size 4, 2, and 1) in z direction. >> >> Best regards, >> Karli >> >> >> On 08/13/2013 02:28 PM, Michele Rosso wrote: >>> Hi Barry, >>> >>> I was finally able to try multigrid with a singular system and a finer grid. >>> GAMG works perfectly and has no problem in handling the singular system. >>> On the other hand, MG is giving me problems: >>> >>> [0]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [0]PETSC ERROR: Argument out of range! >>> [0]PETSC ERROR: Partition in x direction is too fine! 32 64! >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages.
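(Background for the operator under discussion: the pressure-Poisson matrix on a periodic grid is the standard 7-point finite-difference Laplacian. A hypothetical sketch of how one interior row of such a matrix is typically filled on a DMDA — this is not Michele's actual assembly code, and the spacing h and indices i, j, k are placeholders:

    MatStencil  row(4), col(4,7)
    PetscScalar v(7)
    PetscInt    c
    PetscInt    di(7), dj(7), dk(7)
    data di / 0, -1, 1,  0, 0,  0, 0 /
    data dj / 0,  0, 0, -1, 1,  0, 0 /
    data dk / 0,  0, 0,  0, 0, -1, 1 /

    ! row index of the grid point (i,j,k)
    row(MatStencil_i) = i
    row(MatStencil_j) = j
    row(MatStencil_k) = k
    ! entry 1 is the diagonal offset (0,0,0); entries 2-7 are the six neighbours
    do c = 1, 7
       col(MatStencil_i,c) = i + di(c)
       col(MatStencil_j,c) = j + dj(c)
       col(MatStencil_k,c) = k + dk(c)
       v(c) = 1.0/h**2
    end do
    ! overwrite the diagonal with -6/h^2
    v(1) = -6.0/h**2
    call MatSetValuesStencil(A,1,row,7,col,v,INSERT_VALUES,ierr)

With DMDA_BOUNDARY_PERIODIC the out-of-range neighbour indices wrap around automatically; the resulting matrix annihilates constant vectors, which is exactly why the constant nullspace is removed with MatNullSpaceCreate/KSPSetNullspace in the setup code earlier in the thread.)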
>>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: ./hit on a arch-cray-xt5-pkgs-opt named nid01332 by >>> Unknown Tue Aug 13 15:06:21 2013 >>> [0]PETSC ERROR: Libraries linked from >>> /nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib >>> [0]PETSC ERROR: Configure run at Wed Jul 31 22:48:06 2013 >>> >>> The input I used is: >>> -ksp_monitor -ksp_converged_reason -pc_type mg -pc_mg_galerkin >>> -pc_mg_levels 4 -options_left >>> >>> I am simulating a 256^3 grid with 256 processors. Since I am using a 2D >>> domain decomposition, each sub-domain contains 256x64x4 grid points. >>> To be consistent with my code indexing, I had to initialize DMDA with >>> reverse ordering, that is z,y,x, so when the error message says "x is >>> too fine" it actually means "z is too fine". >>> I was wondering what is the minimum number of nodes per direction that >>> would avoid this problem and how the number of levels is related to >>> minimum grid size required. >>> Thank you! >>> >>> Michele >>> >>> >>> On 08/02/2013 03:11 PM, Barry Smith wrote: >>>> On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote: >>>> >>>>> Barry, >>>>> >>>>> thank you very much for your help. I was trying to debug the error with no success! >>>>> Now it works like a charm for me too! >>>>> I have still two questions for you: >>>>> >>>>> 1) How did you choose the number of levels to use: trial and error? >>>> I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels. >>>> >>>>> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter? >>>> I don't know of anything. >>>> >>>> But there is a possible problem with -pc_mg_galerkin, PETSc does not transfer the null space information from the fine mesh to the other meshes and technically we really want the multigrid to remove the null space on all the levels but usually it will work without worrying about that. >>>> >>>> Barry >>>> >>>>> Again, thank you very much! >>>>> >>>>> Michele >>>>> >>>>> On 08/02/2013 02:38 PM, Barry Smith wrote: >>>>>> Finally got it. My failing memory. I had to add the line >>>>>> >>>>>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >>>>>> >>>>>> immediately after KSPSetDM() and >>>>>> >>>>>> change >>>>>> >>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>> >>>>>> to >>>>>> >>>>>> call DMCreateMatrix(da,MATAIJ,A,ierr) >>>>>> >>>>>> so it will work in both parallel and sequential then >>>>>> >>>>>> ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 >>>>>> >>>>>> works great with 2 levels. >>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Aug 1, 2013, at 6:29 PM, Michele Rosso >>>>>> >>>>>> wrote: >>>>>> >>>>>> >>>>>>> Barry, >>>>>>> >>>>>>> no problem. I attached the full code in test_poisson_solver.tar.gz. >>>>>>> My test code is a very reduced version of my productive code (incompressible DNS code) thus fftw3 and the library 2decomp&fft are needed to run it. >>>>>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem. >>>>>>> Please, contact me for any question/suggestion. >>>>>>> I the mean time I will try to debug it. >>>>>>> >>>>>>> Michele >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On 08/01/2013 04:19 PM, Barry Smith wrote: >>>>>>> >>>>>>>> Run on one process until this is debugged. 
You can try the option >>>>>>>> >>>>>>>> -start_in_debugger noxterm >>>>>>>> >>>>>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Barry, >>>>>>>>> >>>>>>>>> I checked the matrix: the element (0,0) is not zero, nor any other diagonal element is. >>>>>>>>> The matrix is symmetric positive define (i.e. the standard Poisson matrix). >>>>>>>>> Also, -da_refine is never used (see previous output). >>>>>>>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left >>>>>>>>> >>>>>>>>> and now the error is different: >>>>>>>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>> [1]PETSC ERROR: Floating point exception! >>>>>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>> [2]PETSC ERROR: Floating point exception! >>>>>>>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>> [3]PETSC ERROR: Floating point exception! >>>>>>>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>> Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>> Configure options >>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in 
src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>> [0]PETSC ERROR: Floating point exception! >>>>>>>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>> >>>>>>>>> #PETSc Option Table entries: >>>>>>>>> -ksp_view >>>>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>>> -mg_levels_ksp_type chebyshev >>>>>>>>> -mg_levels_pc_type jacobi >>>>>>>>> -options_left >>>>>>>>> -pc_mg_galerkin >>>>>>>>> -pc_type mg >>>>>>>>> #End of PETSc Option Table entries >>>>>>>>> There are no unused options. >>>>>>>>> >>>>>>>>> Michele >>>>>>>>> >>>>>>>>> >>>>>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot us Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebychev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>>>> >>>>>>>>>> Is the matrix a Stokes-like matrix? 
If so then different preconditioners are in order. >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Barry, >>>>>>>>>>> >>>>>>>>>>> here it is the fraction of code where I set the rhs term and the matrix. >>>>>>>>>>> >>>>>>>>>>> ! Create matrix >>>>>>>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>>>> >>>>>>>>>>> ! Create rhs term >>>>>>>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>>>>>>> >>>>>>>>>>> ! Solve system >>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>> call KSPSetUp(ksp,ierr) >>>>>>>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>>>>>>> >>>>>>>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>>>>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Michele >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Barry, >>>>>>>>>>>>> >>>>>>>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>> >>>>>>>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>>>>>>> >>>>>>>>>>>>> The output is: >>>>>>>>>>>>> >>>>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013
>>>>>>>>>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib
>>>>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013
>>>>>>>>>>>>> [2]PETSC ERROR: Configure options
>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c
>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c
>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c
>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c
>>>>>>>>>>>>> [2]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c
>>>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c
>>>>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h
>>>>>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c
>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c
>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c
>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c
>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c
>>>>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c
>>>>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c
>>>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c
>>>>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h
>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c
>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c
>>>>>>>>>>>>> [Ranks 0, 1 and 3 abort with the identical "Arguments are incompatible! Zero diagonal on row 0!" error and the same stack trace; their interleaved output was scrubbed.]
>>>>>>>>>>>>> #PETSc Option Table entries:
>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>
-ksp_view >>>>>>>>>>>>> -options_left >>>>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>>>>> >>>>>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>>>>>> ! Create Global Vectors >>>>>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>>>>> ! Create matrix >>>>>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>>>>> ! Create solver >>>>>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>>>>> >>>>>>>>>>>>> ! To allow using option from command line >>>>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Michele >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>>>>>>> >>>>>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Barry >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? 
See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >>>>>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mg >>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>>>> linear 
system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
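A note on the periodic case asked about here: a fully periodic Poisson operator is singular, so whichever preconditioner is chosen, the constant null space should be attached to the solver before KSPSolve. A minimal sketch in the same Fortran style as the code elsewhere in this thread (the variable name nullsp is invented for illustration, not taken from Michele's code):

      ! Sketch, assuming ksp is the solver created with KSPCreate:
      ! tell the KSP to project out the constant null space of the
      ! periodic operator.
      MatNullSpace nullsp
      call MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0_ip, &
     &                        PETSC_NULL_OBJECT, nullsp, ierr)
      call KSPSetNullSpace(ksp, nullsp, ierr)
      ! ... KSPSolve as usual ...
      call MatNullSpaceDestroy(nullsp, ierr)

Barry's remark later in this thread applies as well: with -pc_mg_galerkin the null space information is not transferred to the coarser levels automatically, although in practice the solve usually works without worrying about that.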
>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>>>>>>> total number of mallocs 
used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: 
seqaij >>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>>> 
total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>> <2decomp_fft-1.5.847-modified.tar.gz> >>>>>>> >>>> >>> >> > From mrosso at uci.edu Tue Aug 13 20:48:46 2013 From: mrosso at uci.edu (Michele Rosso) Date: Tue, 13 Aug 2013 18:48:46 -0700 Subject: [petsc-users] GAMG speed In-Reply-To: <5B150477-B0C5-4D36-BDF0-9D731ED28F1D@mcs.anl.gov> References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <5B150477-B0C5-4D36-BDF0-9D731ED28F1D@mcs.anl.gov> Message-ID: <520AE1FE.9010101@uci.edu> Barry, the reason is that the decomposition was already implemented and change it would be a mess :-) Michele On 08/13/2013 06:43 PM, Barry Smith wrote: > Michele, > > Why do you want to use only a 2d parallel decomposition? For big problems a 3d decomposition is better. You should be able to use geometric multigrid with several levels and it should be very fast. > > > Barry > > On Aug 13, 2013, at 6:09 PM, Michele Rosso wrote: > >> Hi Karli, >> >> thank you for your hint: now it works. >> Now I would like to speed up the solution: I was counting on increasing the number of levels/the number of processors used, but now I see I cannot do that. >> Do you have any hint to achieve better speed? >> Thanks! >> >> Best, >> Michele >> >> On 08/13/2013 01:33 PM, Karl Rupp wrote: >>> Hi Michele, >>> >>> I suggest you try a different decomposition of your grid. With k levels, you should have at least 2^{k-1} grid nodes per coordinate direction in order to be able to correctly build a coarser mesh. In your case, you should have at least 8 nodes (leading to coarser levels of size 4, 2, and 1) in z direction. >>> >>> Best regards, >>> Karli >>> >>> >>> On 08/13/2013 02:28 PM, Michele Rosso wrote: >>>> Hi Barry, >>>> >>>> I was finally able to try multigrid with a singular system and a finer grid. >>>> GAMG works perfectly and has no problem in handling the singular system. >>>> On the other hand, MG is giving me problem: >>>> >>>> [0]PETSC ERROR: --------------------- Error Message >>>> ------------------------------------ >>>> [0]PETSC ERROR: Argument out of range! >>>> [0]PETSC ERROR: Partition in x direction is too fine! 32 64! 
>>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: ./hit on a arch-cray-xt5-pkgs-opt named nid01332 by >>>> Unknown Tue Aug 13 15:06:21 2013 >>>> [0]PETSC ERROR: Libraries linked from >>>> /nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib >>>> [0]PETSC ERROR: Configure run at Wed Jul 31 22:48:06 2013 >>>> >>>> The input I used is: >>>> -ksp_monitor -ksp_converged_reason -pc_type mg -pc_mg_galerkin >>>> -pc_mg_levels 4 -options_left >>>> >>>> I am simulating a 256^3 grid with 256 processors. Since I am using a 2D >>>> domain decomposition, each sub-domain contains 256x64x4 grid points. >>>> To be consistent with my code indexing, I had to initialize DMDA with >>>> reverse ordering, that is z,y,x, so when the error message says "x is >>>> too fine" it actually means "z is too fine". >>>> I was wondering what is the minimum number of nodes per direction that >>>> would avoid this problem and how the number of levels is related to >>>> minimum grid size required. >>>> Thank you! >>>> >>>> Michele >>>> >>>> >>>> On 08/02/2013 03:11 PM, Barry Smith wrote: >>>>> On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote: >>>>> >>>>>> Barry, >>>>>> >>>>>> thank you very much for your help. I was trying to debug the error with no success! >>>>>> Now it works like a charm for me too! >>>>>> I have still two questions for you: >>>>>> >>>>>> 1) How did you choose the number of levels to use: trial and error? >>>>> I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels. >>>>> >>>>>> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter? >>>>> I don't know of anything. >>>>> >>>>> But there is a possible problem with -pc_mg_galerkin, PETSc does not transfer the null space information from the fine mesh to the other meshes and technically we really want the multigrid to remove the null space on all the levels but usually it will work without worrying about that. >>>>> >>>>> Barry >>>>> >>>>>> Again, thank you very much! >>>>>> >>>>>> Michele >>>>>> >>>>>> On 08/02/2013 02:38 PM, Barry Smith wrote: >>>>>>> Finally got it. My failing memory. I had to add the line >>>>>>> >>>>>>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >>>>>>> >>>>>>> immediately after KSPSetDM() and >>>>>>> >>>>>>> change >>>>>>> >>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>> >>>>>>> to >>>>>>> >>>>>>> call DMCreateMatrix(da,MATAIJ,A,ierr) >>>>>>> >>>>>>> so it will work in both parallel and sequential then >>>>>>> >>>>>>> ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 >>>>>>> >>>>>>> works great with 2 levels. >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Aug 1, 2013, at 6:29 PM, Michele Rosso >>>>>>> >>>>>>> wrote: >>>>>>> >>>>>>> >>>>>>>> Barry, >>>>>>>> >>>>>>>> no problem. I attached the full code in test_poisson_solver.tar.gz. >>>>>>>> My test code is a very reduced version of my productive code (incompressible DNS code) thus fftw3 and the library 2decomp&fft are needed to run it. 
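Karli's rule quoted above (with k levels, at least 2^(k-1) grid nodes per coordinate direction are needed so that each coarsening halves cleanly) can be turned into a quick check. A sketch, with the helper name max_mg_levels invented for illustration:

      ! Sketch: the largest level count k that still satisfies
      ! 2**(k-1) <= n, where n is the smallest number of grid nodes
      ! per direction on a subdomain.
      integer function max_mg_levels(n)
        integer, intent(in) :: n
        max_mg_levels = 1
        do while (2**max_mg_levels <= n)
          max_mg_levels = max_mg_levels + 1
        end do
      end function max_mg_levels

For the 256x64x4 subdomains described above, max_mg_levels(4) = 3, which is consistent with the "too fine" failure when -pc_mg_levels 4 was requested.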
>>>>>>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem.
>>>>>>>> Please, contact me for any question/suggestion.
>>>>>>>> In the meantime I will try to debug it.
>>>>>>>> Michele
>>>>>>>> On 08/01/2013 04:19 PM, Barry Smith wrote:
>>>>>>>>> Run on one process until this is debugged. You can try the option
>>>>>>>>> -start_in_debugger noxterm
>>>>>>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on?
>>>>>>>>> Barry
>>>>>>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso wrote:
>>>>>>>>>> Barry,
>>>>>>>>>> I checked the matrix: the element (0,0) is not zero, nor is any other diagonal element.
>>>>>>>>>> The matrix is symmetric positive definite (i.e. the standard Poisson matrix).
>>>>>>>>>> Also, -da_refine is never used (see previous output).
>>>>>>>>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left
>>>>>>>>>> and now the error is different:
>>>>>>>>>> [1]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>>>>>>>> [1]PETSC ERROR: Floating point exception!
>>>>>>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2!
>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
>>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages.
>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013
>>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib
>>>>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013
>>>>>>>>>> [1]PETSC ERROR: Configure options
>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c
>>>>>>>>>> [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c
>>>>>>>>>> [1]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h
>>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c
>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c
>>>>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c
>>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c
>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c
>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h
>>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c
>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c
>>>>>>>>>> [Ranks 0, 2 and 3 report the identical floating point exception with the same stack trace; their interleaved output was scrubbed.]
>>>>>>>>>> #PETSc Option Table entries:
>>>>>>>>>> -ksp_view
>>>>>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues
>>>>>>>>>> -mg_levels_ksp_type chebyshev
>>>>>>>>>> -mg_levels_pc_type jacobi
>>>>>>>>>> -options_left
>>>>>>>>>> -pc_mg_galerkin
>>>>>>>>>> -pc_type mg
>>>>>>>>>> #End of PETSc Option Table entries
>>>>>>>>>> There are no unused options.
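Before reaching for the debugger, a "not-a-number or infinite" report like the one above can often be localized with a one-line test, since a single NaN poisons a norm and a NaN never compares equal to itself. A sketch, assuming b is the right-hand-side vector passed to KSPSolve:

      ! Sketch: flag NaNs in the assembled right-hand side before
      ! the solve; bnorm /= bnorm is true only if bnorm is a NaN.
      PetscReal bnorm
      call VecNorm(b, NORM_2, bnorm, ierr)
      if (bnorm /= bnorm) then
         print *, 'rhs b contains NaN entries'
      end if

The same check on x narrows down whether the bad values enter through the right-hand side or through the initial guess.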
>>>>>>>>>> >>>>>>>>>> Michele >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot us Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebychev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>>>>> >>>>>>>>>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Barry, >>>>>>>>>>>> >>>>>>>>>>>> here it is the fraction of code where I set the rhs term and the matrix. >>>>>>>>>>>> >>>>>>>>>>>> ! Create matrix >>>>>>>>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>>>>> >>>>>>>>>>>> ! Create rhs term >>>>>>>>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>>>>>>>> >>>>>>>>>>>> ! Solve system >>>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>>> call KSPSetUp(ksp,ierr) >>>>>>>>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>>>>>>>> >>>>>>>>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>>>>>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Michele >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>> >>>>>>>>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>> >>>>>>>>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>>>>>>>> >>>>>>>>>>>>>> The output is: >>>>>>>>>>>>>> >>>>>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013
>>>>>>>>>>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib
>>>>>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013
>>>>>>>>>>>>>> [2]PETSC ERROR: Configure options
>>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c
>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c
>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c
>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c
>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c
>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c
>>>>>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h
>>>>>>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c
>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c
>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c
>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c
>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c
>>>>>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c
>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c
>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c
>>>>>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h
>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c
>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c
>>>>>>>>>>>>>> [Ranks 0, 1 and 3 abort with the identical "Arguments are incompatible! Zero diagonal on row 0!" error and the same stack trace; their interleaved output was scrubbed.]
>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>>>>>> >>>>>>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>>>>>>> ! Create Global Vectors >>>>>>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>>>>>> ! Create matrix >>>>>>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>>>>>> ! Create solver >>>>>>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>>>>>> >>>>>>>>>>>>>> ! To allow using option from command line >>>>>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Michele >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. >>>>>>>>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. 
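As Barry's "Finally got it" message quoted earlier in this digest shows, KSPSetComputeOperators turned out not to be needed: the fix was to keep the user-assembled matrix while still letting the KSP see the DMDA hierarchy. A minimal sketch of the two changes, reusing the variable names from Michele's code:

      call KSPSetDM(ksp, da, ierr)
      ! Keep the user-assembled operators instead of having the KSP
      ! build them through the DM:
      call KSPSetDMActive(ksp, PETSC_FALSE, ierr)
      ! MATAIJ (rather than MATMPIAIJ) works both sequentially and
      ! in parallel:
      call DMCreateMatrix(da, MATAIJ, A, ierr)

With those two changes, -pc_type mg -pc_mg_galerkin -pc_mg_levels 2 ran in both serial and parallel.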
>>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >>>>>>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: mg >>>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 
>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? 
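On the periodic case Michele asks about: a fully periodic Poisson operator is singular, so whichever preconditioner is used, the solver should be told about the constant null space so it can be projected out (Barry returns to this point later in the thread). A minimal sketch, assuming the petsc-3.4 Fortran interface; nullsp is an illustrative variable name:

      MatNullSpace nullsp

! PETSC_TRUE: the null space is spanned by the constant vector;
! no additional basis vectors are supplied.
      call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL_OBJECT,nullsp,ierr)
      call KSPSetNullSpace(ksp,nullsp,ierr)
      call MatNullSpaceDestroy(nullsp,ierr)

One caveat noted later in the thread: with -pc_mg_galerkin the null space information is not transferred from the fine mesh to the coarser levels, although in practice the solve usually works without worrying about that.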
>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>> total: nonzeros=132379, 
allocated nonzeros=132379 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>> factor fill ratio 
given 5, needed 0 >>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>>>>> type: jacobi 
>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix:
>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes
>>>>>>>>>>>>>>>>>>>> type: mpiaij
>>>>>>>>>>>>>>>>>>>> rows=262144, cols=262144
>>>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008
>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0
>>>>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother)
>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix:
>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes
>>>>>>>>>>>>>>>>>>>> type: mpiaij
>>>>>>>>>>>>>>>>>>>> rows=262144, cols=262144
>>>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008
>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0
>>>>>>>>>>>>>>>>>>>> #PETSc Option Table entries:
>>>>>>>>>>>>>>>>>>>> -ksp_view
>>>>>>>>>>>>>>>>>>>> -options_left
>>>>>>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1
>>>>>>>>>>>>>>>>>>>> -pc_type gamg
>>>>>>>>>>>>>>>>>>>> #End of PETSc Option Table entries
>>>>>>>>>>>>>>>>>>>> There are no unused options.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Thank you,
>>>>>>>>>>>>>>>>>>>> Michele
>>>>>>>> <2decomp_fft-1.5.847-modified.tar.gz>

From jedbrown at mcs.anl.gov Tue Aug 13 20:49:36 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 13 Aug 2013 20:49:36 -0500
Subject: [petsc-users] Performance of PETSc TS solver
In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov>
Message-ID: <877gfpm47j.fsf@mcs.anl.gov>

"Jin, Shuangshuang" writes:

> Hi, Shri,
>
> From the log_summary, we can see that the TSJacobianEval/SNESJacobianEval dominates the computation time as you mentioned.
>
> Event               Count      Time (sec)                                       --- Global ---  --- Stage ---   Total
>                    Max Ratio  Max        Ratio  Flops Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
> ------------------------------------------------------------------------------------------------------------------------
> TSJacobianEval      1782 1.0 3.1640e+02 1.0 0.00e+00 0.0 2.0e+03 7.4e+01 1.4e+04 90  0  0  0 21  90  0  0  0 21     0
> SNESJacobianEval    1782 1.0 3.1641e+02 1.0 0.00e+00 0.0 2.0e+03 7.4e+01 1.4e+04 90  0  0  0 21  90  0  0  0 21     0
>
> It takes 316 seconds for the total Jacobian Eval out of the 350 seconds of SNESSolve, which is the total simulation time.
>
> So I look into my IJacobian function. However, all it does is form a 1152*1152 Jacobian matrix.

Can you send code for it? If not, please create a "stage" for profiling (PetscLogStageRegister), use PetscLogStagePush at the start of your Jacobian function, and PetscLogStagePop at the end of the function. This will profile any PETSc operations that you call while inside your function separately from the solver.

I suspect you are either doing a lot of communication or you have poor load balance.
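The stage profiling Jed suggests takes only a few lines. Here is a minimal sketch, written in Fortran to match the rest of the code in this digest; the stage name and the placement are illustrative choices, not taken from the poster's code:

      PetscLogStage stage

! Register the stage once, e.g. right after PetscInitialize():
      call PetscLogStageRegister('JacobianEval',stage,ierr)

! Bracket the body of the user Jacobian routine with push/pop:
      call PetscLogStagePush(stage,ierr)
      ! ... the MatSetValues() calls and assembly being timed ...
      call PetscLogStagePop(ierr)

With -log_summary, the pushed stage then appears as its own section of the report, so the time and messages spent inside the Jacobian routine are separated from the rest of the solve; heavy communication or load imbalance shows up directly in that stage's Ratio and Mess columns.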
From bsmith at mcs.anl.gov Tue Aug 13 20:53:15 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Tue, 13 Aug 2013 20:53:15 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <520AE1FE.9010101@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <5B150477-B0C5-4D36-BDF0-9D731ED28F1D@mcs.anl.gov> <520AE1FE.9010101@uci.edu>
Message-ID: <7AA7169C-83F4-428D-BEB3-7FC1BEB70F16@mcs.anl.gov>

GAMG would actually also do better with the proper decomposition. How about leaving the rest of the code with the silly decomposition and doing the Poisson solve using the better decomposition. The time you save in the Poisson solve will be more than the time needed to move the solution data back and forth between the bad decomposition and the good one.

Barry

On Aug 13, 2013, at 8:48 PM, Michele Rosso wrote:

> Barry,
>
> the reason is that the decomposition was already implemented and changing it would be a mess :-)
>
> Michele
> On 08/13/2013 06:43 PM, Barry Smith wrote:
>> Michele,
>>
>> Why do you want to use only a 2d parallel decomposition? For big problems a 3d decomposition is better. You should be able to use geometric multigrid with several levels and it should be very fast.
>>
>> Barry
>>
>> On Aug 13, 2013, at 6:09 PM, Michele Rosso wrote:
>>
>>> Hi Karli,
>>>
>>> thank you for your hint: now it works.
>>> Now I would like to speed up the solution: I was counting on increasing the number of levels/the number of processors used, but now I see I cannot do that.
>>> Do you have any hint to achieve better speed?
>>> Thanks!
>>>
>>> Best,
>>> Michele
>>>
>>> On 08/13/2013 01:33 PM, Karl Rupp wrote:
>>>> Hi Michele,
>>>>
>>>> I suggest you try a different decomposition of your grid. With k levels, you should have at least 2^{k-1} grid nodes per coordinate direction in order to be able to correctly build a coarser mesh. In your case, you should have at least 8 nodes (leading to coarser levels of size 4, 2, and 1) in z direction.
>>>>
>>>> Best regards,
>>>> Karli
>>>>
>>>> On 08/13/2013 02:28 PM, Michele Rosso wrote:
>>>>> Hi Barry,
>>>>>
>>>>> I was finally able to try multigrid with a singular system and a finer grid.
>>>>> GAMG works perfectly and has no problem in handling the singular system.
>>>>> On the other hand, MG is giving me problems:
>>>>>
>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>>> [0]PETSC ERROR: Argument out of range!
>>>>> [0]PETSC ERROR: Partition in x direction is too fine! 32 64!
>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>> [0]PETSC ERROR: ./hit on a arch-cray-xt5-pkgs-opt named nid01332 by Unknown Tue Aug 13 15:06:21 2013
>>>>> [0]PETSC ERROR: Libraries linked from /nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib
>>>>> [0]PETSC ERROR: Configure run at Wed Jul 31 22:48:06 2013
>>>>>
>>>>> The input I used is:
>>>>> -ksp_monitor -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 4 -options_left
>>>>>
>>>>> I am simulating a 256^3 grid with 256 processors. Since I am using a 2D domain decomposition, each sub-domain contains 256x64x4 grid points.
>>>>> To be consistent with my code indexing, I had to initialize DMDA with reverse ordering, that is z,y,x, so when the error message says "x is too fine" it actually means "z is too fine".
>>>>> I was wondering what is the minimum number of nodes per direction that would avoid this problem, and how the number of levels is related to the minimum grid size required.
>>>>> Thank you!
>>>>>
>>>>> Michele
>>>>>
>>>>> On 08/02/2013 03:11 PM, Barry Smith wrote:
>>>>>> On Aug 2, 2013, at 4:52 PM, Michele Rosso wrote:
>>>>>>> Barry,
>>>>>>>
>>>>>>> thank you very much for your help. I was trying to debug the error with no success!
>>>>>>> Now it works like a charm for me too!
>>>>>>> I still have two questions for you:
>>>>>>>
>>>>>>> 1) How did you choose the number of levels to use: trial and error?
>>>>>>>
>>>>>> I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels.
>>>>>>
>>>>>>> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter?
>>>>>>>
>>>>>> I don't know of anything.
>>>>>>
>>>>>> But there is a possible problem with -pc_mg_galerkin: PETSc does not transfer the null space information from the fine mesh to the other meshes, and technically we really want the multigrid to remove the null space on all the levels, but usually it will work without worrying about that.
>>>>>>
>>>>>> Barry
>>>>>>
>>>>>>> Again, thank you very much!
>>>>>>>
>>>>>>> Michele
>>>>>>>
>>>>>>> On 08/02/2013 02:38 PM, Barry Smith wrote:
>>>>>>>> Finally got it. My failing memory. I had to add the line
>>>>>>>>
>>>>>>>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr)
>>>>>>>>
>>>>>>>> immediately after KSPSetDM() and change
>>>>>>>>
>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr)
>>>>>>>>
>>>>>>>> to
>>>>>>>>
>>>>>>>> call DMCreateMatrix(da,MATAIJ,A,ierr)
>>>>>>>>
>>>>>>>> so it will work in both parallel and sequential; then
>>>>>>>>
>>>>>>>> -ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2
>>>>>>>>
>>>>>>>> works great with 2 levels.
>>>>>>>>
>>>>>>>> Barry
>>>>>>>>
>>>>>>>> On Aug 1, 2013, at 6:29 PM, Michele Rosso wrote:
>>>>>>>>> Barry,
>>>>>>>>>
>>>>>>>>> no problem. I attached the full code in test_poisson_solver.tar.gz.
>>>>>>>>> My test code is a very reduced version of my production code (an incompressible DNS code), thus fftw3 and the library 2decomp&fft are needed to run it.
>>>>>>>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem.
>>>>>>>>> Please contact me with any question/suggestion.
>>>>>>>>> In the meantime I will try to debug it.
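Pulled together, the working setup Barry describes amounts to the following sketch (a minimal outline: ksp, da and A are the objects from Michele's code earlier in the thread, and the petsc-3.4 Fortran calling sequence is assumed):

      call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
      call KSPSetDM(ksp,da,ierr)
! Keep the DM attached, so mg can build the level structure, but tell
! KSP not to construct the operator from the DM itself:
      call KSPSetDMActive(ksp,PETSC_FALSE,ierr)
! MATAIJ resolves to SeqAIJ on one process and MPIAIJ on several, so
! the same code runs in serial and in parallel:
      call DMCreateMatrix(da,MATAIJ,A,ierr)
      call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr)
      call KSPSetFromOptions(ksp,ierr)

Run with -ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 as in Barry's test, raising -pc_mg_levels as the grid allows.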
>>>>>>>>> >>>>>>>>> Michele >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On 08/01/2013 04:19 PM, Barry Smith wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>>> Run on one process until this is debugged. You can try the option >>>>>>>>>> >>>>>>>>>> -start_in_debugger noxterm >>>>>>>>>> >>>>>>>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Barry, >>>>>>>>>>> >>>>>>>>>>> I checked the matrix: the element (0,0) is not zero, nor any other diagonal element is. >>>>>>>>>>> The matrix is symmetric positive define (i.e. the standard Poisson matrix). >>>>>>>>>>> Also, -da_refine is never used (see previous output). >>>>>>>>>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left >>>>>>>>>>> >>>>>>>>>>> and now the error is different: >>>>>>>>>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: Floating point exception! >>>>>>>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: Floating point exception! >>>>>>>>>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: Floating point exception! >>>>>>>>>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>>> Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>>>> Configure options >>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>>>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in 
src/ksp/pc/interface/precon.c >>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Floating point exception! >>>>>>>>>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>>>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> >>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>> -ksp_view >>>>>>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>>>>> -mg_levels_ksp_type chebyshev >>>>>>>>>>> -mg_levels_pc_type jacobi >>>>>>>>>>> -options_left >>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>> -pc_type mg >>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>> There are no unused options. 
>>>>>>>>>>>
>>>>>>>>>>> Michele
>>>>>>>>>>>
>>>>>>>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot use Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues
>>>>>>>>>>>>
>>>>>>>>>>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order.
>>>>>>>>>>>>
>>>>>>>>>>>> Barry
>>>>>>>>>>>>
>>>>>>>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Barry,
>>>>>>>>>>>>>
>>>>>>>>>>>>> here is the fraction of code where I set the rhs term and the matrix.
>>>>>>>>>>>>>
>>>>>>>>>>>>> ! Create matrix
>>>>>>>>>>>>> call form_matrix( A , qrho, lsf, head )
>>>>>>>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr)
>>>>>>>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr)
>>>>>>>>>>>>>
>>>>>>>>>>>>> ! Create rhs term
>>>>>>>>>>>>> call form_rhs(work, qrho, lsf, b , head)
>>>>>>>>>>>>>
>>>>>>>>>>>>> ! Solve system
>>>>>>>>>>>>> call KSPSetFromOptions(ksp,ierr)
>>>>>>>>>>>>> call KSPSetUp(ksp,ierr)
>>>>>>>>>>>>> call KSPSolve(ksp,b,x,ierr)
>>>>>>>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr)
>>>>>>>>>>>>>
>>>>>>>>>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil.
>>>>>>>>>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix values.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Michele
>>>>>>>>>>>>>
>>>>>>>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Barry
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Barry,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I run with: -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> For the test I use a 64^3 grid and 4 processors.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> The output is:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>>>>>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible!
>>>>>>>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0!
>>>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
>>>>>>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>>>>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>>>>>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages.
>>>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>>>>> [2]PETSC ERROR: Configure options >>>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>> Arguments are incompatible! >>>>>>>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>> Zero diagonal on row 0! >>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> See docs/index.html for manual pages. >>>>>>>>>>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in 
src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in 
src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>>>>>>>> ! Create Global Vectors >>>>>>>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>>>>>>> ! Create matrix >>>>>>>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>>>>>>> ! Create solver >>>>>>>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ! To allow using option from command line >>>>>>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. Correct me if I am wrong, but for the >>>>>>>>>>>>>>>>> geometric multigrid to work, ksp must be provided with subroutines to compute the matrix and the rhs at any level through >>>>>>>>>>>>>>>>> KSPSetComputeOperators and KSPSetComputeRHS. 
>>>>>>>>>>>>>>>>> I do not do that, I simply build a rhs vector and a matrix and then I solve the system. >>>>>>>>>>>>>>>>> If you confirm what I just wrote, I will try to modify my code accordingly and get back to you. >>>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>> On 08/01/2013 11:48 AM, Barry Smith wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Do you use KSPSetDM(ksp,da); ? See src/ksp/ksp/examples/tutorials/ex19.c >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:35 PM, Michele Rosso >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I am using a finite difference Cartesian uniform grid and DMDA and so far it has not given me any problem. >>>>>>>>>>>>>>>>>>> I am using a ksp solver (not snes). In a previous thread, I was told an odd number of grid points was needed for the geometric multigrid, is that correct? >>>>>>>>>>>>>>>>>>> I tried to run my case with >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> -pc_type mg -da_refine 4 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> but it does not seem to use the -da_refine option: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> mpiexec -np 4 ./test -pc_type mg -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: mg >>>>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=1 cycles=v >>>>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>>>> Not using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.134543, max = 1.47998 >>>>>>>>>>>>>>>>>>> Chebyshev: estimated using: [0 0.1; 0 1.1] >>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_0_est_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>>>>>>> maximum iterations=10, initial guess is zero >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes 
>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_0_) 4 MPI processes >>>>>>>>>>>>>>>>>>> type: sor >>>>>>>>>>>>>>>>>>> SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>> Solution = 1.53600013 sec >>>>>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On 08/01/2013 11:21 AM, Barry Smith wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> What kind of mesh are you using? Are you using DMDA? If you are using DMDA (and have written your code to use it "correctly") then it should be trivial to run with geometric multigrid and geometric multigrid should be a bit faster. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> For example on src/snes/examples/tutorials/ex19.c I run with ./ex19 -pc_type mg -da_refine 4 and it refines the original DMDA 4 times and uses geometric multigrid with 5 levels. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Aug 1, 2013, at 1:14 PM, Michele Rosso >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Hi, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I am successfully using PETSc (v3.4.2) to solve a 3D Poisson's equation with CG + GAMG as I was suggested to do in a previous thread. >>>>>>>>>>>>>>>>>>>>> So far I am using GAMG with the default settings, i.e. 
>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> The speed of the solution is satisfactory, but I would like to know if you have any suggestions to further speed it up, particularly >>>>>>>>>>>>>>>>>>>>> if there is any parameters worth looking into to achieve an even faster solution, for example number of levels and so on. >>>>>>>>>>>>>>>>>>>>> So far I am using Dirichlet's BCs for my test case, but I will soon have periodic conditions: in this case, does GAMG require particular settings? >>>>>>>>>>>>>>>>>>>>> Finally, I did not try geometric multigrid: do you think it is worth a shot? >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Here are my current settings: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I run with >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_view -options_left >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> and the output is: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> KSP Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>>> type: cg >>>>>>>>>>>>>>>>>>>>> maximum iterations=10000 >>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-08, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>>>>>>>>>>>>>> PC Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>>> type: gamg >>>>>>>>>>>>>>>>>>>>> MG: type is MULTIPLICATIVE, levels=3 cycles=v >>>>>>>>>>>>>>>>>>>>> Cycles per PCApply=1 >>>>>>>>>>>>>>>>>>>>> Using Galerkin computed coarse grid matrices >>>>>>>>>>>>>>>>>>>>> Coarse grid solver -- level ------------------------------- >>>>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_) 4 MPI processes >>>>>>>>>>>>>>>>>>>>> type: bjacobi >>>>>>>>>>>>>>>>>>>>> block Jacobi: number of blocks = 4 >>>>>>>>>>>>>>>>>>>>> Local solve info for each block is in the following KSP and PC objects: >>>>>>>>>>>>>>>>>>>>> [0] number of local blocks = 1, first local block number = 0 >>>>>>>>>>>>>>>>>>>>> [0] local block number 0 >>>>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) left preconditioning >>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot 
>>>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 4.13207 >>>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>>> Matrix Object: Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>>> total: nonzeros=132379, allocated nonzeros=132379 >>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>> 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>> Matrix Object:KSP Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>>> KSP Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>>>> total number 
of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>>>>>>> total 
number of mallocs used during MatSetValues calls =0
>>>>>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines
>>>>>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother)
>>>>>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 -------------------------------
>>>>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes
>>>>>>>>>>>>>>>>>>>>> type: chebyshev
>>>>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987
>>>>>>>>>>>>>>>>>>>>> maximum iterations=2
>>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>>>>>>>>>>>>>>>>>>>>> left preconditioning
>>>>>>>>>>>>>>>>>>>>> using nonzero initial guess
>>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test
>>>>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes
>>>>>>>>>>>>>>>>>>>>> type: jacobi
>>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix:
>>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes
>>>>>>>>>>>>>>>>>>>>> type: mpiaij
>>>>>>>>>>>>>>>>>>>>> rows=262144, cols=262144
>>>>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008
>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0
>>>>>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother)
>>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix:
>>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes
>>>>>>>>>>>>>>>>>>>>> type: mpiaij
>>>>>>>>>>>>>>>>>>>>> rows=262144, cols=262144
>>>>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008
>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0
>>>>>>>>>>>>>>>>>>>>> #PETSc Option Table entries:
>>>>>>>>>>>>>>>>>>>>> -ksp_view
>>>>>>>>>>>>>>>>>>>>> -options_left
>>>>>>>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1
>>>>>>>>>>>>>>>>>>>>> -pc_type gamg
>>>>>>>>>>>>>>>>>>>>> #End of PETSc Option Table entries
>>>>>>>>>>>>>>>>>>>>> There are no unused options.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Thank you,
>>>>>>>>>>>>>>>>>>>>> Michele
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>> <2decomp_fft-1.5.847-modified.tar.gz>
>>
>
From mrosso at uci.edu Tue Aug 13 21:22:08 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Tue, 13 Aug 2013 19:22:08 -0700
Subject: [petsc-users] GAMG speed
In-Reply-To: <7AA7169C-83F4-428D-BEB3-7FC1BEB70F16@mcs.anl.gov>
References: <51FAA56D.60106@uci.edu> <51FAAA5F.20805@uci.edu> <51FABB5B.50708@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <5B150477-B0C5-4D36-BDF0-9D731ED28F1D@mcs.anl.gov> <520AE1FE.9010101@uci.edu> <7AA7169C-83F4-428D-BEB3-7FC1BEB70F16@mcs.anl.gov>
Message-ID: <520AE9D0.1090509@uci.edu>

Barry,

thank you for your input. I may consider doing so in the future. I have to keep the 2D decomposition though, since I need to perform FFTs for the spectral analysis as well as for the turbulent initialization, and I am not aware of any library that provides them with a 3D decomposition (only 2D).

Michele

On 08/13/2013 06:53 PM, Barry Smith wrote:
> GAMG would actually also do better with the proper decomposition.
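Barry's suggestion, continued in the quoted lines just below, is to keep the existing 2D decomposition everywhere except the Poisson solve and to move the data between the two layouts. One way to hand a vector between two DMDAs that cover the same global grid is to go through PETSc's natural ordering; here is a rough, untested sketch, in which daFFT, daSolve, gFFT, gSolve, natA, natB and the index-set plumbing are all illustrative names rather than anything from the thread:

      Vec natA, natB
      VecScatter ctx
      IS is
      PetscInt rstart, rend
      ! Permute the source vector into natural (i,j,k) ordering
      call DMDACreateNaturalVector(daFFT,natA,ierr)
      call DMDAGlobalToNaturalBegin(daFFT,gFFT,INSERT_VALUES,natA,ierr)
      call DMDAGlobalToNaturalEnd(daFFT,gFFT,INSERT_VALUES,natA,ierr)
      ! Both natural vectors share one global ordering, so an identity
      ! scatter over the destination's ownership range moves the data
      call DMDACreateNaturalVector(daSolve,natB,ierr)
      call VecGetOwnershipRange(natB,rstart,rend,ierr)
      call ISCreateStride(PETSC_COMM_WORLD,rend-rstart,rstart,1,is,ierr)
      call VecScatterCreate(natA,is,natB,is,ctx,ierr)
      call VecScatterBegin(ctx,natA,natB,INSERT_VALUES,SCATTER_FORWARD,ierr)
      call VecScatterEnd(ctx,natA,natB,INSERT_VALUES,SCATTER_FORWARD,ierr)
      ! Back from natural ordering into the solver DMDA's layout
      call DMDANaturalToGlobalBegin(daSolve,natB,INSERT_VALUES,gSolve,ierr)
      call DMDANaturalToGlobalEnd(daSolve,natB,INSERT_VALUES,gSolve,ierr)

The scatter and the index set can be created once and reused every time step; only the Begin/End pairs need to run per solve.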
> > How about leaving the rest of the code with the silly decomposition and doing the Poisson solve using the better decomposition. The time you save in the Poisson solve will be more than the time needed to move the solution data back and forth between the bad decomposition and the good one. > > Barry > > On Aug 13, 2013, at 8:48 PM, Michele Rosso wrote: > >> Barry, >> >> the reason is that the decomposition was already implemented and change it would be a mess :-) >> >> Michele >> On 08/13/2013 06:43 PM, Barry Smith wrote: >>> Michele, >>> >>> Why do you want to use only a 2d parallel decomposition? For big problems a 3d decomposition is better. You should be able to use geometric multigrid with several levels and it should be very fast. >>> >>> >>> Barry >>> >>> On Aug 13, 2013, at 6:09 PM, Michele Rosso >>> >>> wrote: >>> >>> >>>> Hi Karli, >>>> >>>> thank you for your hint: now it works. >>>> Now I would like to speed up the solution: I was counting on increasing the number of levels/the number of processors used, but now I see I cannot do that. >>>> Do you have any hint to achieve better speed? >>>> Thanks! >>>> >>>> Best, >>>> Michele >>>> >>>> On 08/13/2013 01:33 PM, Karl Rupp wrote: >>>> >>>>> Hi Michele, >>>>> >>>>> I suggest you try a different decomposition of your grid. With k levels, you should have at least 2^{k-1} grid nodes per coordinate direction in order to be able to correctly build a coarser mesh. In your case, you should have at least 8 nodes (leading to coarser levels of size 4, 2, and 1) in z direction. >>>>> >>>>> Best regards, >>>>> Karli >>>>> >>>>> >>>>> On 08/13/2013 02:28 PM, Michele Rosso wrote: >>>>> >>>>>> Hi Barry, >>>>>> >>>>>> I was finally able to try multigrid with a singular system and a finer grid. >>>>>> GAMG works perfectly and has no problem in handling the singular system. >>>>>> On the other hand, MG is giving me problem: >>>>>> >>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>> ------------------------------------ >>>>>> [0]PETSC ERROR: Argument out of range! >>>>>> [0]PETSC ERROR: Partition in x direction is too fine! 32 64! >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>> [0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------ >>>>>> [0]PETSC ERROR: ./hit on a arch-cray-xt5-pkgs-opt named nid01332 by >>>>>> Unknown Tue Aug 13 15:06:21 2013 >>>>>> [0]PETSC ERROR: Libraries linked from >>>>>> /nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib >>>>>> [0]PETSC ERROR: Configure run at Wed Jul 31 22:48:06 2013 >>>>>> >>>>>> The input I used is: >>>>>> -ksp_monitor -ksp_converged_reason -pc_type mg -pc_mg_galerkin >>>>>> -pc_mg_levels 4 -options_left >>>>>> >>>>>> I am simulating a 256^3 grid with 256 processors. Since I am using a 2D >>>>>> domain decomposition, each sub-domain contains 256x64x4 grid points. >>>>>> To be consistent with my code indexing, I had to initialize DMDA with >>>>>> reverse ordering, that is z,y,x, so when the error message says "x is >>>>>> too fine" it actually means "z is too fine". 
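As a plausibility check on these numbers (my arithmetic, not anything stated in the thread): with -pc_mg_levels 4 the DMDA is coarsened three times, so in the code's z direction, which is PETSc's x after the reversed ordering,

      256 -> 128 -> 64 -> 32

and the coarsest 32 planes cannot be split among the 64 processes that partition that direction, which matches "Partition in x direction is too fine! 32 64!". Per process this is exactly Karli's rule: 4 local points is fewer than 2^(4-1) = 8, so this layout supports at most 3 levels, i.e. 4 -> 2 -> 1 locally.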
>>>>>> I was wondering what is the minimum number of nodes per direction that >>>>>> would avoid this problem and how the number of levels is related to >>>>>> minimum grid size required. >>>>>> Thank you! >>>>>> >>>>>> Michele >>>>>> >>>>>> >>>>>> On 08/02/2013 03:11 PM, Barry Smith wrote: >>>>>> >>>>>>> On Aug 2, 2013, at 4:52 PM, Michele Rosso >>>>>>> wrote: >>>>>>> >>>>>>> >>>>>>>> Barry, >>>>>>>> >>>>>>>> thank you very much for your help. I was trying to debug the error with no success! >>>>>>>> Now it works like a charm for me too! >>>>>>>> I have still two questions for you: >>>>>>>> >>>>>>>> 1) How did you choose the number of levels to use: trial and error? >>>>>>>> >>>>>>> I just used 2 because it is more than one level :-). When you use a finer mesh you can use more levels. >>>>>>> >>>>>>> >>>>>>>> 2) For a singular system (periodic), besides the nullspace removal, should I change any parameter? >>>>>>>> >>>>>>> I don't know of anything. >>>>>>> >>>>>>> But there is a possible problem with -pc_mg_galerkin, PETSc does not transfer the null space information from the fine mesh to the other meshes and technically we really want the multigrid to remove the null space on all the levels but usually it will work without worrying about that. >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>>> Again, thank you very much! >>>>>>>> >>>>>>>> Michele >>>>>>>> >>>>>>>> On 08/02/2013 02:38 PM, Barry Smith wrote: >>>>>>>> >>>>>>>>> Finally got it. My failing memory. I had to add the line >>>>>>>>> >>>>>>>>> call KSPSetDMActive(ksp,PETSC_FALSE,ierr) >>>>>>>>> >>>>>>>>> immediately after KSPSetDM() and >>>>>>>>> >>>>>>>>> change >>>>>>>>> >>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>> >>>>>>>>> to >>>>>>>>> >>>>>>>>> call DMCreateMatrix(da,MATAIJ,A,ierr) >>>>>>>>> >>>>>>>>> so it will work in both parallel and sequential then >>>>>>>>> >>>>>>>>> ksp_monitor -ksp_converged_reason -pc_type mg -ksp_view -pc_mg_galerkin -pc_mg_levels 2 >>>>>>>>> >>>>>>>>> works great with 2 levels. >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Aug 1, 2013, at 6:29 PM, Michele Rosso >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Barry, >>>>>>>>>> >>>>>>>>>> no problem. I attached the full code in test_poisson_solver.tar.gz. >>>>>>>>>> My test code is a very reduced version of my productive code (incompressible DNS code) thus fftw3 and the library 2decomp&fft are needed to run it. >>>>>>>>>> I attached the 2decomp&fft version I used: it is a matter of minutes to install it, so you should not have any problem. >>>>>>>>>> Please, contact me for any question/suggestion. >>>>>>>>>> I the mean time I will try to debug it. >>>>>>>>>> >>>>>>>>>> Michele >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On 08/01/2013 04:19 PM, Barry Smith wrote: >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Run on one process until this is debugged. You can try the option >>>>>>>>>>> >>>>>>>>>>> -start_in_debugger noxterm >>>>>>>>>>> >>>>>>>>>>> and then call VecView(vec,0) in the debugger when it gives the error below. It seems like some objects are not getting their initial values set properly. Are you able to email the code so we can run it and figure out what is going on? 
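For anyone extracting the resolution from this thread: relative to the setup code quoted earlier, Barry's fix amounts to two changes, namely KSPSetDMActive(ksp,PETSC_FALSE) right after KSPSetDM() and MATAIJ in place of MATMPIAIJ; for the fully periodic, singular case one would presumably also attach the constant null space. A condensed sketch follows; the null-space lines are my assumption of the standard PETSc 3.4 idiom rather than something shown in the thread, and nullsp is an illustrative name.

      MatNullSpace nullsp
      call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
      call KSPSetDM(ksp,da,ierr)
      ! Barry's fix: the DM supplies the grid hierarchy only, not the operators
      call KSPSetDMActive(ksp,PETSC_FALSE,ierr)
      ! MATAIJ instead of MATMPIAIJ so the same code runs in serial and parallel
      call DMCreateMatrix(da,MATAIJ,A,ierr)
      ! The stencil pattern is fixed here, so SAME_NONZERO_PATTERN is safe
      call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr)
      call KSPSetType(ksp,KSPCG,ierr)
      ! Assumed addition for the periodic problem: remove the constant null space
      call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL_OBJECT,nullsp,ierr)
      call KSPSetNullSpace(ksp,nullsp,ierr)
      call MatNullSpaceDestroy(nullsp,ierr)
      call KSPSetFromOptions(ksp,ierr)

Run as in the thread with -pc_type mg -pc_mg_galerkin -pc_mg_levels 2 (or -da_refine n); note Barry's caveat that -pc_mg_galerkin does not carry the null space down to the coarser levels, although in practice it usually works anyway.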
>>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> On Aug 1, 2013, at 5:52 PM, Michele Rosso >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> wrote: >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Barry, >>>>>>>>>>>> >>>>>>>>>>>> I checked the matrix: the element (0,0) is not zero, nor any other diagonal element is. >>>>>>>>>>>> The matrix is symmetric positive define (i.e. the standard Poisson matrix). >>>>>>>>>>>> Also, -da_refine is never used (see previous output). >>>>>>>>>>>> I tried to run with -pc_type mg -pc_mg_galerkin -mg_levels_pc_type jacobi -mg_levels_ksp_type chebyshev -mg_levels_ksp_chebyshev_estimate_eigenvalues -ksp_view -options_left >>>>>>>>>>>> >>>>>>>>>>>> and now the error is different: >>>>>>>>>>>> 0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>> [1]PETSC ERROR: Floating point exception! >>>>>>>>>>>> [1]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>> [2]PETSC ERROR: Floating point exception! >>>>>>>>>>>> [2]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>> [3]PETSC ERROR: Floating point exception! >>>>>>>>>>>> [3]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>>>>> [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>> [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>>>> Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [1]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>>>>> Configure options >>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [2]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [3]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>>>>> [3]PETSC ERROR: [1]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>>>>> [1]PETSC ERROR: [2]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>>>>> [2]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [3]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in 
src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [2]PETSC ERROR: [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Floating point exception! >>>>>>>>>>>> [0]PETSC ERROR: Vec entry at local location 0 is not-a-number or infinite at beginning of function: Parameter number 2! >>>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 15:43:16 2013 >>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: VecValidValues() line 28 in src/vec/vec/interface/rvector.c >>>>>>>>>>>> [0]PETSC ERROR: MatMult() line 2174 in src/mat/interface/matrix.c >>>>>>>>>>>> [0]PETSC ERROR: KSP_MatMult() line 204 in src/ksp/ksp/impls/cheby//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 504 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> >>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>> -ksp_view >>>>>>>>>>>> -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>>>>>> -mg_levels_ksp_type chebyshev >>>>>>>>>>>> -mg_levels_pc_type jacobi >>>>>>>>>>>> -options_left >>>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>>> -pc_type mg >>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>> There are no unused options. >>>>>>>>>>>> >>>>>>>>>>>> Michele >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On 08/01/2013 03:27 PM, Barry Smith wrote: >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Do a MatView() on A before the solve (remove the -da_refine 4) so it is small. 
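From Fortran, the inspection Barry suggests here is a one-liner, using the same A and ierr as in the quoted code:

      call MatView(A,PETSC_VIEWER_STDOUT_WORLD,ierr)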
Is the 0,0 entry 0? If the matrix has zero on the diagonals you cannot us Gauss-Seidel as the smoother. You can start with -mg_levels_pc_type jacobi -mg_levels_ksp_type chebychev -mg_levels_ksp_chebyshev_estimate_eigenvalues >>>>>>>>>>>>> >>>>>>>>>>>>> Is the matrix a Stokes-like matrix? If so then different preconditioners are in order. >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>> On Aug 1, 2013, at 5:21 PM, Michele Rosso >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>> >>>>>>>>>>>>>> here it is the fraction of code where I set the rhs term and the matrix. >>>>>>>>>>>>>> >>>>>>>>>>>>>> ! Create matrix >>>>>>>>>>>>>> call form_matrix( A , qrho, lsf, head ) >>>>>>>>>>>>>> call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>>>>>>> call MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr) >>>>>>>>>>>>>> >>>>>>>>>>>>>> ! Create rhs term >>>>>>>>>>>>>> call form_rhs(work, qrho, lsf, b , head) >>>>>>>>>>>>>> >>>>>>>>>>>>>> ! Solve system >>>>>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>>>>> call KSPSetUp(ksp,ierr) >>>>>>>>>>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>>>>>>>>>> call KSPGetIterationNumber(ksp, iiter ,ierr) >>>>>>>>>>>>>> >>>>>>>>>>>>>> The subroutine form_matrix returns the Mat object A that is filled by using MatSetValuesStencil. >>>>>>>>>>>>>> qrho, lsf and head are additional arguments that are needed to compute the matrix value. >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Michele >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On 08/01/2013 03:11 PM, Barry Smith wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Where are you putting the values into the matrix? It seems the matrix has no values in it? The code is stopping because in the Gauss-Seidel smoothing it has detected zero diagonals. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Aug 1, 2013, at 4:47 PM, Michele Rosso >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I run with : -pc_type mg -pc_mg_galerkin -da_refine 4 -ksp_view -options_left >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> For the test I use a 64^3 grid and 4 processors. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> The output is: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> [2]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>>>>>> [2]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>>>>>> [2]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>>> [2]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>>>>> [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>>>>> [2]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [2]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>>>>>> [2]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>>>>>> [2]PETSC ERROR: Configure options >>>>>>>>>>>>>>>> [2]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [2]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>>> --------------------- Error Message ------------------------------------ >>>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>>> Arguments are incompatible! >>>>>>>>>>>>>>>> [2]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>>> Zero diagonal on row 0! >>>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>>> [2]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: [2]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [2]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>>>>>> [3]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>>>>>> [3]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>>> [3]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>>>>> [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>>>>> [3]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> See docs/index.html for manual pages. >>>>>>>>>>>>>>>> [3]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>>>>>> [3]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>>>>>> [3]PETSC ERROR: Configure options >>>>>>>>>>>>>>>> [3]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [3]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>>>>>>>>>> MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: Arguments are incompatible! >>>>>>>>>>>>>>>> [1]PETSC ERROR: Zero diagonal on row 0! >>>>>>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>>> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>>>>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>>>>>>>>>> [1]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [1]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>>>>>> [1]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>>>>>> [1]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>>>>>> [1]PETSC ERROR: Configure options >>>>>>>>>>>>>>>> [1]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [1]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: [3]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>>> [3]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>>> [3]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>>> [3]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>>>>>> [3]PETSC 
ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>>>>>> [1]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: ./test on a linux-gnu-dbg named enterprise-A by mic Thu Aug 1 14:44:04 2013 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-3.4.2/linux-gnu-dbg/lib >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Aug 1 12:01:44 2013 >>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options >>>>>>>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: MatInvertDiagonal_SeqAIJ() line 1457 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: MatSOR_SeqAIJ() line 1489 in src/mat/impls/aij/seq/aij.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: MatSOR_MPIAIJ() line 1623 in src/mat/impls/aij/mpi/mpiaij.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: MatSOR() line 3649 in src/mat/interface/matrix.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: PCApply_SOR() line 35 in src/ksp/pc/impls/sor/sor.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/interface//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPInitialResidual() line 64 in src/ksp/ksp/interface/itres.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_GMRES() line 239 in src/ksp/ksp/impls/gmres/gmres.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_Chebyshev() line 409 in src/ksp/ksp/impls/cheby/cheby.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: PCMGMCycle_Private() line 19 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: PCApply_MG() line 330 in src/ksp/pc/impls/mg/mg.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: PCApply() line 442 in 
src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSP_PCApply() line 227 in src/ksp/ksp/impls/cg//opt/petsc/petsc-3.4.2/include/petsc-private/kspimpl.h >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve_CG() line 175 in src/ksp/ksp/impls/cg/cg.c >>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSolve() line 441 in src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>> -da_refine 4 >>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>> -pc_mg_galerkin >>>>>>>>>>>>>>>> -pc_type mg >>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>> There is one unused database option. It is: >>>>>>>>>>>>>>>> Option left: name:-da_refine value: 4 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Here is the code I use to setup DMDA and KSP: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> call DMDACreate3d( PETSC_COMM_WORLD , & >>>>>>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & >>>>>>>>>>>>>>>> & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, & >>>>>>>>>>>>>>>> & N_Z , N_Y , N_X , N_B3 , N_B2 , 1_ip, 1_ip , 1_ip , & >>>>>>>>>>>>>>>> & int(NNZ,ip) ,int(NNY,ip) , NNX, da , ierr) >>>>>>>>>>>>>>>> ! Create Global Vectors >>>>>>>>>>>>>>>> call DMCreateGlobalVector(da,b,ierr) >>>>>>>>>>>>>>>> call VecDuplicate(b,x,ierr) >>>>>>>>>>>>>>>> ! Set initial guess for first use of the module to 0 >>>>>>>>>>>>>>>> call VecSet(x,0.0_rp,ierr) >>>>>>>>>>>>>>>> ! Create matrix >>>>>>>>>>>>>>>> call DMCreateMatrix(da,MATMPIAIJ,A,ierr) >>>>>>>>>>>>>>>> ! Create solver >>>>>>>>>>>>>>>> call KSPCreate(PETSC_COMM_WORLD,ksp,ierr) >>>>>>>>>>>>>>>> call KSPSetDM(ksp,da,ierr) >>>>>>>>>>>>>>>> call KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN,ierr) >>>>>>>>>>>>>>>> ! call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr) >>>>>>>>>>>>>>>> call KSPSetType(ksp,KSPCG,ierr) >>>>>>>>>>>>>>>> call KSPSetNormType(ksp,KSP_NORM_UNPRECONDITIONED,ierr) ! Real residual >>>>>>>>>>>>>>>> call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr) >>>>>>>>>>>>>>>> call KSPSetTolerances(ksp, tol ,PETSC_DEFAULT_DOUBLE_PRECISION,& >>>>>>>>>>>>>>>> & PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_INTEGER,ierr) >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ! To allow using option from command line >>>>>>>>>>>>>>>> call KSPSetFromOptions(ksp,ierr) >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On 08/01/2013 01:04 PM, Barry Smith wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> You can use the option -pc_mg_galerkin and then MG will compute the coarser matrices with a sparse matrix matrix matrix product so you should not need to change your code to try it out. You still need to use the KSPSetDM() and -da_refine n to get it working >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> If it doesn't work, send us all the output. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Aug 1, 2013, at 2:47 PM, Michele Rosso >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Barry, >>>>>>>>>>>>>>>>>> you are correct, I did not use it. I think I get now where is the problem. 
>>>>>>>>>>>>>>>>>> <snip: the rest of this quoted exchange repeats, verbatim, the 1 August messages already shown in full above>
>>>>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>>> (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: preonly >>>>>>>>>>>>>>>>>>>>>> maximum iterations=1, initial guess is zero >>>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>>> PC Object: (mg_coarse_sub_) 1 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: lu >>>>>>>>>>>>>>>>>>>>>> LU: out-of-place factorization >>>>>>>>>>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>>>>>>>>>> matrix ordering: nd >>>>>>>>>>>>>>>>>>>>>> factor fill ratio given 5, needed 0 >>>>>>>>>>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>>>>>>>>>> total: nonzeros=1, allocated nonzeros=1 >>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>>>>>>>>>> rows=0, cols=0 >>>>>>>>>>>>>>>>>>>>>> total: nonzeros=0, allocated nonzeros=0 >>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>>>>>>>>>> [1] number of local blocks = 1, first local block number = 1 >>>>>>>>>>>>>>>>>>>>>> [1] local block number 0 >>>>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>>>> [2] number of local blocks = 1, first local block number = 2 >>>>>>>>>>>>>>>>>>>>>> [2] local block number 0 >>>>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>>>> [3] number of local blocks = 1, first local block number = 3 >>>>>>>>>>>>>>>>>>>>>> [3] local block number 0 >>>>>>>>>>>>>>>>>>>>>> - - - - - - - - - - - - - - - - - - >>>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>>>>> rows=395, cols=395 >>>>>>>>>>>>>>>>>>>>>> total: nonzeros=32037, allocated nonzeros=32037 >>>>>>>>>>>>>>>>>>>>>> total number of 
mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 1 ------------------------------- >>>>>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0636225, max = 1.33607 >>>>>>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_1_) 4 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>>>>> rows=23918, cols=23918 >>>>>>>>>>>>>>>>>>>>>> total: nonzeros=818732, allocated nonzeros=818732 >>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>>> not using I-node (on process 0) routines >>>>>>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>>>>>>> Down solver (pre-smoother) on level 2 ------------------------------- >>>>>>>>>>>>>>>>>>>>>> KSP Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: chebyshev >>>>>>>>>>>>>>>>>>>>>> Chebyshev: eigenvalue estimates: min = 0.0971369, max = 2.03987 >>>>>>>>>>>>>>>>>>>>>> maximum iterations=2 >>>>>>>>>>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>>>>>>>>>> using nonzero initial guess >>>>>>>>>>>>>>>>>>>>>> using NONE norm type for convergence test >>>>>>>>>>>>>>>>>>>>>> PC Object: (mg_levels_2_) 4 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: jacobi >>>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>>> Up solver (post-smoother) same as down solver (pre-smoother) >>>>>>>>>>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>>>>>>>>>> Matrix Object: 4 MPI processes >>>>>>>>>>>>>>>>>>>>>> type: mpiaij >>>>>>>>>>>>>>>>>>>>>> rows=262144, cols=262144 >>>>>>>>>>>>>>>>>>>>>> total: nonzeros=1835008, allocated nonzeros=1835008 >>>>>>>>>>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>>>>>>>>>> #PETSc Option Table entries: >>>>>>>>>>>>>>>>>>>>>> -ksp_view >>>>>>>>>>>>>>>>>>>>>> -options_left >>>>>>>>>>>>>>>>>>>>>> -pc_gamg_agg_nsmooths 1 >>>>>>>>>>>>>>>>>>>>>> -pc_type gamg >>>>>>>>>>>>>>>>>>>>>> #End of PETSc Option Table entries >>>>>>>>>>>>>>>>>>>>>> There are no unused options. >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Thank you, >>>>>>>>>>>>>>>>>>>>>> Michele >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>> <2decomp_fft-1.5.847-modified.tar.gz> >>>>>>>>>> >>>>>>>>>> > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jedbrown at mcs.anl.gov Tue Aug 13 21:23:20 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 13 Aug 2013 21:23:20 -0500 Subject: [petsc-users] GAMG speed In-Reply-To: <520AE004.1010803@uci.edu> References: <51FAA56D.60106@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu> <520AE004.1010803@uci.edu> Message-ID: <87y585ko2v.fsf@mcs.anl.gov> Michele Rosso writes: > The matrix arises from discretization of the Poisson equation in > incompressible flow calculations. Can you try the two runs below and send -log_summary? -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -pc_mg_type full -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From mrosso at uci.edu Tue Aug 13 21:57:36 2013 From: mrosso at uci.edu (Michele Rosso) Date: Tue, 13 Aug 2013 19:57:36 -0700 Subject: [petsc-users] GAMG speed In-Reply-To: <87y585ko2v.fsf@mcs.anl.gov> References: <51FAA56D.60106@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu> <520AE004.1010803@uci.edu> <87y585ko2v.fsf@mcs.anl.gov> Message-ID: <520AF220.3070900@uci.edu> Hi Jed, I attached the output for both the runs you suggested. At the beginning of each file I included the options I used. On a side note, I tried to run with a grid of 256^3 (exactly as before) but with less levels, i.e. 3 instead of 4 or 5. My system stops the run because of an Out Of Memory condition. It is really odd since I have not changed anything except - pc_mg_levels. I cannot send you any output since there is none. Do you have any guess where the problem comes from? Thanks, Michele On 08/13/2013 07:23 PM, Jed Brown wrote: > Michele Rosso writes: >> The matrix arises from discretization of the Poisson equation in >> incompressible flow calculations. > Can you try the two runs below and send -log_summary? > > -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 > > > -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -pc_mg_type full -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 0 KSP Residual norm 3.653965664551e-05 1 KSP Residual norm 1.910638846094e-06 2 KSP Residual norm 8.690440116045e-08 3 KSP Residual norm 3.732213639394e-09 4 KSP Residual norm 1.964855338020e-10 Linear solve converged due to CONVERGED_RTOL iterations 4 KSP Object: 8 MPI processes type: cg maximum iterations=10000 tolerances: relative=0.0001, absolute=1e-50, divergence=10000 left preconditioning has attached null space using nonzero initial guess using UNPRECONDITIONED norm type for convergence test PC Object: 8 MPI processes type: mg MG: type is MULTIPLICATIVE, levels=5 cycles=v Cycles per PCApply=1 Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 8 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 8 MPI processes type: redundant Redundant preconditioner: First (color=0) of 8 PCs follows KSP Object: (mg_coarse_redundant_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_redundant_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 8.69546 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=512, cols=512 package used to perform factorization: petsc total: nonzeros=120206, allocated nonzeros=120206 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=512, cols=512 total: nonzeros=13824, allocated nonzeros=13824 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=512, cols=512 total: nonzeros=13824, allocated nonzeros=13824 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 32 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 8 MPI processes type: richardson Richardson: damping factor=1 maximum iterations=1 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_1_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=4096, cols=4096 total: nonzeros=110592, allocated nonzeros=110592 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 8 MPI processes type: richardson Richardson: damping factor=1 maximum iterations=1 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using 
nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_2_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=32768, cols=32768 total: nonzeros=884736, allocated nonzeros=884736 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 8 MPI processes type: richardson Richardson: damping factor=1 maximum iterations=1 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_3_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=262144, cols=262144 total: nonzeros=7077888, allocated nonzeros=7077888 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 8 MPI processes type: richardson Richardson: damping factor=1 maximum iterations=1 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_4_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=2097152, cols=2097152 total: nonzeros=14680064, allocated nonzeros=14680064 total number of mallocs used during MatSetValues calls =0 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=2097152, cols=2097152 total: nonzeros=14680064, allocated nonzeros=14680064 total number of mallocs used during MatSetValues calls =0 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./hit on a arch-cray-xt5-pkgs-opt named nid13790 with 8 processors, by Unknown Tue Aug 13 22:37:31 2013 Using Petsc Release Version 3.4.2, Jul, 02, 2013 Max Max/Min Avg Total Time (sec): 4.048e+00 1.00012 4.048e+00 Objects: 2.490e+02 1.00000 2.490e+02 Flops: 2.663e+08 1.00000 2.663e+08 2.130e+09 Flops/sec: 6.579e+07 1.00012 6.579e+07 5.263e+08 MPI Messages: 6.820e+02 1.00000 6.820e+02 5.456e+03 MPI Message Lengths: 8.245e+06 1.00000 1.209e+04 6.596e+07 MPI Reductions: 4.580e+02 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 4.0480e+00 100.0% 2.1305e+09 100.0% 5.456e+03 100.0% 1.209e+04 100.0% 4.570e+02 99.8% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %f - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %f %M %L %R %T %f %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecTDot 12 1.0 2.9428e-02 1.2 6.29e+06 1.0 0.0e+00 0.0e+00 1.2e+01 1 2 0 0 3 1 2 0 0 3 1710 VecNorm 9 1.0 1.0796e-02 1.2 4.72e+06 1.0 0.0e+00 0.0e+00 9.0e+00 0 2 0 0 2 0 2 0 0 2 3497 VecScale 24 1.0 2.4652e-04 1.1 1.99e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6442 VecCopy 3 1.0 5.0740e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 116 1.0 1.4349e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 12 1.0 2.8027e-02 1.0 6.29e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1796 VecAYPX 29 1.0 3.0655e-02 1.4 4.16e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1085 VecScatterBegin 123 1.0 3.5391e-02 1.1 0.00e+00 0.0 3.5e+03 1.2e+04 0.0e+00 1 0 65 66 0 1 0 65 66 0 0 VecScatterEnd 123 1.0 2.5395e-02 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 31 1.0 2.3556e-01 1.0 5.62e+07 1.0 1.0e+03 2.3e+04 0.0e+00 6 21 19 36 0 6 21 19 36 0 1908 MatMultAdd 24 1.0 5.9044e-02 1.0 1.21e+07 1.0 5.8e+02 2.8e+03 0.0e+00 1 5 11 2 0 1 5 11 2 0 1644 MatMultTranspose 28 1.0 7.4601e-02 1.1 1.42e+07 1.0 6.7e+02 2.8e+03 0.0e+00 2 5 12 3 0 2 5 12 3 0 1518 MatSolve 6 1.0 3.8311e-03 1.0 1.44e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 3006 MatSOR 48 1.0 5.8050e-01 1.0 1.01e+08 1.0 8.6e+02 1.5e+04 4.8e+01 14 38 16 19 10 14 38 16 19 11 1390 MatLUFactorSym 1 1.0 3.0620e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 1 0 0 0 0 1 0 MatLUFactorNum 1 1.0 2.4665e-02 1.0 1.95e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 7 0 0 0 1 7 0 0 0 6329 MatAssemblyBegin 20 1.0 2.4351e-02 6.7 0.00e+00 0.0 0.0e+00 0.0e+00 2.2e+01 0 0 0 0 5 0 0 0 0 5 0 MatAssemblyEnd 20 1.0 1.3176e-01 1.0 0.00e+00 0.0 5.6e+02 2.1e+03 7.2e+01 3 0 10 2 16 3 0 10 2 16 0 MatGetRowIJ 1 1.0 1.1516e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 4.1008e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 16 1.3 1.0209e-03 2.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 3 0 0 0 0 3 0 MatPtAP 4 1.0 6.4001e-01 1.0 4.06e+07 1.0 1.1e+03 1.7e+04 1.0e+02 16 15 21 30 22 16 15 21 30 22 507 MatPtAPSymbolic 4 1.0 3.7003e-01 1.0 0.00e+00 0.0 7.2e+02 2.0e+04 6.0e+01 9 0 13 22 13 9 0 13 22 13 0 MatPtAPNumeric 4 1.0 2.7004e-01 1.0 4.06e+07 1.0 4.2e+02 1.2e+04 4.0e+01 7 15 8 8 9 7 15 8 8 9 1202 MatGetRedundant 1 1.0 7.9393e-04 1.0 0.00e+00 0.0 1.7e+02 7.1e+03 4.0e+00 0 0 3 2 1 0 0 3 2 1 0 MatGetLocalMat 4 1.0 3.9521e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 1 0 0 0 2 1 0 0 0 2 0 MatGetBrAoCol 4 1.0 1.7719e-02 1.0 0.00e+00 0.0 4.3e+02 2.7e+04 8.0e+00 0 0 8 18 2 0 0 8 18 2 0 MatGetSymTrans 8 1.0 1.3007e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSetUp 7 1.0 1.3097e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 5 0 0 0 0 5 0 KSPSolve 2 1.0 1.0450e+00 1.0 2.04e+08 1.0 3.4e+03 1.2e+04 7.5e+01 26 77 62 60 16 26 77 62 60 
16 1563 PCSetUp 1 1.0 8.6248e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 21 23 35 32 69 21 23 35 32 69 576 PCApply 6 1.0 8.4384e-01 1.0 1.61e+08 1.0 3.2e+03 9.0e+03 4.8e+01 21 60 59 44 10 21 60 59 44 11 1523 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Container 1 1 564 0 Vector 99 99 47537368 0 Vector Scatter 21 21 22092 0 Matrix 37 37 75834272 0 Matrix Null Space 1 1 596 0 Distributed Mesh 5 5 2740736 0 Bipartite Graph 10 10 7920 0 Index Set 50 50 1546832 0 IS L to G Mapping 5 5 1361108 0 Krylov Solver 7 7 8616 0 DMKSP interface 3 3 1944 0 Preconditioner 7 7 6672 0 Viewer 3 2 1456 0 ======================================================================================================================== Average time to get PetscTime(): 9.53674e-08 Average time for MPI_Barrier(): 2.43187e-06 Average time for zero size MPI_Send(): 2.38419e-06 #PETSc Option Table entries: -ksp_converged_reason -ksp_monitor -ksp_view -log_summary -mg_levels_ksp_max_it 1 -mg_levels_ksp_type richardson -options_left -pc_mg_galerkin -pc_mg_levels 5 -pc_type mg #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure run at: Wed Jul 31 22:48:06 2013 Configure options: --known-level1-dcache-size=65536 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=2 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=0 --known-mpi-c-double-complex=0 --with-cc=cc --with-cxx=CC --with-fc=ftn --with-clib-autodetect=0 --with-cxxlib-autodetect=0 --with-fortranlib-autodetect=0 --with-debugging=0 --COPTFLAGS="-fastsse -Mipa=fast -mp" --CXXOPTFLAGS="-fastsse -Mipa=fast -mp" --FOPTFLAGS="-fastsse -Mipa=fast -mp" --with-blas-lapack-lib="-L/opt/acml/4.4.0/pgi64/lib -lacml -lacml_mv" --with-shared-libraries=0 --with-x=0 --with-batch --known-mpi-shared-libraries=0 PETSC_ARCH=arch-cray-xt5-pkgs-opt ----------------------------------------- Libraries compiled on Wed Jul 31 22:48:06 2013 on krakenpf1 Machine characteristics: Linux-2.6.27.48-0.12.1_1.0301.5943-cray_ss_s-x86_64-with-SuSE-11-x86_64 Using PETSc directory: /nics/c/home/mrosso/LIBS/petsc-3.4.2 Using PETSc arch: arch-cray-xt5-pkgs-opt ----------------------------------------- Using C compiler: cc -fastsse -Mipa=fast -mp ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: ftn -fastsse -Mipa=fast -mp ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/include -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/include -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/include -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/include -I/opt/cray/portals/2.2.0-1.0301.26633.6.9.ss/include -I/opt/cray/pmi/2.1.4-1.0000.8596.15.1.ss/include -I/opt/cray/mpt/5.3.5/xt/seastar/mpich2-pgi/109/include -I/opt/acml/4.4.0/pgi64/include -I/opt/xt-libsci/11.0.04/pgi/109/istanbul/include -I/opt/fftw/3.3.0.0/x86_64/include -I/usr/include/alps 
----------------------------------------- Using C linker: cc Using Fortran linker: ftn Using libraries: -Wl,-rpath,/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib -L/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib -lpetsc -L/opt/acml/4.4.0/pgi64/lib -lacml -lacml_mv -lpthread -ldl ----------------------------------------- #PETSc Option Table entries: -ksp_converged_reason -ksp_monitor -ksp_view -log_summary -mg_levels_ksp_max_it 1 -mg_levels_ksp_type richardson -options_left -pc_mg_galerkin -pc_mg_levels 5 -pc_type mg #End of PETSc Option Table entries There are no unused options. -------------- next part -------------- -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -pc_mg_type full 0 KSP Residual norm 3.654533581988e-05 1 KSP Residual norm 8.730776244351e-07 2 KSP Residual norm 3.474626061661e-08 3 KSP Residual norm 1.813665557493e-09 Linear solve converged due to CONVERGED_RTOL iterations 3 KSP Object: 8 MPI processes type: cg maximum iterations=10000 tolerances: relative=0.0001, absolute=1e-50, divergence=10000 left preconditioning has attached null space using nonzero initial guess using UNPRECONDITIONED norm type for convergence test PC Object: 8 MPI processes type: mg MG: type is FULL, levels=5 cycles=v Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 8 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 8 MPI processes type: redundant Redundant preconditioner: First (color=0) of 8 PCs follows KSP Object: (mg_coarse_redundant_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_redundant_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 8.69546 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=512, cols=512 package used to perform factorization: petsc total: nonzeros=120206, allocated nonzeros=120206 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=512, cols=512 total: nonzeros=13824, allocated nonzeros=13824 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=512, cols=512 total: nonzeros=13824, allocated nonzeros=13824 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 32 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 8 MPI processes type: richardson Richardson: damping factor=1 maximum iterations=1 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_1_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = 
precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=4096, cols=4096 total: nonzeros=110592, allocated nonzeros=110592 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 8 MPI processes type: richardson Richardson: damping factor=1 maximum iterations=1 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_2_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=32768, cols=32768 total: nonzeros=884736, allocated nonzeros=884736 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 3 ------------------------------- KSP Object: (mg_levels_3_) 8 MPI processes type: richardson Richardson: damping factor=1 maximum iterations=1 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_3_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=262144, cols=262144 total: nonzeros=7077888, allocated nonzeros=7077888 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 4 ------------------------------- KSP Object: (mg_levels_4_) 8 MPI processes type: richardson Richardson: damping factor=1 maximum iterations=1 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_4_) 8 MPI processes type: sor SOR: type = local_symmetric, iterations = 1, local iterations = 1, omega = 1 linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=2097152, cols=2097152 total: nonzeros=14680064, allocated nonzeros=14680064 total number of mallocs used during MatSetValues calls =0 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Matrix Object: 8 MPI processes type: mpiaij rows=2097152, cols=2097152 total: nonzeros=14680064, allocated nonzeros=14680064 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./hit on a arch-cray-xt5-pkgs-opt named nid14615 with 8 processors, by Unknown Tue Aug 13 22:44:16 2013 Using Petsc Release Version 3.4.2, Jul, 02, 2013 Max Max/Min Avg Total Time (sec): 4.261e+00 1.00012 4.261e+00 Objects: 2.950e+02 1.00000 2.950e+02 Flops: 3.322e+08 1.00000 3.322e+08 2.658e+09 Flops/sec: 7.797e+07 1.00012 7.796e+07 6.237e+08 MPI Messages: 1.442e+03 1.00000 1.442e+03 1.154e+04 MPI Message Lengths: 1.018e+07 1.00000 7.057e+03 8.141e+07 MPI Reductions: 5.460e+02 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 4.2609e+00 100.0% 2.6575e+09 100.0% 1.154e+04 100.0% 7.057e+03 100.0% 5.450e+02 99.8% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %f - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %f %M %L %R %T %f %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecTDot 10 1.0 2.4743e-02 1.1 5.24e+06 1.0 0.0e+00 0.0e+00 1.0e+01 1 2 0 0 2 1 2 0 0 2 1695 VecNorm 8 1.0 9.9294e-03 1.3 4.19e+06 1.0 0.0e+00 0.0e+00 8.0e+00 0 1 0 0 1 0 1 0 0 1 3379 VecScale 70 1.0 4.9663e-04 1.1 3.86e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6222 VecCopy 3 1.0 5.0108e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 271 1.0 1.0437e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 10 1.0 2.3400e-02 1.0 5.24e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1792 VecAYPX 54 1.0 2.5038e-02 1.5 3.55e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1133 VecScatterBegin 324 1.0 4.1335e-02 1.1 0.00e+00 0.0 9.6e+03 6.1e+03 0.0e+00 1 0 83 72 0 1 0 83 72 0 0 VecScatterEnd 324 1.0 4.4111e-02 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatMult 76 1.0 2.8557e-01 1.1 6.73e+07 1.0 2.5e+03 9.8e+03 0.0e+00 6 20 22 31 0 6 20 22 31 0 1884 MatMultAdd 50 1.0 5.5734e-02 1.0 1.15e+07 1.0 1.2e+03 1.5e+03 0.0e+00 1 3 10 2 0 1 3 10 2 0 1657 MatMultTranspose 74 1.0 1.2116e-01 1.2 2.37e+07 1.0 1.8e+03 1.9e+03 0.0e+00 3 7 15 4 0 3 7 15 4 0 1563 MatSolve 25 1.0 1.3877e-02 1.0 6.00e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 2 0 0 0 0 2 0 0 0 3458 MatSOR 100 1.0 7.1429e-01 1.1 1.45e+08 1.0 2.6e+03 9.4e+03 1.4e+02 16 44 23 30 26 16 44 23 30 26 1628 MatLUFactorSym 1 1.0 3.0639e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 1 0 0 0 0 1 0 MatLUFactorNum 1 1.0 2.4523e-02 1.0 1.95e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 6 0 0 0 1 6 0 0 0 6366 MatAssemblyBegin 20 1.0 3.1168e-02 6.9 0.00e+00 0.0 0.0e+00 0.0e+00 2.2e+01 0 0 0 0 4 0 0 0 0 4 0 MatAssemblyEnd 20 1.0 1.3784e-01 1.1 0.00e+00 0.0 5.6e+02 2.1e+03 7.2e+01 3 0 5 1 13 3 0 5 1 13 0 MatGetRowIJ 1 1.0 1.1015e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 4.0793e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 16 1.3 1.0140e-03 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 2 0 0 0 0 2 0 MatPtAP 4 1.0 6.4115e-01 1.0 4.06e+07 1.0 1.1e+03 1.7e+04 1.0e+02 15 12 10 24 18 15 12 10 24 18 506 MatPtAPSymbolic 4 1.0 3.7106e-01 1.0 0.00e+00 0.0 7.2e+02 2.0e+04 6.0e+01 9 0 6 18 11 9 0 6 18 11 0 MatPtAPNumeric 4 1.0 2.7011e-01 1.0 4.06e+07 1.0 4.2e+02 1.2e+04 4.0e+01 6 12 4 6 7 6 12 4 6 7 1202 MatGetRedundant 1 1.0 8.1611e-04 1.0 0.00e+00 0.0 1.7e+02 7.1e+03 4.0e+00 0 0 1 1 1 0 0 1 1 1 0 MatGetLocalMat 4 1.0 3.9911e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 1 0 0 0 1 1 0 0 0 1 0 MatGetBrAoCol 4 1.0 1.7765e-02 1.0 0.00e+00 0.0 4.3e+02 2.7e+04 8.0e+00 0 0 4 14 1 0 0 4 14 1 0 MatGetSymTrans 8 1.0 1.3194e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSetUp 7 1.0 1.4666e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 4 0 0 0 0 4 0 KSPSolve 2 1.0 1.2287e+00 1.0 2.70e+08 1.0 9.5e+03 5.8e+03 1.6e+02 29 81 82 68 30 29 81 82 68 
30 1758 PCSetUp 1 1.0 8.6414e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 20 19 17 26 58 20 19 17 26 58 575 PCApply 5 1.0 1.0571e+00 1.0 2.33e+08 1.0 9.3e+03 4.9e+03 1.4e+02 24 70 81 56 26 24 70 81 56 26 1764 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Container 1 1 564 0 Vector 145 145 58892872 0 Vector Scatter 21 21 22092 0 Matrix 37 37 75834272 0 Matrix Null Space 1 1 596 0 Distributed Mesh 5 5 2740736 0 Bipartite Graph 10 10 7920 0 Index Set 50 50 1546832 0 IS L to G Mapping 5 5 1361108 0 Krylov Solver 7 7 8616 0 DMKSP interface 3 3 1944 0 Preconditioner 7 7 6672 0 Viewer 3 2 1456 0 ======================================================================================================================== Average time to get PetscTime(): 9.53674e-08 Average time for MPI_Barrier(): 6.58035e-06 Average time for zero size MPI_Send(): 4.02331e-06 #PETSc Option Table entries: -ksp_converged_reason -ksp_monitor -ksp_view -log_summary -mg_levels_ksp_max_it 1 -mg_levels_ksp_type richardson -options_left -pc_mg_galerkin -pc_mg_levels 5 -pc_mg_type full -pc_type mg #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure run at: Wed Jul 31 22:48:06 2013 Configure options: --known-level1-dcache-size=65536 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=2 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=0 --known-mpi-c-double-complex=0 --with-cc=cc --with-cxx=CC --with-fc=ftn --with-clib-autodetect=0 --with-cxxlib-autodetect=0 --with-fortranlib-autodetect=0 --with-debugging=0 --COPTFLAGS="-fastsse -Mipa=fast -mp" --CXXOPTFLAGS="-fastsse -Mipa=fast -mp" --FOPTFLAGS="-fastsse -Mipa=fast -mp" --with-blas-lapack-lib="-L/opt/acml/4.4.0/pgi64/lib -lacml -lacml_mv" --with-shared-libraries=0 --with-x=0 --with-batch --known-mpi-shared-libraries=0 PETSC_ARCH=arch-cray-xt5-pkgs-opt ----------------------------------------- Libraries compiled on Wed Jul 31 22:48:06 2013 on krakenpf1 Machine characteristics: Linux-2.6.27.48-0.12.1_1.0301.5943-cray_ss_s-x86_64-with-SuSE-11-x86_64 Using PETSc directory: /nics/c/home/mrosso/LIBS/petsc-3.4.2 Using PETSc arch: arch-cray-xt5-pkgs-opt ----------------------------------------- Using C compiler: cc -fastsse -Mipa=fast -mp ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: ftn -fastsse -Mipa=fast -mp ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/include -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/include -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/include -I/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/include -I/opt/cray/portals/2.2.0-1.0301.26633.6.9.ss/include -I/opt/cray/pmi/2.1.4-1.0000.8596.15.1.ss/include -I/opt/cray/mpt/5.3.5/xt/seastar/mpich2-pgi/109/include -I/opt/acml/4.4.0/pgi64/include -I/opt/xt-libsci/11.0.04/pgi/109/istanbul/include -I/opt/fftw/3.3.0.0/x86_64/include 
-I/usr/include/alps ----------------------------------------- Using C linker: cc Using Fortran linker: ftn Using libraries: -Wl,-rpath,/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib -L/nics/c/home/mrosso/LIBS/petsc-3.4.2/arch-cray-xt5-pkgs-opt/lib -lpetsc -L/opt/acml/4.4.0/pgi64/lib -lacml -lacml_mv -lpthread -ldl ----------------------------------------- #PETSc Option Table entries: -ksp_converged_reason -ksp_monitor -ksp_view -log_summary -mg_levels_ksp_max_it 1 -mg_levels_ksp_type richardson -options_left -pc_mg_galerkin -pc_mg_levels 5 -pc_mg_type full -pc_type mg #End of PETSc Option Table entries There are no unused options. Application 6640063 resources: utime ~45s, stime ~2s From bsmith at mcs.anl.gov Tue Aug 13 22:03:15 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 13 Aug 2013 22:03:15 -0500 Subject: [petsc-users] GAMG speed In-Reply-To: <520AF220.3070900@uci.edu> References: <51FAA56D.60106@uci.edu> <51FAD784.6080302@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu> <520AE004.1010803@uci.edu> <87y585ko2v.fsf@mcs.anl.gov> <520AF220.3070900@uci.edu> Message-ID: <14431CA7-5DB3-4C43-AE0E-60006E20C27A@mcs.anl.gov> On Aug 13, 2013, at 9:57 PM, Michele Rosso wrote: > Hi Jed, > > I attached the output for both the runs you suggested. At the beginning of each file I included the options I used. > > On a side note, I tried to run with a grid of 256^3 (exactly as before) but with less levels, i.e. 3 instead of 4 or 5. > My system stops the run because of an Out Of Memory condition. It is really odd since I have not changed anything except > - pc_mg_levels. I cannot send you any output since there is none. Do you have any guess where the problem comes from? By default it uses a direct solver (maybe even sequential) for the coarsest level; since the coarse level is big the direct solver requires too much memory. You could install PETSc with the ./configure option --download-superlu_dist and run with -mg_coarse_pc_type lu -mg_coarse_pc_factor_mat_solver_package superlu_dist Barry > Thanks, > > Michele > > On 08/13/2013 07:23 PM, Jed Brown wrote: >> Michele Rosso >> writes: >> >>> The matrix arises from discretization of the Poisson equation in >>> incompressible flow calculations. >>> >> Can you try the two runs below and send -log_summary? 
>>
>> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1
>>
>>
>> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -pc_mg_type full
>>
>
>

From jedbrown at mcs.anl.gov Tue Aug 13 22:36:56 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 13 Aug 2013 22:36:56 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <520AF220.3070900@uci.edu>
References: <51FAA56D.60106@uci.edu> <5C6A92BA-A999-45C7-916D-809ED023A70C@mcs.anl.gov> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu> <520AE004.1010803@uci.edu> <87y585ko2v.fsf@mcs.anl.gov> <520AF220.3070900@uci.edu>
Message-ID: <87k3jpkko7.fsf@mcs.anl.gov>

Michele Rosso writes:
> Hi Jed,
>
> I attached the output for both the runs you suggested. At the beginning
> of each file I included the options I used.
>
> On a side note, I tried to run with a grid of 256^3 (exactly as before)
> but with less levels, i.e. 3 instead of 4 or 5.
> My system stops the run because of an Out Of Memory condition. It is
> really odd since I have not changed anything except
> - pc_mg_levels. I cannot send you any output since there is none. Do
> you have any guess where the problem comes from?

The selected algorithm does a direct solve on the coarse grid. Each
time you reduce the number of levels, the coarse grid size grows by a
factor of 8. Going from 5 to 3 levels is going from a 16^3 coarse grid
to a 64^3 coarse grid. Applying a direct solver to the latter ends up
using a lot of memory. I think this is not worth bothering with and it
might even be (slightly) faster to use 6 levels. That is not where the
time is being spent.

> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg
> -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson
> -mg_levels_ksp_max_it 1
>
>
>
> 0 KSP Residual norm 3.653965664551e-05
> 1 KSP Residual norm 1.910638846094e-06
> 2 KSP Residual norm 8.690440116045e-08
> 3 KSP Residual norm 3.732213639394e-09
> 4 KSP Residual norm 1.964855338020e-10

This converges well.
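(Roughly quantified, and assuming both solves in the log behave alike: the
residual falls from 3.65e-05 to 1.96e-10 in four iterations, an average
contraction factor of (1.96e-10/3.65e-05)^(1/4), i.e. about 0.05 per
iteration, so each V-cycle-preconditioned CG iteration gains more than a
digit of accuracy.)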
> Max Max/Min Avg Total
> Time (sec): 4.048e+00 1.00012 4.048e+00
> Objects: 2.490e+02 1.00000 2.490e+02
> Flops: 2.663e+08 1.00000 2.663e+08 2.130e+09
> Flops/sec: 6.579e+07 1.00012 6.579e+07 5.263e+08
> MPI Messages: 6.820e+02 1.00000 6.820e+02 5.456e+03
> MPI Message Lengths: 8.245e+06 1.00000 1.209e+04 6.596e+07
> MPI Reductions: 4.580e+02 1.00000

> VecTDot 12 1.0 2.9428e-02 1.2 6.29e+06 1.0 0.0e+00 0.0e+00 1.2e+01 1 2 0 0 3 1 2 0 0 3 1710
> VecNorm 9 1.0 1.0796e-02 1.2 4.72e+06 1.0 0.0e+00 0.0e+00 9.0e+00 0 2 0 0 2 0 2 0 0 2 3497
> VecScale 24 1.0 2.4652e-04 1.1 1.99e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6442
> VecCopy 3 1.0 5.0740e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
> VecSet 116 1.0 1.4349e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
> VecAXPY 12 1.0 2.8027e-02 1.0 6.29e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1796
> VecAYPX 29 1.0 3.0655e-02 1.4 4.16e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1085
> VecScatterBegin 123 1.0 3.5391e-02 1.1 0.00e+00 0.0 3.5e+03 1.2e+04 0.0e+00 1 0 65 66 0 1 0 65 66 0 0
> VecScatterEnd 123 1.0 2.5395e-02 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
> MatMult 31 1.0 2.3556e-01 1.0 5.62e+07 1.0 1.0e+03 2.3e+04 0.0e+00 6 21 19 36 0 6 21 19 36 0 1908
> MatMultAdd 24 1.0 5.9044e-02 1.0 1.21e+07 1.0 5.8e+02 2.8e+03 0.0e+00 1 5 11 2 0 1 5 11 2 0 1644
> MatMultTranspose 28 1.0 7.4601e-02 1.1 1.42e+07 1.0 6.7e+02 2.8e+03 0.0e+00 2 5 12 3 0 2 5 12 3 0 1518
> MatSolve 6 1.0 3.8311e-03 1.0 1.44e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 3006
> MatSOR 48 1.0 5.8050e-01 1.0 1.01e+08 1.0 8.6e+02 1.5e+04 4.8e+01 14 38 16 19 10 14 38 16 19 11 1390

Most of the solve time is in MatSOR and MatMult. That's expected since
the subdomains are pretty big.

> MatLUFactorSym 1 1.0 3.0620e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 1 0 0 0 0 1 0
> MatLUFactorNum 1 1.0 2.4665e-02 1.0 1.95e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 7 0 0 0 1 7 0 0 0 6329
> MatAssemblyBegin 20 1.0 2.4351e-02 6.7 0.00e+00 0.0 0.0e+00 0.0e+00 2.2e+01 0 0 0 0 5 0 0 0 0 5 0
> MatAssemblyEnd 20 1.0 1.3176e-01 1.0 0.00e+00 0.0 5.6e+02 2.1e+03 7.2e+01 3 0 10 2 16 3 0 10 2 16 0
> MatGetRowIJ 1 1.0 1.1516e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
> MatGetOrdering 1 1.0 4.1008e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0
> MatView 16 1.3 1.0209e-03 2.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 3 0 0 0 0 3 0
> MatPtAP 4 1.0 6.4001e-01 1.0 4.06e+07 1.0 1.1e+03 1.7e+04 1.0e+02 16 15 21 30 22 16 15 21 30 22 507

MatPtAP dominates the setup time. For profiling, you could register a
stage (PetscLogStageRegister) and time the setup separately from the
solve.
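A minimal sketch of that, in case it helps (untested; ksp, b, and x
stand for the objects your code already has, and the stage names are
arbitrary):

  PetscLogStage stage_setup, stage_solve;
  ierr = PetscLogStageRegister("MG Setup", &stage_setup);CHKERRQ(ierr);
  ierr = PetscLogStageRegister("MG Solve", &stage_solve);CHKERRQ(ierr);

  ierr = PetscLogStagePush(stage_setup);CHKERRQ(ierr);
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);   /* forces PCSetUp now, so the Galerkin MatPtAP is logged in this stage */
  ierr = PetscLogStagePop();CHKERRQ(ierr);

  ierr = PetscLogStagePush(stage_solve);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);  /* only the cycling is timed here */
  ierr = PetscLogStagePop();CHKERRQ(ierr);

-log_summary will then break the event table down per stage, so the
one-time setup cost is separated from the per-timestep solve cost.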
> MatPtAPSymbolic 4 1.0 3.7003e-01 1.0 0.00e+00 0.0 7.2e+02 2.0e+04 6.0e+01 9 0 13 22 13 9 0 13 22 13 0
> MatPtAPNumeric 4 1.0 2.7004e-01 1.0 4.06e+07 1.0 4.2e+02 1.2e+04 4.0e+01 7 15 8 8 9 7 15 8 8 9 1202
> MatGetRedundant 1 1.0 7.9393e-04 1.0 0.00e+00 0.0 1.7e+02 7.1e+03 4.0e+00 0 0 3 2 1 0 0 3 2 1 0
> MatGetLocalMat 4 1.0 3.9521e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 1 0 0 0 2 1 0 0 0 2 0
> MatGetBrAoCol 4 1.0 1.7719e-02 1.0 0.00e+00 0.0 4.3e+02 2.7e+04 8.0e+00 0 0 8 18 2 0 0 8 18 2 0
> MatGetSymTrans 8 1.0 1.3007e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
> KSPSetUp 7 1.0 1.3097e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 5 0 0 0 0 5 0
> KSPSolve 2 1.0 1.0450e+00 1.0 2.04e+08 1.0 3.4e+03 1.2e+04 7.5e+01 26 77 62 60 16 26 77 62 60 16 1563
> PCSetUp 1 1.0 8.6248e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 21 23 35 32 69 21 23 35 32 69 576
> PCApply 6 1.0 8.4384e-01 1.0 1.61e+08 1.0 3.2e+03 9.0e+03 4.8e+01 21 60 59 44 10 21 60 59 44 11 1523

Do you know why there are 6 PCApply events? With four iterations of the
Krylov method, there should be only 5 events. Oh, it looks like you do
two solves. Is one of those with a different system? Recall that the
old KSPSolve time was over 3.35 seconds.

> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg
> -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson
> -mg_levels_ksp_max_it 1 -pc_mg_type full
>
> 0 KSP Residual norm 3.654533581988e-05
> 1 KSP Residual norm 8.730776244351e-07
> 2 KSP Residual norm 3.474626061661e-08
> 3 KSP Residual norm 1.813665557493e-09

This converges slightly faster, but ends up not paying off.

> Time (sec): 4.261e+00 1.00012 4.261e+00
> Objects: 2.950e+02 1.00000 2.950e+02
> Flops: 3.322e+08 1.00000 3.322e+08 2.658e+09
> Flops/sec: 7.797e+07 1.00012 7.796e+07 6.237e+08
> MPI Messages: 1.442e+03 1.00000 1.442e+03 1.154e+04
> MPI Message Lengths: 1.018e+07 1.00000 7.057e+03 8.141e+07
> MPI Reductions: 5.460e+02 1.00000

More messages, more work, etc., so not better.

> KSPSolve 2 1.0 1.2287e+00 1.0 2.70e+08 1.0 9.5e+03 5.8e+03 1.6e+02 29 81 82 68 30 29 81 82 68 30 1758
> PCSetUp 1 1.0 8.6414e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 20 19 17 26 58 20 19 17 26 58 575
> PCApply 5 1.0 1.0571e+00 1.0 2.33e+08 1.0 9.3e+03 4.9e+03 1.4e+02 24 70 81 56 26 24 70 81 56 26 1764

It's still entirely possible that you can make Full MG beat V-cycles,
especially if you only need to converge up to discretization error. By
my figures, your good solver takes 12 work units to converge well below
discretization error (after Galerkin setup, but maybe you only need to
do that once?). If you only need to equal truncation error, this can be
brought down to about 5 (probably at best a 2x speedup in parallel).
This would involve a high-order (cubic) FMG prolongation.

Alternatively, you can speed up the implementation (and significantly
reduce memory usage) by creating geometric coarse levels and a
matrix-free implementation of MatSOR and MatMult. (The matrices are
great for experimenting, but if this solver is mission critical and
still a bottleneck, the matrix is an inefficient way to represent the
operator since it has very low arithmetic intensity/requires a lot of
memory bandwidth.) I predict you can probably speed up the solve by
perhaps another factor of 2 with a good matrix-free FMG implementation.
Do you want to go down this path?

From jedbrown at mcs.anl.gov Tue Aug 13 22:38:17 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 13 Aug 2013 22:38:17 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <14431CA7-5DB3-4C43-AE0E-60006E20C27A@mcs.anl.gov>
References: <51FAA56D.60106@uci.edu> <51FADF55.50003@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu> <520AE004.1010803@uci.edu> <87y585ko2v.fsf@mcs.anl.gov> <520AF220.3070900@uci.edu> <14431CA7-5DB3-4C43-AE0E-60006E20C27A@mcs.anl.gov>
Message-ID: <87haetkkly.fsf@mcs.anl.gov>

Barry Smith writes:
> By default it uses a direct solver (maybe even sequential) for the
> coarsest level;

Redundant, in fact, so it uses the full amount of memory on every core.
I think we should probably change this default solve to gather and
scatter rather than compute it redundantly.

From jedbrown at mcs.anl.gov Wed Aug 14 06:35:34 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Wed, 14 Aug 2013 06:35:34 -0500
Subject: [petsc-users] GAMG speed
In-Reply-To: <520B2743.7020405@uci.edu>
References: <51FAA56D.60106@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu> <520AE004.1010803@uci.edu> <87y585ko2v.fsf@mcs.anl.gov> <520AF220.3070900@uci.edu> <87k3jpkko7.fsf@mcs.anl.gov> <520B2743.7020405@uci.edu>
Message-ID: <874nasld2x.fsf@mcs.anl.gov>

Please always use "reply-all" so that your messages go to the list.
This is standard mailing list etiquette. It is important to preserve
threading for people who find this discussion later and so that we do
not waste our time re-answering the same questions that have already
been answered in private side-conversations. You'll likely get an
answer faster that way too.

Michele Rosso writes:
> Jed,
>
> thank you very much for the detailed analysis.
> I confirm that
>
> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg
> -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson
> -mg_levels_ksp_max_it 1
>
> results in a faster solve with the 256^3 grid (it finally beats CG + ICC).
>
> Yes, I perform different solves: I setup matrix and KSP only once at the beginning of the run and then I re-use them
> at each time step. The rhs term changes during the simulation though. For now (single phase flow) the matrix does not change,
> but I will be dealing soon with a multiphase flow and thus not even the matrix values will be constant in time (it will be a variable coefficients Poisson Equation).

Okay, the coefficient variation from the multiphase flow can
drastically change the characteristics of the solve.

> I need to solve up to the discretization error, so maybe FMG is worth a try.
> The matrix-free approach is appealing given that the Poisson solver is really mission critical (basically it accounts for most of the simulation time). > I will use the level set method and ghost fluid method in order to account for the discontinuities at the interface between phases: the computation of the matrix and rhs > values will be influenced by such methods so my only concern is to be sure matrix-free can be used in these circumstances. Matrix-free can be used in principle, but those problems can be several orders of magnitude more ill-conditioned, so don't invest any more time on it right now. Get the discretization set up using assembled matrices, then go through the options we've tried to find an efficient solver. The best choice will likely depend on the details of the formulation, the types of fluids involved, and the geometric configuration of the fluids. > I do not have any prior experience with matrix-free methods so I will have to rely on your assistance for this. > Thank you very much. > > Michele > > > > > On 08/13/2013 08:36 PM, Jed Brown wrote: >> Michele Rosso writes: >> >>> Hi Jed, >>> >>> I attached the output for both the runs you suggested. At the beginning >>> of each file I included the options I used. >>> >>> On a side note, I tried to run with a grid of 256^3 (exactly as before) >>> but with less levels, i.e. 3 instead of 4 or 5. >>> My system stops the run because of an Out Of Memory condition. It is >>> really odd since I have not changed anything except >>> - pc_mg_levels. I cannot send you any output since there is none. Do >>> you have any guess where the problem comes from? >> The selected algorithm does a direct solve on the coarse grid. Each >> time you reduce the number of levels, the coarse grid size grows by a >> factor of 8. Going from 5 to 3 levels is going from a 16^3 coarse grid >> to a 64^3 coarse grid. Applying a direct solver to the latter ends up >> using a lot of memory. I think this is not worth bothering with and it >> might even be (slightly) faster to use 6 levels. That is not where the >> time is being spent. >> >>> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg >>> -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson >>> -mg_levels_ksp_max_it 1 >>> >>> >>> >>> 0 KSP Residual norm 3.653965664551e-05 >>> 1 KSP Residual norm 1.910638846094e-06 >>> 2 KSP Residual norm 8.690440116045e-08 >>> 3 KSP Residual norm 3.732213639394e-09 >>> 4 KSP Residual norm 1.964855338020e-10 >> This converges well. 
>> >>> Max Max/Min Avg Total >>> Time (sec): 4.048e+00 1.00012 4.048e+00 >>> Objects: 2.490e+02 1.00000 2.490e+02 >>> Flops: 2.663e+08 1.00000 2.663e+08 2.130e+09 >>> Flops/sec: 6.579e+07 1.00012 6.579e+07 5.263e+08 >>> MPI Messages: 6.820e+02 1.00000 6.820e+02 5.456e+03 >>> MPI Message Lengths: 8.245e+06 1.00000 1.209e+04 6.596e+07 >>> MPI Reductions: 4.580e+02 1.00000 >>> VecTDot 12 1.0 2.9428e-02 1.2 6.29e+06 1.0 0.0e+00 0.0e+00 1.2e+01 1 2 0 0 3 1 2 0 0 3 1710 >>> VecNorm 9 1.0 1.0796e-02 1.2 4.72e+06 1.0 0.0e+00 0.0e+00 9.0e+00 0 2 0 0 2 0 2 0 0 2 3497 >>> VecScale 24 1.0 2.4652e-04 1.1 1.99e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6442 >>> VecCopy 3 1.0 5.0740e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> VecSet 116 1.0 1.4349e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> VecAXPY 12 1.0 2.8027e-02 1.0 6.29e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1796 >>> VecAYPX 29 1.0 3.0655e-02 1.4 4.16e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1085 >>> VecScatterBegin 123 1.0 3.5391e-02 1.1 0.00e+00 0.0 3.5e+03 1.2e+04 0.0e+00 1 0 65 66 0 1 0 65 66 0 0 >>> VecScatterEnd 123 1.0 2.5395e-02 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatMult 31 1.0 2.3556e-01 1.0 5.62e+07 1.0 1.0e+03 2.3e+04 0.0e+00 6 21 19 36 0 6 21 19 36 0 1908 >>> MatMultAdd 24 1.0 5.9044e-02 1.0 1.21e+07 1.0 5.8e+02 2.8e+03 0.0e+00 1 5 11 2 0 1 5 11 2 0 1644 >>> MatMultTranspose 28 1.0 7.4601e-02 1.1 1.42e+07 1.0 6.7e+02 2.8e+03 0.0e+00 2 5 12 3 0 2 5 12 3 0 1518 >>> MatSolve 6 1.0 3.8311e-03 1.0 1.44e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 3006 >>> MatSOR 48 1.0 5.8050e-01 1.0 1.01e+08 1.0 8.6e+02 1.5e+04 4.8e+01 14 38 16 19 10 14 38 16 19 11 1390 >> Most of the solve time is in MatSOR and MatMult. That's expected since >> the subdomains are pretty big. >> >>> MatLUFactorSym 1 1.0 3.0620e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 1 0 0 0 0 1 0 >>> MatLUFactorNum 1 1.0 2.4665e-02 1.0 1.95e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 7 0 0 0 1 7 0 0 0 6329 >>> MatAssemblyBegin 20 1.0 2.4351e-02 6.7 0.00e+00 0.0 0.0e+00 0.0e+00 2.2e+01 0 0 0 0 5 0 0 0 0 5 0 >>> MatAssemblyEnd 20 1.0 1.3176e-01 1.0 0.00e+00 0.0 5.6e+02 2.1e+03 7.2e+01 3 0 10 2 16 3 0 10 2 16 0 >>> MatGetRowIJ 1 1.0 1.1516e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatGetOrdering 1 1.0 4.1008e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatView 16 1.3 1.0209e-03 2.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 3 0 0 0 0 3 0 >>> MatPtAP 4 1.0 6.4001e-01 1.0 4.06e+07 1.0 1.1e+03 1.7e+04 1.0e+02 16 15 21 30 22 16 15 21 30 22 507 >> MatPtAP dominates the setup time. For profiling, you could register a >> stage (PetscLogStageRegister) and time the setup separately from the >> solve. 
>> >>> MatPtAPSymbolic 4 1.0 3.7003e-01 1.0 0.00e+00 0.0 7.2e+02 2.0e+04 6.0e+01 9 0 13 22 13 9 0 13 22 13 0 >>> MatPtAPNumeric 4 1.0 2.7004e-01 1.0 4.06e+07 1.0 4.2e+02 1.2e+04 4.0e+01 7 15 8 8 9 7 15 8 8 9 1202 >>> MatGetRedundant 1 1.0 7.9393e-04 1.0 0.00e+00 0.0 1.7e+02 7.1e+03 4.0e+00 0 0 3 2 1 0 0 3 2 1 0 >>> MatGetLocalMat 4 1.0 3.9521e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 1 0 0 0 2 1 0 0 0 2 0 >>> MatGetBrAoCol 4 1.0 1.7719e-02 1.0 0.00e+00 0.0 4.3e+02 2.7e+04 8.0e+00 0 0 8 18 2 0 0 8 18 2 0 >>> MatGetSymTrans 8 1.0 1.3007e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> KSPSetUp 7 1.0 1.3097e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 5 0 0 0 0 5 0 >>> KSPSolve 2 1.0 1.0450e+00 1.0 2.04e+08 1.0 3.4e+03 1.2e+04 7.5e+01 26 77 62 60 16 26 77 62 60 16 1563 >>> PCSetUp 1 1.0 8.6248e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 21 23 35 32 69 21 23 35 32 69 576 >>> PCApply 6 1.0 8.4384e-01 1.0 1.61e+08 1.0 3.2e+03 9.0e+03 4.8e+01 21 60 59 44 10 21 60 59 44 11 1523 >> Do you know why there are 6 PCApply events? With four iterations of the >> Krylov method, there should be only 5 events. Oh, it looks like you do >> two solves. Is one of those with a different system? >> >> Recall that the old KSPSolve time was over 3.35 seconds. >> >>> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg >>> -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson >>> -mg_levels_ksp_max_it 1 -pc_mg_type full >>> >>> 0 KSP Residual norm 3.654533581988e-05 >>> 1 KSP Residual norm 8.730776244351e-07 >>> 2 KSP Residual norm 3.474626061661e-08 >>> 3 KSP Residual norm 1.813665557493e-09 >> This converges slightly faster, but ends up not paying off. >> >>> Time (sec): 4.261e+00 1.00012 4.261e+00 >>> Objects: 2.950e+02 1.00000 2.950e+02 >>> Flops: 3.322e+08 1.00000 3.322e+08 2.658e+09 >>> Flops/sec: 7.797e+07 1.00012 7.796e+07 6.237e+08 >>> MPI Messages: 1.442e+03 1.00000 1.442e+03 1.154e+04 >>> MPI Message Lengths: 1.018e+07 1.00000 7.057e+03 8.141e+07 >>> MPI Reductions: 5.460e+02 1.00000 >> More messages, more work, etc., so not better. >> >>> KSPSolve 2 1.0 1.2287e+00 1.0 2.70e+08 1.0 9.5e+03 5.8e+03 1.6e+02 29 81 82 68 30 29 81 82 68 30 1758 >>> PCSetUp 1 1.0 8.6414e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 20 19 17 26 58 20 19 17 26 58 575 >>> PCApply 5 1.0 1.0571e+00 1.0 2.33e+08 1.0 9.3e+03 4.9e+03 1.4e+02 24 70 81 56 26 24 70 81 56 26 1764 >> It's still entirely possible that you can make Full MG beat V-cycles, >> especially if you only need to converge up to discretization error. By >> my figures, your good solver takes 12 work units to converge well below >> discretization error (after Galerkin setup, but maybe you only need to >> do that once?). If you only need to equal truncation error, this can be >> brought down to about 5 (probably at best a 2x speedup in parallel). >> This would involve a high-order (cubic) FMG prolongation. >> >> Alternatively, you can speed up the implementation (and significantly >> reduce memory usage) by creating geometric coarse levels and a >> matrix-free implementation of MatSOR and MatMult. (The matrices are >> great for experimenting, but if this solver is mission critical and >> still a bottleneck, the matrix is an inefficient way to represent the >> operator since it has very low arithmetic intensity/requires a lot of >> memory bandwidth.) I predict you can probably speed up the solve by >> perhaps another factor of 2 with a good matrix-free FMG implementation. >> Do you want to go down this path? 
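The matrix-free direction discussed above can be made concrete with a MatShell whose multiply applies the stencil directly. The following is only an illustrative serial C sketch (a 1-D constant-coefficient stencil with made-up names), not the FMG implementation being proposed; a parallel 3-D version would additionally need a halo exchange before the stencil loop:

#include <petscksp.h>

typedef struct {
  PetscInt  n; /* number of unknowns (serial sketch) */
  PetscReal h; /* grid spacing */
} StencilCtx;

/* y = A*x for the 1-D Laplacian with homogeneous Dirichlet ends,
   applied without ever storing A as an assembled AIJ matrix */
static PetscErrorCode StencilMult(Mat A,Vec x,Vec y)
{
  StencilCtx        *ctx;
  const PetscScalar *xv;
  PetscScalar       *yv;
  PetscInt          i;
  PetscErrorCode    ierr;

  PetscFunctionBegin;
  ierr = MatShellGetContext(A,(void**)&ctx);CHKERRQ(ierr);
  ierr = VecGetArrayRead(x,&xv);CHKERRQ(ierr);
  ierr = VecGetArray(y,&yv);CHKERRQ(ierr);
  for (i = 0; i < ctx->n; i++) {
    PetscScalar xl = (i > 0)          ? xv[i-1] : 0.0;
    PetscScalar xr = (i < ctx->n - 1) ? xv[i+1] : 0.0;
    yv[i] = (2.0*xv[i] - xl - xr)/(ctx->h*ctx->h);
  }
  ierr = VecRestoreArray(y,&yv);CHKERRQ(ierr);
  ierr = VecRestoreArrayRead(x,&xv);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* In the setup code, the shell stands in for the assembled matrix:      */
/*   MatCreateShell(PETSC_COMM_SELF,n,n,n,n,&ctx,&A);                    */
/*   MatShellSetOperation(A,MATOP_MULT,(void(*)(void))StencilMult);      */

The predicted payoff comes from the low memory traffic of such a multiply compared with streaming an assembled AIJ matrix through memory on every application.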
-------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From hsahasra at purdue.edu Wed Aug 14 12:17:39 2013 From: hsahasra at purdue.edu (Harshad Sahasrabudhe) Date: Wed, 14 Aug 2013 13:17:39 -0400 (EDT) Subject: [petsc-users] Extracting data from a Petsc matrix In-Reply-To: <8D451A13-10C3-4442-9522-C49CBB929F03@mcs.anl.gov> Message-ID: <1678258854.12362.1376500659340.JavaMail.root@mailhub027.itcs.purdue.edu> Thanks Barry. How do we get the compiler names from the Python build system? For example, MAGMA make.inc varies depending on whether the compiler is gcc or icc. So is there any easy way to get this information? Harshad ----- Original Message ----- From: "Barry Smith" To: "Harshad Sahasrabudhe" Cc: "Jed Brown" , petsc-users at mcs.anl.gov Sent: Monday, August 12, 2013 5:25:53 PM Subject: Re: [petsc-users] Extracting data from a Petsc matrix On Aug 12, 2013, at 4:05 PM, Harshad Sahasrabudhe wrote: > Hi Jed, > > I am now working to add library support for LU decomposition using MAGMA. I need your help with the following: > > 1) How do I add the options --download-magma, --with-magma, etc. to the configure script for building with MAGMA? Add a new file in config/PETSc/packages (copy one that is already there and modify for magma). > > 2) I have a fair idea how the PETSc code is structured and how to add source code to the impls/ directory. How does PETSc get to know that there is an additional implementation (in this case MAGMA) in this directory? Is there a config file of some sort? Add the new directory name to list of directories in the makefile in that directory and add in MatRegisterAll(). Barry > > Thanks, > Harshad > > ----- Original Message ----- > From: "Jed Brown" > To: hsahasra at purdue.edu, petsc-users at mcs.anl.gov > Sent: Saturday, July 13, 2013 12:43:08 PM > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > "hsahasra at purdue.edu" writes: > >> Hi, >> >> I am working on solving a system of linear equations with square >> matrix. I'm first factoring the matrix using LU decomposition. > > I assume you're solving a dense problem because that is all MAGMA does. > >> I want to do the LU decomposition step using MAGMA on GPUs. MAGMA >> library implements LAPACK functions on a CPU+GPU based system. >> >> So my question is, how do I extract the data from a Petsc Mat so that >> it can be sent to the dgetrf routine in MAGMA. > > MatDenseGetArray > >> Is there any need for duplicating the data for this step? > > You're on your own for storage of factors. Alternatively, you could add > library support so that you could use PCLU and > '-pc_factor_mat_solver_package magma' (or PCFactorSetMatSolverPackage). > Doing this is not a priority for us, but we can provide guidance if you > want to tackle it. From danyang.su at gmail.com Wed Aug 14 12:28:27 2013 From: danyang.su at gmail.com (Danyang Su) Date: Wed, 14 Aug 2013 10:28:27 -0700 Subject: [petsc-users] Result error in repeatedly solving linear equations Message-ID: <520BBE3B.3040401@gmail.com> Hi All, I have many linear equations with the same matrix structure (same non-zero entries) that are derived from a flow problem at different time steps. I feel puzzled that the results are a little different when the solver run repeatedly and one by one. 
Say I have three equations. If I solve all three in a single run, I get the following results:

Equation 1: Iterations 1 norm 0.9457E-02 Result error PETSc vs Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02
Equation 2: Iterations 2 norm 0.2994E-05 Result error PETSc vs Solver2, max 0.1381E-05 min -0.7209E-08 norm 0.2994E-05
Equation 3: Iterations 2 norm 0.3919E-04 Result error PETSc vs Solver2, max 0.9435E-07 min -0.1808E-04 norm 0.3919E-04

But if I solve only one equation per run, restarting the program for each one, the results are like this:

Equation 1: Iterations 1 norm 0.9457E-02 Result error PETSc vs Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02
Equation 2: Iterations 1 norm 0.7949E-05 Result error PETSc vs Solver2, max 0.3501E-05 min -0.8377E-06 norm 0.7949E-05
Equation 3: Iterations 1 norm 0.1980E-04 Result error PETSc vs Solver2, max 0.4168E-08 min -0.9085E-05 norm 0.1980E-04

Note: Solver2 is the original sequential solver used in this flow model.

Though there is no big difference in the solutions of the above equations, I want to know why they differ at all.

For another large linear system with more than 400,000 unknowns and 10,000,000 non-zero entries, the solves need a lot of iterations or fail when run back to back in one program, but only 1 to 2 iterations when each is run separately.

Where does this difference come from?

The sample code is attached below.

Thanks and regards,

Danyang

!***************************************************************************!
!Create matrix, rhs and solver
call MatCreateAIJ(Petsc_Comm_World, Petsc_Decide, Petsc_Decide, nb, nb, nd_nzrow, &
                  Petsc_Null_Integer, nd_nzrow, Petsc_Null_Integer, a, ierr)
call MatSetOption(a,Mat_New_Nonzero_Allocation_Err,Petsc_False,ierr)
call VecCreateMPI(Petsc_Comm_World, Petsc_Decide, nb, b, ierr)
call VecDuplicate(b, x, ierr)
call VecDuplicate(x, u, ierr)
call KSPCreate(Petsc_Comm_World,ksp,ierr)
call KSPSetTolerances(ksp,tol, &
                      PETSC_DEFAULT_DOUBLE_PRECISION, &
                      PETSC_DEFAULT_DOUBLE_PRECISION, &
                      100,ierr)
call KSPSetFromOptions(ksp,ierr)

!Do time loop
do i = 1, nTimeStep
   call MatGetOwnershipRange(a,istart,iend,ierr)
   !Use a separate row index so the time-step counter i is not clobbered
   do irow = istart, iend - 1
      ii = ia_in(irow+1)
      jj = ia_in(irow+2)
      call MatSetValues(a, ione, irow, jj-ii, ja_in(ii:jj-1)-1, a_in(ii:jj-1), Insert_Values, ierr)
   end do
   call MatAssemblyBegin(a, Mat_Final_Assembly, ierr)
   call MatAssemblyEnd(a, Mat_Final_Assembly, ierr)

   call VecGetOwnershipRange(b,istart,iend,ierr)
   call VecSetValues(b, iend-istart, ix(istart+1:iend), b_in(istart+1:iend), Insert_Values, ierr)
   call VecAssemblyBegin(b,ierr)
   call VecAssemblyEnd(b,ierr)

   if(i == 1) then
      call MatConvert(a,MATSAME,MAT_INITIAL_MATRIX,a2,ierr)
   end if
   !call KSPSetOperators(ksp,a,a2,SAME_PRECONDITIONER,ierr)
   call KSPSetOperators(ksp,a,a2,SAME_NONZERO_PATTERN,ierr) !These three patterns make no difference in current codes
   !call KSPSetOperators(ksp,a,a2,DIFFERENT_NONZERO_PATTERN,ierr)

   call KSPSolve(ksp,b,x,ierr)

   call KSPGetResidualNorm(ksp,norm,ierr)
   call KSPGetIterationNumber(ksp,its,ierr)
end do

!Destroy objects
!...
!***************************************************************************!
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From knepley at gmail.com Wed Aug 14 13:11:07 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 14 Aug 2013 13:11:07 -0500 Subject: [petsc-users] Extracting data from a Petsc matrix In-Reply-To: <1678258854.12362.1376500659340.JavaMail.root@mailhub027.itcs.purdue.edu> References: <8D451A13-10C3-4442-9522-C49CBB929F03@mcs.anl.gov> <1678258854.12362.1376500659340.JavaMail.root@mailhub027.itcs.purdue.edu> Message-ID: On Wed, Aug 14, 2013 at 12:17 PM, Harshad Sahasrabudhe wrote: > Thanks Barry. > > How do we get the compiler names from the Python build system? For > example, MAGMA make.inc varies depending on whether the compiler is gcc or > icc. So is there any easy way to get this information? > That is a horrible way to organize a build. Just ignore what they are doing with flags and use the PETSc flags. Take a look at Chaco.py for a simple install. Matt > Harshad > > ----- Original Message ----- > From: "Barry Smith" > To: "Harshad Sahasrabudhe" > Cc: "Jed Brown" , petsc-users at mcs.anl.gov > Sent: Monday, August 12, 2013 5:25:53 PM > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > > On Aug 12, 2013, at 4:05 PM, Harshad Sahasrabudhe > wrote: > > > Hi Jed, > > > > I am now working to add library support for LU decomposition using > MAGMA. I need your help with the following: > > > > 1) How do I add the options --download-magma, --with-magma, etc. to the > configure script for building with MAGMA? > > Add a new file in config/PETSc/packages (copy one that is already > there and modify for magma). > > > > 2) I have a fair idea how the PETSc code is structured and how to add > source code to the impls/ directory. How does PETSc get to know that there > is an additional implementation (in this case MAGMA) in this directory? Is > there a config file of some sort? > > Add the new directory name to list of directories in the makefile in > that directory and add in MatRegisterAll(). > > Barry > > > > > Thanks, > > Harshad > > > > ----- Original Message ----- > > From: "Jed Brown" > > To: hsahasra at purdue.edu, petsc-users at mcs.anl.gov > > Sent: Saturday, July 13, 2013 12:43:08 PM > > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > > > "hsahasra at purdue.edu" writes: > > > >> Hi, > >> > >> I am working on solving a system of linear equations with square > >> matrix. I'm first factoring the matrix using LU decomposition. > > > > I assume you're solving a dense problem because that is all MAGMA does. > > > >> I want to do the LU decomposition step using MAGMA on GPUs. MAGMA > >> library implements LAPACK functions on a CPU+GPU based system. > >> > >> So my question is, how do I extract the data from a Petsc Mat so that > >> it can be sent to the dgetrf routine in MAGMA. > > > > MatDenseGetArray > > > >> Is there any need for duplicating the data for this step? > > > > You're on your own for storage of factors. Alternatively, you could add > > library support so that you could use PCLU and > > '-pc_factor_mat_solver_package magma' (or PCFactorSetMatSolverPackage). > > Doing this is not a priority for us, but we can provide guidance if you > > want to tackle it. > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Wed Aug 14 13:14:19 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 14 Aug 2013 13:14:19 -0500 Subject: [petsc-users] Result error in repeatedly solving linear equations In-Reply-To: <520BBE3B.3040401@gmail.com> References: <520BBE3B.3040401@gmail.com> Message-ID: On Wed, Aug 14, 2013 at 12:28 PM, Danyang Su wrote: > Hi All, > > I have many linear equations with the same matrix structure (same non-zero > entries) that are derived from a flow problem at different time steps. I > feel puzzled that the results are a little different when the solver run > repeatedly and one by one. Say, I have three equations, I can get the > following results if running three equations together > > Equation 1: Iterations 1 norm 0.9457E-02 Result error PETSc vs > Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02 > Equation 2: Iterations 2 norm 0.2994E-05 Result error PETSc vs > Solver2, max 0.1381E-05 min -0.7209E-08 norm 0.2994E-05 > Equation 3: Iterations 2 norm 0.3919E-04 Result error PETSc vs > Solver2, max 0.9435E-07 min -0.1808E-04 norm 0.3919E-04 > > But if I solve only one equation every time, then restart the program to > run another one, the results are like this: > > Equation 1: Iterations 1 norm 0.9457E-02 Result error PETSc vs > Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02 > Equation 2: Iterations 1 norm 0.7949E-05 Result error PETSc vs > Solver2, max 0.3501E-05 min -0.8377E-06 norm 0.7949E-05 > Equation 3: Iterations 1 norm 0.1980E-04 Result error PETSc vs > Solver2, max 0.4168E-08 min -0.9085E-05 norm 0.1980E-04 > > Note: Solver2 is the original sequential solver used in this flow model. > > Though there are no big difference in the solution for the above > equations, I want to know why? > > For another large linear equations with more than 400,000 unknowns and > 10,000,000 non-zero entries, if the equations are solved repeatedly, they > need a lot of iterations or fail, but if the equations are solved one by > one, it only needs 1 to 2 iterations. > > How does this difference come from? > > The sample codes are attached bellow. > > Thanks and regards, > > Danyang > > > !***************************************************************************! > !Create matrix, rhs and solver > call MatCreateAIJ(Petsc_Comm_World, Petsc_Decide, Petsc_Decide, nb, nb, > nd_nzrow, & > Petsc_Null_Integer, nd_nzrow, Petsc_Null_Integer, a, > ierr) > call MatSetOption(a,Mat_New_Nonzero_Allocation_Err,Petsc_False,ierr) > call VecCreateMPI(Petsc_Comm_World, Petsc_Decide, nb, b, ierr) > call VecDuplicate(b, x, ierr) > call VecDuplicate(x, u, ierr) > call KSPCreate(Petsc_Comm_World,ksp,ierr) > call KSPSetTolerances(ksp,tol, & > PETSC_DEFAULT_DOUBLE_PRECISION, & > PETSC_DEFAULT_DOUBLE_PRECISION, & > 100,ierr) > call KSPSetFromOptions(ksp,ierr) > > !Do time loop > do i = 1, nTimeStep > call MatGetOwnershipRange(a,istart,iend,ierr) > do i = istart, iend - 1 > ii = ia_in(i+1) > jj = ia_in(i+2) > call MatSetValues(a, ione, i, jj-ii, ja_in(ii:jj-1)-1, > a_in(ii:jj-1), Insert_Values, ierr) > end do > call MatAssemblyBegin(a, Mat_Final_Assembly, ierr) > call MatAssemblyEnd(a, Mat_Final_Assembly, ierr) > > call VecGetOwnershipRange(b,istart,iend,ierr) > call VecSetValues(b, iend-istart, ix(istart+1:iend), > b_in(istart+1:iend), Insert_Values, ierr) > call VecAssemblyBegin(b,ierr) > call VecAssemblyEnd(b,ierr) > > if(i == 1) then > call MatConvert(a,MATSAME,MAT_INITIAL_MATRIX,a2,ierr) > Why are you doing this? 
> end if > !call KSPSetOperators(ksp,a,a2,SAME_PRECONDITIONER,ierr) > Just use a, a for the matrices > call KSPSetOperators(ksp,a,a2,SAME_NONZERO_PATTERN,ierr) > !These three patterns make no difference in current codes > This DOES matter here if you are using the default PC which is ILU. Matt > !call KSPSetOperators(ksp,a,a2,DIFFERENT_NONZERO_PATTERN,ierr) > > call KSPSolve(ksp,b,x,ierr) > > call KSPGetResidualNorm(ksp,norm,ierr) > call KSPGetIterationNumber(ksp,its,ierr) > end do > > !Destroy objects > !... > > !***************************************************************************! > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From hsahasra at purdue.edu Wed Aug 14 13:26:37 2013 From: hsahasra at purdue.edu (Harshad Sahasrabudhe) Date: Wed, 14 Aug 2013 14:26:37 -0400 (EDT) Subject: [petsc-users] Extracting data from a Petsc matrix In-Reply-To: Message-ID: <298189298.12495.1376504797449.JavaMail.root@mailhub027.itcs.purdue.edu> Sorry, I was talking about the Python build system in PETSc. I didn't mean the build system used by Python. And thanks, Chaco.py does clear my doubts. Harshad ----- Original Message ----- From: "Matthew Knepley" To: "Harshad Sahasrabudhe" Cc: "Barry Smith" , petsc-users at mcs.anl.gov Sent: Wednesday, August 14, 2013 2:11:07 PM Subject: Re: [petsc-users] Extracting data from a Petsc matrix On Wed, Aug 14, 2013 at 12:17 PM, Harshad Sahasrabudhe < hsahasra at purdue.edu > wrote: Thanks Barry. How do we get the compiler names from the Python build system? For example, MAGMA make.inc varies depending on whether the compiler is gcc or icc. So is there any easy way to get this information? That is a horrible way to organize a build. Just ignore what they are doing with flags and use the PETSc flags. Take a look at Chaco.py for a simple install. Matt Harshad ----- Original Message ----- From: "Barry Smith" < bsmith at mcs.anl.gov > To: "Harshad Sahasrabudhe" < hsahasra at purdue.edu > Cc: "Jed Brown" < jedbrown at mcs.anl.gov >, petsc-users at mcs.anl.gov Sent: Monday, August 12, 2013 5:25:53 PM Subject: Re: [petsc-users] Extracting data from a Petsc matrix On Aug 12, 2013, at 4:05 PM, Harshad Sahasrabudhe < hsahasra at purdue.edu > wrote: > Hi Jed, > > I am now working to add library support for LU decomposition using MAGMA. I need your help with the following: > > 1) How do I add the options --download-magma, --with-magma, etc. to the configure script for building with MAGMA? Add a new file in config/PETSc/packages (copy one that is already there and modify for magma). > > 2) I have a fair idea how the PETSc code is structured and how to add source code to the impls/ directory. How does PETSc get to know that there is an additional implementation (in this case MAGMA) in this directory? Is there a config file of some sort? Add the new directory name to list of directories in the makefile in that directory and add in MatRegisterAll(). 
Barry > > Thanks, > Harshad > > ----- Original Message ----- > From: "Jed Brown" < jedbrown at mcs.anl.gov > > To: hsahasra at purdue.edu , petsc-users at mcs.anl.gov > Sent: Saturday, July 13, 2013 12:43:08 PM > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > " hsahasra at purdue.edu " < hsahasra at purdue.edu > writes: > >> Hi, >> >> I am working on solving a system of linear equations with square >> matrix. I'm first factoring the matrix using LU decomposition. > > I assume you're solving a dense problem because that is all MAGMA does. > >> I want to do the LU decomposition step using MAGMA on GPUs. MAGMA >> library implements LAPACK functions on a CPU+GPU based system. >> >> So my question is, how do I extract the data from a Petsc Mat so that >> it can be sent to the dgetrf routine in MAGMA. > > MatDenseGetArray > >> Is there any need for duplicating the data for this step? > > You're on your own for storage of factors. Alternatively, you could add > library support so that you could use PCLU and > '-pc_factor_mat_solver_package magma' (or PCFactorSetMatSolverPackage). > Doing this is not a priority for us, but we can provide guidance if you > want to tackle it. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener From hsahasra at purdue.edu Wed Aug 14 14:14:28 2013 From: hsahasra at purdue.edu (Harshad Sahasrabudhe) Date: Wed, 14 Aug 2013 15:14:28 -0400 (EDT) Subject: [petsc-users] Extracting data from a Petsc matrix In-Reply-To: <298189298.12495.1376504797449.JavaMail.root@mailhub027.itcs.purdue.edu> Message-ID: <862260421.12613.1376507668078.JavaMail.root@mailhub027.itcs.purdue.edu> Oh I get what you are trying to say. Please correct me if I'm wrong. I don't need to use the make.inc prototypes defined in MAGMA, PETSc build system gets all the required options. ----- Original Message ----- From: "Harshad Sahasrabudhe" To: "Matthew Knepley" Cc: "Barry Smith" , petsc-users at mcs.anl.gov Sent: Wednesday, August 14, 2013 2:26:37 PM Subject: Re: [petsc-users] Extracting data from a Petsc matrix Sorry, I was talking about the Python build system in PETSc. I didn't mean the build system used by Python. And thanks, Chaco.py does clear my doubts. Harshad ----- Original Message ----- From: "Matthew Knepley" To: "Harshad Sahasrabudhe" Cc: "Barry Smith" , petsc-users at mcs.anl.gov Sent: Wednesday, August 14, 2013 2:11:07 PM Subject: Re: [petsc-users] Extracting data from a Petsc matrix On Wed, Aug 14, 2013 at 12:17 PM, Harshad Sahasrabudhe < hsahasra at purdue.edu > wrote: Thanks Barry. How do we get the compiler names from the Python build system? For example, MAGMA make.inc varies depending on whether the compiler is gcc or icc. So is there any easy way to get this information? That is a horrible way to organize a build. Just ignore what they are doing with flags and use the PETSc flags. Take a look at Chaco.py for a simple install. Matt Harshad ----- Original Message ----- From: "Barry Smith" < bsmith at mcs.anl.gov > To: "Harshad Sahasrabudhe" < hsahasra at purdue.edu > Cc: "Jed Brown" < jedbrown at mcs.anl.gov >, petsc-users at mcs.anl.gov Sent: Monday, August 12, 2013 5:25:53 PM Subject: Re: [petsc-users] Extracting data from a Petsc matrix On Aug 12, 2013, at 4:05 PM, Harshad Sahasrabudhe < hsahasra at purdue.edu > wrote: > Hi Jed, > > I am now working to add library support for LU decomposition using MAGMA. 
I need your help with the following: > > 1) How do I add the options --download-magma, --with-magma, etc. to the configure script for building with MAGMA? Add a new file in config/PETSc/packages (copy one that is already there and modify for magma). > > 2) I have a fair idea how the PETSc code is structured and how to add source code to the impls/ directory. How does PETSc get to know that there is an additional implementation (in this case MAGMA) in this directory? Is there a config file of some sort? Add the new directory name to list of directories in the makefile in that directory and add in MatRegisterAll(). Barry > > Thanks, > Harshad > > ----- Original Message ----- > From: "Jed Brown" < jedbrown at mcs.anl.gov > > To: hsahasra at purdue.edu , petsc-users at mcs.anl.gov > Sent: Saturday, July 13, 2013 12:43:08 PM > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > " hsahasra at purdue.edu " < hsahasra at purdue.edu > writes: > >> Hi, >> >> I am working on solving a system of linear equations with square >> matrix. I'm first factoring the matrix using LU decomposition. > > I assume you're solving a dense problem because that is all MAGMA does. > >> I want to do the LU decomposition step using MAGMA on GPUs. MAGMA >> library implements LAPACK functions on a CPU+GPU based system. >> >> So my question is, how do I extract the data from a Petsc Mat so that >> it can be sent to the dgetrf routine in MAGMA. > > MatDenseGetArray > >> Is there any need for duplicating the data for this step? > > You're on your own for storage of factors. Alternatively, you could add > library support so that you could use PCLU and > '-pc_factor_mat_solver_package magma' (or PCFactorSetMatSolverPackage). > Doing this is not a priority for us, but we can provide guidance if you > want to tackle it. -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener From knepley at gmail.com Wed Aug 14 14:18:30 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 14 Aug 2013 14:18:30 -0500 Subject: [petsc-users] Extracting data from a Petsc matrix In-Reply-To: <862260421.12613.1376507668078.JavaMail.root@mailhub027.itcs.purdue.edu> References: <298189298.12495.1376504797449.JavaMail.root@mailhub027.itcs.purdue.edu> <862260421.12613.1376507668078.JavaMail.root@mailhub027.itcs.purdue.edu> Message-ID: On Wed, Aug 14, 2013 at 2:14 PM, Harshad Sahasrabudhe wrote: > Oh I get what you are trying to say. Please correct me if I'm wrong. I > don't need to use the make.inc prototypes defined in MAGMA, PETSc build > system gets all the required options. > Yes. If it misses any, they can be added in the CFLAGS line in the Install() function. Matt > ----- Original Message ----- > From: "Harshad Sahasrabudhe" > To: "Matthew Knepley" > Cc: "Barry Smith" , petsc-users at mcs.anl.gov > Sent: Wednesday, August 14, 2013 2:26:37 PM > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > Sorry, I was talking about the Python build system in PETSc. I didn't mean > the build system used by Python. > > And thanks, Chaco.py does clear my doubts. 
> > Harshad > > ----- Original Message ----- > From: "Matthew Knepley" > To: "Harshad Sahasrabudhe" > Cc: "Barry Smith" , petsc-users at mcs.anl.gov > Sent: Wednesday, August 14, 2013 2:11:07 PM > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > > On Wed, Aug 14, 2013 at 12:17 PM, Harshad Sahasrabudhe < > hsahasra at purdue.edu > wrote: > > > > > Thanks Barry. > > How do we get the compiler names from the Python build system? For > example, MAGMA make.inc varies depending on whether the compiler is gcc or > icc. So is there any easy way to get this information? > > > > That is a horrible way to organize a build. Just ignore what they are > doing with flags and use the PETSc flags. Take > a look at Chaco.py for a simple install. > > > Matt > > > Harshad > > ----- Original Message ----- > From: "Barry Smith" < bsmith at mcs.anl.gov > > To: "Harshad Sahasrabudhe" < hsahasra at purdue.edu > > Cc: "Jed Brown" < jedbrown at mcs.anl.gov >, petsc-users at mcs.anl.gov > Sent: Monday, August 12, 2013 5:25:53 PM > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > > On Aug 12, 2013, at 4:05 PM, Harshad Sahasrabudhe < hsahasra at purdue.edu > > wrote: > > > Hi Jed, > > > > I am now working to add library support for LU decomposition using > MAGMA. I need your help with the following: > > > > 1) How do I add the options --download-magma, --with-magma, etc. to the > configure script for building with MAGMA? > > Add a new file in config/PETSc/packages (copy one that is already there > and modify for magma). > > > > 2) I have a fair idea how the PETSc code is structured and how to add > source code to the impls/ directory. How does PETSc get to know that there > is an additional implementation (in this case MAGMA) in this directory? Is > there a config file of some sort? > > Add the new directory name to list of directories in the makefile in that > directory and add in MatRegisterAll(). > > Barry > > > > > Thanks, > > Harshad > > > > ----- Original Message ----- > > From: "Jed Brown" < jedbrown at mcs.anl.gov > > > To: hsahasra at purdue.edu , petsc-users at mcs.anl.gov > > Sent: Saturday, July 13, 2013 12:43:08 PM > > Subject: Re: [petsc-users] Extracting data from a Petsc matrix > > > > " hsahasra at purdue.edu " < hsahasra at purdue.edu > writes: > > > >> Hi, > >> > >> I am working on solving a system of linear equations with square > >> matrix. I'm first factoring the matrix using LU decomposition. > > > > I assume you're solving a dense problem because that is all MAGMA does. > > > >> I want to do the LU decomposition step using MAGMA on GPUs. MAGMA > >> library implements LAPACK functions on a CPU+GPU based system. > >> > >> So my question is, how do I extract the data from a Petsc Mat so that > >> it can be sent to the dgetrf routine in MAGMA. > > > > MatDenseGetArray > > > >> Is there any need for duplicating the data for this step? > > > > You're on your own for storage of factors. Alternatively, you could add > > library support so that you could use PCLU and > > '-pc_factor_mat_solver_package magma' (or PCFactorSetMatSolverPackage). > > Doing this is not a priority for us, but we can provide guidance if you > > want to tackle it. > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mrosso at uci.edu Wed Aug 14 14:34:27 2013 From: mrosso at uci.edu (Michele Rosso) Date: Wed, 14 Aug 2013 12:34:27 -0700 Subject: [petsc-users] GAMG speed In-Reply-To: <874nasld2x.fsf@mcs.anl.gov> References: <51FAA56D.60106@uci.edu> <51FAE69E.4020108@uci.edu> <51FAEF43.2050207@uci.edu> <4B76E0BB-D8A1-4F38-A8D1-0174801E0AD9@mcs.anl.gov> <51FC2A38.5000000@uci.edu> <406E1FE4-5CE5-4840-BD77-DDB5830AD520@mcs.anl.gov> <520A88F6.9070603@uci.edu> <520A9815.7030400@mcs.anl.gov> <520ABC8E.3040204@uci.edu> <520AC9DE.1050508@uci.edu> <520ADCA4.3030902@uci.edu> <520AE004.1010803@uci.edu> <87y585ko2v.fsf@mcs.anl.gov> <520AF220.3070900@uci.edu> <87k3jpkko7.fsf@mcs.anl.gov> <520B2743.7020405@uci.edu> <874nasld2x.fsf@mcs.anl.gov> Message-ID: <520BDBC3.1010805@uci.edu> Jed, thank you. I will proceed with the multiphase test case and I will let you know how it goes and eventually, if possible, try the matrix-free approach. Nevertheless I would like to give a try FMG with high-order prolongation: could you explain me how I can do that? Thank you. Michele On 08/14/2013 04:35 AM, Jed Brown wrote: > Please always use "reply-all" so that your messages go to the list. > This is standard mailing list etiquette. It is important to preserve > threading for people who find this discussion later and so that we do > not waste our time re-answering the same questions that have already > been answered in private side-conversations. You'll likely get an > answer faster that way too. > > Michele Rosso writes: > >> Jed, >> >> thank you very much for the detailed analysis. >> I confirm that >> >> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg >> -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson >> -mg_levels_ksp_max_it 1 >> >> results in a faster solve with the 256^3 grid (it finally beats CG + ICC). >> >> Yes, I perform different solves: I setup matrix and KSP only once at the beginning of the run and then I re-use them >> at each time step. The rhs term changes during the simulation though. For now (single phase flow) the matrix does not change, >> but I will be dealing soon with a multiphase flow and thus not even the matrix values will be constant in time (it will be a variable coefficients Poisson Equation). > Okay, the coefficient variation from the multiphase flow can drastically > change the characteristics of the solve. > >> I need to solve up to the discretization error, so maybe FMG is worth a try. >> The matrix-free approach is appealing given that the Poisson solver is really mission critical (basically it accounts for most of the simulation time). >> I will use the level set method and ghost fluid method in order to account for the discontinuities at the interface between phases: the computation of the matrix and rhs >> values will be influenced by such methods so my only concern is to be sure matrix-free can be used in these circumstances. > Matrix-free can be used in principle, but those problems can be several > orders of magnitude more ill-conditioned, so don't invest any more time > on it right now. Get the discretization set up using assembled > matrices, then go through the options we've tried to find an efficient > solver. 
The best choice will likely depend on the details of the > formulation, the types of fluids involved, and the geometric > configuration of the fluids. > >> I do not have any prior experience with matrix-free methods so I will have to rely on your assistance for this. >> Thank you very much. >> >> Michele >> >> >> >> >> On 08/13/2013 08:36 PM, Jed Brown wrote: >>> Michele Rosso writes: >>> >>>> Hi Jed, >>>> >>>> I attached the output for both the runs you suggested. At the beginning >>>> of each file I included the options I used. >>>> >>>> On a side note, I tried to run with a grid of 256^3 (exactly as before) >>>> but with less levels, i.e. 3 instead of 4 or 5. >>>> My system stops the run because of an Out Of Memory condition. It is >>>> really odd since I have not changed anything except >>>> - pc_mg_levels. I cannot send you any output since there is none. Do >>>> you have any guess where the problem comes from? >>> The selected algorithm does a direct solve on the coarse grid. Each >>> time you reduce the number of levels, the coarse grid size grows by a >>> factor of 8. Going from 5 to 3 levels is going from a 16^3 coarse grid >>> to a 64^3 coarse grid. Applying a direct solver to the latter ends up >>> using a lot of memory. I think this is not worth bothering with and it >>> might even be (slightly) faster to use 6 levels. That is not where the >>> time is being spent. >>> >>>> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg >>>> -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson >>>> -mg_levels_ksp_max_it 1 >>>> >>>> >>>> >>>> 0 KSP Residual norm 3.653965664551e-05 >>>> 1 KSP Residual norm 1.910638846094e-06 >>>> 2 KSP Residual norm 8.690440116045e-08 >>>> 3 KSP Residual norm 3.732213639394e-09 >>>> 4 KSP Residual norm 1.964855338020e-10 >>> This converges well. 
>>> >>>> Max Max/Min Avg Total >>>> Time (sec): 4.048e+00 1.00012 4.048e+00 >>>> Objects: 2.490e+02 1.00000 2.490e+02 >>>> Flops: 2.663e+08 1.00000 2.663e+08 2.130e+09 >>>> Flops/sec: 6.579e+07 1.00012 6.579e+07 5.263e+08 >>>> MPI Messages: 6.820e+02 1.00000 6.820e+02 5.456e+03 >>>> MPI Message Lengths: 8.245e+06 1.00000 1.209e+04 6.596e+07 >>>> MPI Reductions: 4.580e+02 1.00000 >>>> VecTDot 12 1.0 2.9428e-02 1.2 6.29e+06 1.0 0.0e+00 0.0e+00 1.2e+01 1 2 0 0 3 1 2 0 0 3 1710 >>>> VecNorm 9 1.0 1.0796e-02 1.2 4.72e+06 1.0 0.0e+00 0.0e+00 9.0e+00 0 2 0 0 2 0 2 0 0 2 3497 >>>> VecScale 24 1.0 2.4652e-04 1.1 1.99e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 6442 >>>> VecCopy 3 1.0 5.0740e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>>> VecSet 116 1.0 1.4349e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>>> VecAXPY 12 1.0 2.8027e-02 1.0 6.29e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1796 >>>> VecAYPX 29 1.0 3.0655e-02 1.4 4.16e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1085 >>>> VecScatterBegin 123 1.0 3.5391e-02 1.1 0.00e+00 0.0 3.5e+03 1.2e+04 0.0e+00 1 0 65 66 0 1 0 65 66 0 0 >>>> VecScatterEnd 123 1.0 2.5395e-02 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>>> MatMult 31 1.0 2.3556e-01 1.0 5.62e+07 1.0 1.0e+03 2.3e+04 0.0e+00 6 21 19 36 0 6 21 19 36 0 1908 >>>> MatMultAdd 24 1.0 5.9044e-02 1.0 1.21e+07 1.0 5.8e+02 2.8e+03 0.0e+00 1 5 11 2 0 1 5 11 2 0 1644 >>>> MatMultTranspose 28 1.0 7.4601e-02 1.1 1.42e+07 1.0 6.7e+02 2.8e+03 0.0e+00 2 5 12 3 0 2 5 12 3 0 1518 >>>> MatSolve 6 1.0 3.8311e-03 1.0 1.44e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 3006 >>>> MatSOR 48 1.0 5.8050e-01 1.0 1.01e+08 1.0 8.6e+02 1.5e+04 4.8e+01 14 38 16 19 10 14 38 16 19 11 1390 >>> Most of the solve time is in MatSOR and MatMult. That's expected since >>> the subdomains are pretty big. >>> >>>> MatLUFactorSym 1 1.0 3.0620e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 1 0 0 0 0 1 0 >>>> MatLUFactorNum 1 1.0 2.4665e-02 1.0 1.95e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 7 0 0 0 1 7 0 0 0 6329 >>>> MatAssemblyBegin 20 1.0 2.4351e-02 6.7 0.00e+00 0.0 0.0e+00 0.0e+00 2.2e+01 0 0 0 0 5 0 0 0 0 5 0 >>>> MatAssemblyEnd 20 1.0 1.3176e-01 1.0 0.00e+00 0.0 5.6e+02 2.1e+03 7.2e+01 3 0 10 2 16 3 0 10 2 16 0 >>>> MatGetRowIJ 1 1.0 1.1516e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>>> MatGetOrdering 1 1.0 4.1008e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>>> MatView 16 1.3 1.0209e-03 2.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 3 0 0 0 0 3 0 >>>> MatPtAP 4 1.0 6.4001e-01 1.0 4.06e+07 1.0 1.1e+03 1.7e+04 1.0e+02 16 15 21 30 22 16 15 21 30 22 507 >>> MatPtAP dominates the setup time. For profiling, you could register a >>> stage (PetscLogStageRegister) and time the setup separately from the >>> solve. 
>>> >>>> MatPtAPSymbolic 4 1.0 3.7003e-01 1.0 0.00e+00 0.0 7.2e+02 2.0e+04 6.0e+01 9 0 13 22 13 9 0 13 22 13 0 >>>> MatPtAPNumeric 4 1.0 2.7004e-01 1.0 4.06e+07 1.0 4.2e+02 1.2e+04 4.0e+01 7 15 8 8 9 7 15 8 8 9 1202 >>>> MatGetRedundant 1 1.0 7.9393e-04 1.0 0.00e+00 0.0 1.7e+02 7.1e+03 4.0e+00 0 0 3 2 1 0 0 3 2 1 0 >>>> MatGetLocalMat 4 1.0 3.9521e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 1 0 0 0 2 1 0 0 0 2 0 >>>> MatGetBrAoCol 4 1.0 1.7719e-02 1.0 0.00e+00 0.0 4.3e+02 2.7e+04 8.0e+00 0 0 8 18 2 0 0 8 18 2 0 >>>> MatGetSymTrans 8 1.0 1.3007e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>>> KSPSetUp 7 1.0 1.3097e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01 0 0 0 0 5 0 0 0 0 5 0 >>>> KSPSolve 2 1.0 1.0450e+00 1.0 2.04e+08 1.0 3.4e+03 1.2e+04 7.5e+01 26 77 62 60 16 26 77 62 60 16 1563 >>>> PCSetUp 1 1.0 8.6248e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 21 23 35 32 69 21 23 35 32 69 576 >>>> PCApply 6 1.0 8.4384e-01 1.0 1.61e+08 1.0 3.2e+03 9.0e+03 4.8e+01 21 60 59 44 10 21 60 59 44 11 1523 >>> Do you know why there are 6 PCApply events? With four iterations of the >>> Krylov method, there should be only 5 events. Oh, it looks like you do >>> two solves. Is one of those with a different system? >>> >>> Recall that the old KSPSolve time was over 3.35 seconds. >>> >>>> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg >>>> -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson >>>> -mg_levels_ksp_max_it 1 -pc_mg_type full >>>> >>>> 0 KSP Residual norm 3.654533581988e-05 >>>> 1 KSP Residual norm 8.730776244351e-07 >>>> 2 KSP Residual norm 3.474626061661e-08 >>>> 3 KSP Residual norm 1.813665557493e-09 >>> This converges slightly faster, but ends up not paying off. >>> >>>> Time (sec): 4.261e+00 1.00012 4.261e+00 >>>> Objects: 2.950e+02 1.00000 2.950e+02 >>>> Flops: 3.322e+08 1.00000 3.322e+08 2.658e+09 >>>> Flops/sec: 7.797e+07 1.00012 7.796e+07 6.237e+08 >>>> MPI Messages: 1.442e+03 1.00000 1.442e+03 1.154e+04 >>>> MPI Message Lengths: 1.018e+07 1.00000 7.057e+03 8.141e+07 >>>> MPI Reductions: 5.460e+02 1.00000 >>> More messages, more work, etc., so not better. >>> >>>> KSPSolve 2 1.0 1.2287e+00 1.0 2.70e+08 1.0 9.5e+03 5.8e+03 1.6e+02 29 81 82 68 30 29 81 82 68 30 1758 >>>> PCSetUp 1 1.0 8.6414e-01 1.0 6.21e+07 1.0 1.9e+03 1.1e+04 3.2e+02 20 19 17 26 58 20 19 17 26 58 575 >>>> PCApply 5 1.0 1.0571e+00 1.0 2.33e+08 1.0 9.3e+03 4.9e+03 1.4e+02 24 70 81 56 26 24 70 81 56 26 1764 >>> It's still entirely possible that you can make Full MG beat V-cycles, >>> especially if you only need to converge up to discretization error. By >>> my figures, your good solver takes 12 work units to converge well below >>> discretization error (after Galerkin setup, but maybe you only need to >>> do that once?). If you only need to equal truncation error, this can be >>> brought down to about 5 (probably at best a 2x speedup in parallel). >>> This would involve a high-order (cubic) FMG prolongation. >>> >>> Alternatively, you can speed up the implementation (and significantly >>> reduce memory usage) by creating geometric coarse levels and a >>> matrix-free implementation of MatSOR and MatMult. (The matrices are >>> great for experimenting, but if this solver is mission critical and >>> still a bottleneck, the matrix is an inefficient way to represent the >>> operator since it has very low arithmetic intensity/requires a lot of >>> memory bandwidth.) 
>>> I predict you can probably speed up the solve by
>>> perhaps another factor of 2 with a good matrix-free FMG implementation.
>>> Do you want to go down this path?
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From u.tabak at tudelft.nl Wed Aug 14 14:39:50 2013
From: u.tabak at tudelft.nl (Umut Tabak)
Date: Wed, 14 Aug 2013 21:39:50 +0200
Subject: [petsc-users] Iterative solution for schur complement
Message-ID: <520BDD06.1030805@tudelft.nl>

Dear all,

I am looking at an ill-conditioned problem and trying to find out whether some iterative tricks can help. Namely, the system that I try to solve is

    (B - C^T A^{-1} C) x2 = b2

which is the Schur complement of the symmetric block system

    [ A    C ]
    [ C^T  B ]

Unfortunately, B is indefinite. I made several attempts in MATLAB, but none of them converged. What I tried is listed below:

+ Even though B is indefinite, it is symmetric, so I am trying to use its LDLT decomposition as a preconditioner for the above system. Besides, I am also using an LDLT factorization for the matrix-vector multiplications that come from the A^{-1} related terms. I am not forming the operator matrix here; it is a function handle that represents the matrix-vector multiplication, and the preconditioner solve related to B2 is also a function handle. Of course, in this case my preconditioner B is not SPD; however, it is the direct factor of B. For this reason, I was expecting to get better results when also using the direct factorization of A2.

+ The strange thing in MATLAB is that CG fails on the very first iteration because some parameters are too small to continue. I can understand this for CG, since it implicitly boils down to the Cholesky decomposition of the projected system, but minres also fails with the same error. I am hesitating over whether I should program minres or gmres myself to track down the source of the problem. Any ideas on where the problem might come from, especially with minres?

+ The C matrix, which is a coupling matrix, can be relatively sparse in comparison to A and B; I am not sure whether I can make use of this information.

+ Are there any SPD preconditioners that I can try with this system? I guess the breakdown mentioned above is also related to the selection of the preconditioner, although I am not sure about this.

Any other pointers and ideas on this problem are appreciated.

Best regards,
Umut

From bsmith at mcs.anl.gov Wed Aug 14 14:50:42 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Wed, 14 Aug 2013 14:50:42 -0500
Subject: [petsc-users] Result error in repeatedly solving linear equations
In-Reply-To: <520BBE3B.3040401@gmail.com>
References: <520BBE3B.3040401@gmail.com>
Message-ID: <6FC0D93D-59F5-4220-80C3-A698EF1A7A45@mcs.anl.gov>

On Aug 14, 2013, at 12:28 PM, Danyang Su wrote:

> Hi All,
>
> I have many linear equations with the same matrix structure (same non-zero entries) that are derived from a flow problem at different time steps. I feel puzzled that the results are a little different when the solver run repeatedly and one by one.
Say, I have three equations, I can get the following results if running three equations together > > Equation 1: Iterations 1 norm 0.9457E-02 Result error PETSc vs Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02 > Equation 2: Iterations 2 norm 0.2994E-05 Result error PETSc vs Solver2, max 0.1381E-05 min -0.7209E-08 norm 0.2994E-05 > Equation 3: Iterations 2 norm 0.3919E-04 Result error PETSc vs Solver2, max 0.9435E-07 min -0.1808E-04 norm 0.3919E-04 > > But if I solve only one equation every time, then restart the program to run another one, the results are like this: How are you saving the values and reloading them when you "restart the program"? Are you sure you are saving the exact values of everything in binary? Saving in ASCII and reading it in will introduce slight differences in the numerical values. Barry > > Equation 1: Iterations 1 norm 0.9457E-02 Result error PETSc vs Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02 > Equation 2: Iterations 1 norm 0.7949E-05 Result error PETSc vs Solver2, max 0.3501E-05 min -0.8377E-06 norm 0.7949E-05 > Equation 3: Iterations 1 norm 0.1980E-04 Result error PETSc vs Solver2, max 0.4168E-08 min -0.9085E-05 norm 0.1980E-04 > > Note: Solver2 is the original sequential solver used in this flow model. > > Though there are no big difference in the solution for the above equations, I want to know why? > > For another large linear equations with more than 400,000 unknowns and 10,000,000 non-zero entries, if the equations are solved repeatedly, they need a lot of iterations or fail, but if the equations are solved one by one, it only needs 1 to 2 iterations. > > How does this difference come from? > > The sample codes are attached bellow. > > Thanks and regards, > > Danyang > > !***************************************************************************! > !Create matrix, rhs and solver > call MatCreateAIJ(Petsc_Comm_World, Petsc_Decide, Petsc_Decide, nb, nb, nd_nzrow, & > Petsc_Null_Integer, nd_nzrow, Petsc_Null_Integer, a, ierr) > call MatSetOption(a,Mat_New_Nonzero_Allocation_Err,Petsc_False,ierr) > call VecCreateMPI(Petsc_Comm_World, Petsc_Decide, nb, b, ierr) > call VecDuplicate(b, x, ierr) > call VecDuplicate(x, u, ierr) > call KSPCreate(Petsc_Comm_World,ksp,ierr) > call KSPSetTolerances(ksp,tol, & > PETSC_DEFAULT_DOUBLE_PRECISION, & > PETSC_DEFAULT_DOUBLE_PRECISION, & > 100,ierr) > call KSPSetFromOptions(ksp,ierr) > > !Do time loop > do i = 1, nTimeStep > call MatGetOwnershipRange(a,istart,iend,ierr) > do i = istart, iend - 1 > ii = ia_in(i+1) > jj = ia_in(i+2) > call MatSetValues(a, ione, i, jj-ii, ja_in(ii:jj-1)-1, a_in(ii:jj-1), Insert_Values, ierr) > end do > call MatAssemblyBegin(a, Mat_Final_Assembly, ierr) > call MatAssemblyEnd(a, Mat_Final_Assembly, ierr) > > call VecGetOwnershipRange(b,istart,iend,ierr) > call VecSetValues(b, iend-istart, ix(istart+1:iend), b_in(istart+1:iend), Insert_Values, ierr) > call VecAssemblyBegin(b,ierr) > call VecAssemblyEnd(b,ierr) > > if(i == 1) then > call MatConvert(a,MATSAME,MAT_INITIAL_MATRIX,a2,ierr) > end if > !call KSPSetOperators(ksp,a,a2,SAME_PRECONDITIONER,ierr) > call KSPSetOperators(ksp,a,a2,SAME_NONZERO_PATTERN,ierr) !These three patterns make no difference in current codes > !call KSPSetOperators(ksp,a,a2,DIFFERENT_NONZERO_PATTERN,ierr) > > call KSPSolve(ksp,b,x,ierr) > > call KSPGetResidualNorm(ksp,norm,ierr) > call KSPGetIterationNumber(ksp,its,ierr) > end do > > !Destroy objects > !... > !***************************************************************************! 
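To settle the kind of question Barry raises here (whether a restart feeds back bit-identical values), the vector can be written and re-read with PETSc's binary viewer, which round-trips values exactly, unlike an ASCII dump. A minimal C sketch; the helper names and file name are made up:

#include <petscksp.h>

/* write the right-hand side exactly as stored in memory */
static PetscErrorCode SaveVecBinary(Vec b)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"rhs.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = VecView(b,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* read it back after the restart; b must be created with a matching layout */
static PetscErrorCode LoadVecBinary(Vec b)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"rhs.bin",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = VecLoad(b,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The same pattern works for a Mat via MatView/MatLoad, so a restarted run can consume exactly the operator the first run assembled.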
From danyang.su at gmail.com Wed Aug 14 15:10:04 2013 From: danyang.su at gmail.com (Danyang Su) Date: Wed, 14 Aug 2013 13:10:04 -0700 Subject: [petsc-users] Result error in repeatedly solving linear equations In-Reply-To: References: <520BBE3B.3040401@gmail.com> Message-ID: <520BE41C.6080801@gmail.com> Hi Matthew, Thanks so much. It works out now. Danyang On 14/08/2013 11:14 AM, Matthew Knepley wrote: > On Wed, Aug 14, 2013 at 12:28 PM, Danyang Su > wrote: > > Hi All, > > I have many linear equations with the same matrix structure (same > non-zero entries) that are derived from a flow problem at > different time steps. I feel puzzled that the results are a little > different when the solver run repeatedly and one by one. Say, I > have three equations, I can get the following results if running > three equations together > > Equation 1: Iterations 1 norm 0.9457E-02 Result error > PETSc vs Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02 > Equation 2: Iterations 2 norm 0.2994E-05 Result error PETSc > vs Solver2, max 0.1381E-05 min -0.7209E-08 norm 0.2994E-05 > Equation 3: Iterations 2 norm 0.3919E-04 Result error PETSc > vs Solver2, max 0.9435E-07 min -0.1808E-04 norm 0.3919E-04 > > But if I solve only one equation every time, then restart the > program to run another one, the results are like this: > > Equation 1: Iterations 1 norm 0.9457E-02 Result error > PETSc vs Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02 > Equation 2: Iterations 1 norm 0.7949E-05 Result error PETSc > vs Solver2, max 0.3501E-05 min -0.8377E-06 norm 0.7949E-05 > Equation 3: Iterations 1 norm 0.1980E-04 Result error PETSc > vs Solver2, max 0.4168E-08 min -0.9085E-05 norm 0.1980E-04 > > Note: Solver2 is the original sequential solver used in this flow > model. > > Though there are no big difference in the solution for the above > equations, I want to know why? > > For another large linear equations with more than 400,000 unknowns > and 10,000,000 non-zero entries, if the equations are solved > repeatedly, they need a lot of iterations or fail, but if the > equations are solved one by one, it only needs 1 to 2 iterations. > > How does this difference come from? > > The sample codes are attached bellow. > > Thanks and regards, > > Danyang > > !***************************************************************************! 
> !Create matrix, rhs and solver > call MatCreateAIJ(Petsc_Comm_World, Petsc_Decide, Petsc_Decide, > nb, nb, nd_nzrow, & > Petsc_Null_Integer, nd_nzrow, > Petsc_Null_Integer, a, ierr) > call MatSetOption(a,Mat_New_Nonzero_Allocation_Err,Petsc_False,ierr) > call VecCreateMPI(Petsc_Comm_World, Petsc_Decide, nb, b, ierr) > call VecDuplicate(b, x, ierr) > call VecDuplicate(x, u, ierr) > call KSPCreate(Petsc_Comm_World,ksp,ierr) > call KSPSetTolerances(ksp,tol, & > PETSC_DEFAULT_DOUBLE_PRECISION, & > PETSC_DEFAULT_DOUBLE_PRECISION, & > 100,ierr) > call KSPSetFromOptions(ksp,ierr) > > !Do time loop > do i = 1, nTimeStep > call MatGetOwnershipRange(a,istart,iend,ierr) > do i = istart, iend - 1 > ii = ia_in(i+1) > jj = ia_in(i+2) > call MatSetValues(a, ione, i, jj-ii, ja_in(ii:jj-1)-1, > a_in(ii:jj-1), Insert_Values, ierr) > end do > call MatAssemblyBegin(a, Mat_Final_Assembly, ierr) > call MatAssemblyEnd(a, Mat_Final_Assembly, ierr) > > call VecGetOwnershipRange(b,istart,iend,ierr) > call VecSetValues(b, iend-istart, ix(istart+1:iend), > b_in(istart+1:iend), Insert_Values, ierr) > call VecAssemblyBegin(b,ierr) > call VecAssemblyEnd(b,ierr) > > if(i == 1) then > call MatConvert(a,MATSAME,MAT_INITIAL_MATRIX,a2,ierr) > > > Why are you doing this? > > end if > !call KSPSetOperators(ksp,a,a2,SAME_PRECONDITIONER,ierr) > > > Just use a, a for the matrices > > call KSPSetOperators(ksp,a,a2,SAME_NONZERO_PATTERN,ierr) > !These three patterns make no difference in current codes > > > This DOES matter here if you are using the default PC which is ILU. > > Matt > > !call KSPSetOperators(ksp,a,a2,DIFFERENT_NONZERO_PATTERN,ierr) > > call KSPSolve(ksp,b,x,ierr) > > call KSPGetResidualNorm(ksp,norm,ierr) > call KSPGetIterationNumber(ksp,its,ierr) > end do > > !Destroy objects > !... > !***************************************************************************! > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From danyang.su at gmail.com Wed Aug 14 15:14:27 2013 From: danyang.su at gmail.com (Danyang Su) Date: Wed, 14 Aug 2013 13:14:27 -0700 Subject: [petsc-users] Result error in repeatedly solving linear equations In-Reply-To: <6FC0D93D-59F5-4220-80C3-A698EF1A7A45@mcs.anl.gov> References: <520BBE3B.3040401@gmail.com> <6FC0D93D-59F5-4220-80C3-A698EF1A7A45@mcs.anl.gov> Message-ID: <520BE523.4080004@gmail.com> Hi Barry, The problem was caused by precondition. After updating the routine with "SAME_NONZERO_PATTERN ", I can get exactly the same results. call KSPSetOperators(ksp,a,a,SAME_NONZERO_PATTERN,ierr) Thanks, Danyang On 14/08/2013 12:50 PM, Barry Smith wrote: > On Aug 14, 2013, at 12:28 PM, Danyang Su wrote: > >> Hi All, >> >> I have many linear equations with the same matrix structure (same non-zero entries) that are derived from a flow problem at different time steps. I feel puzzled that the results are a little different when the solver run repeatedly and one by one. 
Say, I have three equations, I can get the following results if running three equations together >> >> Equation 1: Iterations 1 norm 0.9457E-02 Result error PETSc vs Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02 >> Equation 2: Iterations 2 norm 0.2994E-05 Result error PETSc vs Solver2, max 0.1381E-05 min -0.7209E-08 norm 0.2994E-05 >> Equation 3: Iterations 2 norm 0.3919E-04 Result error PETSc vs Solver2, max 0.9435E-07 min -0.1808E-04 norm 0.3919E-04 >> >> But if I solve only one equation every time, then restart the program to run another one, the results are like this: > How are you saving the values and reloading them when you "restart the program"? Are you sure you are saving the exact values of everything in binary? Saving in ASCII and reading it in will introduce slight differences in the numerical values. > > Barry > >> Equation 1: Iterations 1 norm 0.9457E-02 Result error PETSc vs Solver2, max 0.4362E-02 min -0.2277E-04 norm 0.9458E-02 >> Equation 2: Iterations 1 norm 0.7949E-05 Result error PETSc vs Solver2, max 0.3501E-05 min -0.8377E-06 norm 0.7949E-05 >> Equation 3: Iterations 1 norm 0.1980E-04 Result error PETSc vs Solver2, max 0.4168E-08 min -0.9085E-05 norm 0.1980E-04 >> >> Note: Solver2 is the original sequential solver used in this flow model. >> >> Though there are no big difference in the solution for the above equations, I want to know why? >> >> For another large linear equations with more than 400,000 unknowns and 10,000,000 non-zero entries, if the equations are solved repeatedly, they need a lot of iterations or fail, but if the equations are solved one by one, it only needs 1 to 2 iterations. >> >> How does this difference come from? >> >> The sample codes are attached bellow. >> >> Thanks and regards, >> >> Danyang >> >> !***************************************************************************! >> !Create matrix, rhs and solver >> call MatCreateAIJ(Petsc_Comm_World, Petsc_Decide, Petsc_Decide, nb, nb, nd_nzrow, & >> Petsc_Null_Integer, nd_nzrow, Petsc_Null_Integer, a, ierr) >> call MatSetOption(a,Mat_New_Nonzero_Allocation_Err,Petsc_False,ierr) >> call VecCreateMPI(Petsc_Comm_World, Petsc_Decide, nb, b, ierr) >> call VecDuplicate(b, x, ierr) >> call VecDuplicate(x, u, ierr) >> call KSPCreate(Petsc_Comm_World,ksp,ierr) >> call KSPSetTolerances(ksp,tol, & >> PETSC_DEFAULT_DOUBLE_PRECISION, & >> PETSC_DEFAULT_DOUBLE_PRECISION, & >> 100,ierr) >> call KSPSetFromOptions(ksp,ierr) >> >> !Do time loop >> do i = 1, nTimeStep >> call MatGetOwnershipRange(a,istart,iend,ierr) >> do i = istart, iend - 1 >> ii = ia_in(i+1) >> jj = ia_in(i+2) >> call MatSetValues(a, ione, i, jj-ii, ja_in(ii:jj-1)-1, a_in(ii:jj-1), Insert_Values, ierr) >> end do >> call MatAssemblyBegin(a, Mat_Final_Assembly, ierr) >> call MatAssemblyEnd(a, Mat_Final_Assembly, ierr) >> >> call VecGetOwnershipRange(b,istart,iend,ierr) >> call VecSetValues(b, iend-istart, ix(istart+1:iend), b_in(istart+1:iend), Insert_Values, ierr) >> call VecAssemblyBegin(b,ierr) >> call VecAssemblyEnd(b,ierr) >> >> if(i == 1) then >> call MatConvert(a,MATSAME,MAT_INITIAL_MATRIX,a2,ierr) >> end if >> !call KSPSetOperators(ksp,a,a2,SAME_PRECONDITIONER,ierr) >> call KSPSetOperators(ksp,a,a2,SAME_NONZERO_PATTERN,ierr) !These three patterns make no difference in current codes >> !call KSPSetOperators(ksp,a,a2,DIFFERENT_NONZERO_PATTERN,ierr) >> >> call KSPSolve(ksp,b,x,ierr) >> >> call KSPGetResidualNorm(ksp,norm,ierr) >> call KSPGetIterationNumber(ksp,its,ierr) >> end do >> >> !Destroy objects >> !... 
>> !***************************************************************************! From jedbrown at mcs.anl.gov Wed Aug 14 15:37:40 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 14 Aug 2013 15:37:40 -0500 Subject: [petsc-users] Iterative solution for schur complement In-Reply-To: <520BDD06.1030805@tudelft.nl> References: <520BDD06.1030805@tudelft.nl> Message-ID: <87y584gga3.fsf@mcs.anl.gov> Umut Tabak writes: > Dear all, > > I am looking at a system where I am trying to investigate this > ill-conditioned problem with some iterative tricks or not. Namely, the > system that I try to solve is > > (B - C^T A^{-1}C) x2 = b2 > > which results from block symmetric representation > > A C > C^T B What physics do you have here? > Unfortunately, B is indefinite, I tried some tries in MATLAB but none of > them gave convergence. What I tried is listed below: > > + Even if B is indefinite, it is symmetric, and I am trying to use its > LDLT decomposition as a preconditioner for the above system. Besides, I > am also using the LDLT for the matrix vector multiplications which comes > from A^{-1} related terms. I am not forming the operator matrix here but > it is a function handle that represents the matrix-vector > multiplication. Also the preconditioner solve related to B2 is also a > function handle. Of course in this case, my preconditioner B is not SPD > however it is the direct factor of B. Due to this reason, I was > expecting to get better results while using also the direct > factorization of A2. > > + The strange thing in MATLAB is that CG fails on the very first > iteration due to the reason that some parameters are too small to > continue, I can understand this since that implicitly boils down to the > Cholesky decomposition of the projected system. But minres also fails > with the same error. I am hesitating whether I shall program minres or > gmres myself to detect the source of the problem, any ideas on where the > problem might come from, especially while using minres? Both CG and MINRES require an SPD preconditioner. It sounds like B is a poor approximation to the Schur complement S = B - C^T A^{-1} C. Depending on your application area, there are a few classes of preconditioners that you might consider. These include the least-squares commutator, physics-based approximate commutator, SIMPLE(R), and DD and multigrid methods applied directly to the indefinite problem. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From u.tabak at tudelft.nl Wed Aug 14 15:56:50 2013 From: u.tabak at tudelft.nl (Umut Tabak) Date: Wed, 14 Aug 2013 22:56:50 +0200 Subject: [petsc-users] Iterative solution for schur complement In-Reply-To: <87y584gga3.fsf@mcs.anl.gov> References: <520BDD06.1030805@tudelft.nl> <87y584gga3.fsf@mcs.anl.gov> Message-ID: <520BEF12.9030405@tudelft.nl> On 08/14/2013 10:37 PM, Jed Brown wrote: > Umut Tabak writes: > >> Dear all, >> >> I am looking at a system where I am trying to investigate this >> ill-conditioned problem with some iterative tricks or not. Namely, the >> system that I try to solve is >> >> (B - C^T A^{-1}C) x2 = b2 >> >> which results from block symmetric representation >> >> A C >> C^T B > What physics do you have here? Hi Jed, 'A' results from the discretization of structural field equations which is also ill-conditioned. More specifically, it is (Ks-a*Ms) where Ks and Ms are stiffness and mass matrices of the structural domain. 
However, 'B' results from the discretization of the Helmholtz operator for the fluid domain. It is also similarly represented as (Kf-a*Mf) as above. > > Both CG and MINRES require an SPD preconditioner. It sounds like B is a > poor approximation to the Schur complement S = B - C^T A^{-1} C. > Depending on your application area, there are a few classes of > preconditioners that you might consider. These include the > least-squares commutator, physics-based approximate commutator, > SIMPLE(R), and DD and multigrid methods applied directly to the > indefinite problem. Unfortunaltely, yes, even if I have the complete factor for B and even if C is a pretty sparse matrix, this is not a good preconditioner eventually, that is clear to me as well. Before leaving these ideas, I am trying to convince myselft that this idea is not useful and cannot be improved further. But, as a poor engineer ;), I had the feeling that since the fluid part only includes one variable which is the pressure and the domain is homogeneous, I would expect some better ways to exist in order to solve this problem. Since the domain is homogeneous, at least the fluid domain, and it is modelled with a scalar variable, I was thinking that scaling should not be a problem. But, there is another important point, due to the modelling approach used Kf is a singular matrix with one zero eigenvalue(and this is always the case for a specific type of boundary condition which is the hard wall condition) and Mf is pretty well conditioned as a standalone matrix. The source of the problem is writing representing B as (Kf-a*Mf) or as (Kf/a-Mf) in the original block diagonal representation. Can you figure out something more after these explanations? What would you suggest as a first try and, maybe, a couple of more? Thanks, Umut From knepley at gmail.com Wed Aug 14 16:01:56 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 14 Aug 2013 16:01:56 -0500 Subject: [petsc-users] Iterative solution for schur complement In-Reply-To: <520BEF12.9030405@tudelft.nl> References: <520BDD06.1030805@tudelft.nl> <87y584gga3.fsf@mcs.anl.gov> <520BEF12.9030405@tudelft.nl> Message-ID: On Wed, Aug 14, 2013 at 3:56 PM, Umut Tabak wrote: > On 08/14/2013 10:37 PM, Jed Brown wrote: > >> Umut Tabak writes: >> >> Dear all, >>> >>> I am looking at a system where I am trying to investigate this >>> ill-conditioned problem with some iterative tricks or not. Namely, the >>> system that I try to solve is >>> >>> (B - C^T A^{-1}C) x2 = b2 >>> >>> which results from block symmetric representation >>> >>> A C >>> C^T B >>> >> What physics do you have here? >> > Hi Jed, > > 'A' results from the discretization of structural field equations which is > also ill-conditioned. More specifically, it is (Ks-a*Ms) where Ks and Ms > are stiffness and mass matrices of the structural domain. > However, 'B' results from the discretization of the Helmholtz operator for > the fluid domain. It is also similarly represented as (Kf-a*Mf) as above. > Okay, this is fluid-structure interaction. Why not start with the multiplicative combination in PCFIELDSPLIT first? Then you can move to Schur complement with just an option if you figure out a good preconditioner? Matt > >> Both CG and MINRES require an SPD preconditioner. It sounds like B is a >> poor approximation to the Schur complement S = B - C^T A^{-1} C. >> Depending on your application area, there are a few classes of >> preconditioners that you might consider. 
These include the >> least-squares commutator, physics-based approximate commutator, >> SIMPLE(R), and DD and multigrid methods applied directly to the >> indefinite problem. >> > Unfortunaltely, yes, even if I have the complete factor for B and even if > C is a pretty sparse matrix, this is not a good preconditioner eventually, > that is clear to me as well. > > Before leaving these ideas, I am trying to convince myselft that this idea > is not useful and cannot be improved further. But, as a poor engineer ;), I > had the feeling that since the fluid part only includes one variable which > is the pressure and the domain is homogeneous, I would expect some better > ways to exist in order to solve this problem. > > Since the domain is homogeneous, at least the fluid domain, and it is > modelled with a scalar variable, I was thinking that scaling should not be > a problem. > > But, there is another important point, due to the modelling approach used > Kf is a singular matrix with one zero eigenvalue(and this is always the > case for a specific type of boundary condition which is the hard wall > condition) and Mf is pretty well conditioned as a standalone matrix. The > source of the problem is writing representing B as (Kf-a*Mf) or as > (Kf/a-Mf) in the original block diagonal representation. > > Can you figure out something more after these explanations? What would you > suggest as a first try and, maybe, a couple of more? > Thanks, > Umut > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From u.tabak at tudelft.nl Wed Aug 14 16:06:37 2013 From: u.tabak at tudelft.nl (Umut Tabak) Date: Wed, 14 Aug 2013 23:06:37 +0200 Subject: [petsc-users] Iterative solution for schur complement In-Reply-To: References: <520BDD06.1030805@tudelft.nl> <87y584gga3.fsf@mcs.anl.gov> <520BEF12.9030405@tudelft.nl> Message-ID: <520BF15D.8010306@tudelft.nl> On 08/14/2013 11:01 PM, Matthew Knepley wrote: > > > Okay, this is fluid-structure interaction. Why not start with the > multiplicative combination in PCFIELDSPLIT first? Then you > can move to Schur complement with just an option if you figure out a > good preconditioner? Yes, related to vibro-acoustic coupling, I guess I should read the section for 'Solving Block Matrices' right? -------------- next part -------------- An HTML attachment was scrubbed... URL: From ztdepyahoo at 163.com Thu Aug 15 01:30:49 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Thu, 15 Aug 2013 14:30:49 +0800 (CST) Subject: [petsc-users] some problem with the VecCreateGhost. Message-ID: <77a7d69e.9952.14080ab5eb7.Coremail.ztdepyahoo@163.com> I want to know 1. What is the relationship between Vhat and LocalVhat. Is the LocalVhat a copy of the Vhat plus the Ghost values. I think the LocalVhat will consume more memeory than the Vhat, am i right? 2. can i direct call VecGetArrary to the Vhat if i do not operate on the ghost value. Vec Vhat; Vec LocalVhat; double* VhatVec; VecCreateGhost(PETSC_COMM_WORLD,aMesh->Nx*aMesh->Ny/Commsize,PETSC_DECIDE,aMesh->nghosts,&aMesh->ghosts[0],&Vhat); VecGhostGetLocalForm(Vhat,&LocalVhat); VecGhostUpdateBegin(Vhat,INSERT_VALUES,SCATTER_FORWARD); VecGhostUpdateEnd(Vhat,INSERT_VALUES,SCATTER_FORWARD); -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Wadud.Miah at awe.co.uk Thu Aug 15 04:31:35 2013 From: Wadud.Miah at awe.co.uk (Wadud.Miah at awe.co.uk) Date: Thu, 15 Aug 2013 09:31:35 +0000 Subject: [petsc-users] -no_signal_handler flag Message-ID: <201308150931.r7F9VhnG024224@msw2.awe.co.uk> Hello, Would it be possible to build PETSc so that the flag -no_signal_handler does not have to specified at run time? Or is there a PETSc subroutine that can be called to implement the -no_signal_handler feature? Many thanks, -------------------------- Wadud Miah HPC, Design and Theoretical Physics Direct: 0118 98 56220 AWE, Aldermaston, Reading, RG7 4PR ___________________________________________________ ____________________________ The information in this email and in any attachment(s) is commercial in confidence. If you are not the named addressee(s) or if you receive this email in error then any distribution, copying or use of this communication or the information in it is strictly prohibited. Please notify us immediately by email at admin.internet(at)awe.co.uk, and then delete this message from your computer. While attachments are virus checked, AWE plc does not accept any liability in respect of any virus which is not detected. AWE Plc Registered in England and Wales Registration No 02763902 AWE, Aldermaston, Reading, RG7 4PR -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Aug 15 06:12:19 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 15 Aug 2013 06:12:19 -0500 Subject: [petsc-users] some problem with the VecCreateGhost. In-Reply-To: <77a7d69e.9952.14080ab5eb7.Coremail.ztdepyahoo@163.com> References: <77a7d69e.9952.14080ab5eb7.Coremail.ztdepyahoo@163.com> Message-ID: On Thu, Aug 15, 2013 at 1:30 AM, ??? wrote: > I want to know > 1. What is the relationship between Vhat and LocalVhat. Is the LocalVhat a > copy of the Vhat plus the Ghost values. I think the LocalVhat will consume > more memeory than the Vhat, am i right? > No, the reuse the same memory. > 2. can i direct call VecGetArrary to the Vhat if i do not operate on the > ghost value. > Yes. Matt > Vec Vhat; > Vec LocalVhat; > double* VhatVec; > > > VecCreateGhost(PETSC_COMM_WORLD,aMesh->Nx*aMesh->Ny/Commsize,PETSC_DECIDE,aMesh->nghosts,&aMesh->ghosts[0],&Vhat); &nb > sp; > VecGhostGetLocalForm(Vhat,&LocalVhat); > VecGhostUpdateBegin(Vhat,INSERT_VALUES,SCATTER_FORWARD); > VecGhostUpdateEnd(Vhat,INSERT_VALUES,SCATTER_FORWARD); > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Aug 15 06:13:28 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 15 Aug 2013 06:13:28 -0500 Subject: [petsc-users] -no_signal_handler flag In-Reply-To: <201308150931.r7F9VhnG024224@msw2.awe.co.uk> References: <201308150931.r7F9VhnG024224@msw2.awe.co.uk> Message-ID: On Thu, Aug 15, 2013 at 4:31 AM, wrote: > ****** > > Hello,**** > > ** ** > > Would it be possible to build PETSc so that the flag -no_signal_handler > does not have to specified at run time? Or is there a PETSc subroutine that > can be called to implement the -no_signal_handler feature? 
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscPopSignalHandler.html Matt > > > Many thanks,**** > > ** ** > > *--------------------------***** > > *Wadud Miah* > *HPC, Design and Theoretical Physics** > *Direct: 0118 98 56220 > AWE, Aldermaston, ****Reading**, ** RG7 4PR******** > > **** > > ** ** > > ___________________________________________________ > ____________________________ The information in this email and in any > attachment(s) is commercial in confidence. If you are not the named > addressee(s) or if you receive this email in error then any distribution, > copying or use of this communication or the information in it is strictly > prohibited. Please notify us immediately by email at admin.internet(at) > awe.co.uk, and then delete this message from your computer. While > attachments are virus checked, AWE plc does not accept any liability in > respect of any virus which is not detected. AWE Plc Registered in England > and Wales Registration No 02763902 AWE, Aldermaston, Reading, RG7 4PR > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Thu Aug 15 06:13:52 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 15 Aug 2013 06:13:52 -0500 Subject: [petsc-users] some problem with the VecCreateGhost. In-Reply-To: <77a7d69e.9952.14080ab5eb7.Coremail.ztdepyahoo@163.com> References: <77a7d69e.9952.14080ab5eb7.Coremail.ztdepyahoo@163.com> Message-ID: <878v03fbpr.fsf@mcs.anl.gov> ??? writes: > I want to know > 1. What is the relationship between Vhat and LocalVhat. Is the > LocalVhat a copy of the Vhat plus the Ghost values. I think the > LocalVhat will consume more memeory than the Vhat, am i right? Logically, you should think of them as separate vectors. The global form contains only the global entries (non-overlapping partition) and the local form also contains the ghosted entries. If you want to operate strictly on the global entries, you can access the global form directly. Implementation-wise, the vectors share memory for the global part. This means that if you build a Krylov space with VecGhost, you have allocated storage for the ghosted entries for every Krylov vector, even though the algorithm will never use them. VecGhost is premature "optimization" ("pessimization") in many cases, so consider just using a local work vector and performing a scatter from global to local. It's probably much cheaper than you think. > 2. can i direct call VecGetArrary to the Vhat if i do not operate on the ghost value. > > > > Vec Vhat; > Vec LocalVhat; > double* VhatVec; > > VecCreateGhost(PETSC_COMM_WORLD,aMesh->Nx*aMesh->Ny/Commsize,PETSC_DECIDE,aMesh->nghosts,&aMesh->ghosts[0],&Vhat); > VecGhostGetLocalForm(Vhat,&LocalVhat); Update, _then_ get the local form. Remember to restore the local form when you are done with it. > VecGhostUpdateBegin(Vhat,INSERT_VALUES,SCATTER_FORWARD); > VecGhostUpdateEnd(Vhat,INSERT_VALUES,SCATTER_FORWARD); -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From Wadud.Miah at awe.co.uk Thu Aug 15 07:42:57 2013 From: Wadud.Miah at awe.co.uk (Wadud.Miah at awe.co.uk) Date: Thu, 15 Aug 2013 12:42:57 +0000 Subject: [petsc-users] EXTERNAL: Re: -no_signal_handler flag In-Reply-To: References: <201308150931.r7F9VhnG024224@msw2.awe.co.uk> Message-ID: <201308151243.r7FCh3Y6025034@msw2.awe.co.uk> Hi Matthew, Thanks for the info. Would the PetscPopSignalHandler() routine be called immediately after PetscInitialize() ? Regards, Wadud. ________________________________ From: Matthew Knepley [mailto:knepley at gmail.com] Sent: 15 August 2013 12:13 To: Miah Wadud AWE Cc: petsc-users at mcs.anl.gov Subject: EXTERNAL: Re: [petsc-users] -no_signal_handler flag On Thu, Aug 15, 2013 at 4:31 AM, > wrote: Hello, Would it be possible to build PETSc so that the flag -no_signal_handler does not have to specified at run time? Or is there a PETSc subroutine that can be called to implement the -no_signal_handler feature? http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscPopSignalHandler.html Matt Many thanks, -------------------------- Wadud Miah HPC, Design and Theoretical Physics Direct: 0118 98 56220 AWE, Aldermaston, Reading, RG7 4PR ___________________________________________________ ____________________________ The information in this email and in any attachment(s) is commercial in confidence. If you are not the named addressee(s) or if you receive this email in error then any distribution, copying or use of this communication or the information in it is strictly prohibited. Please notify us immediately by email at admin.internet(at)awe.co.uk, and then delete this message from your computer. While attachments are virus checked, AWE plc does not accept any liability in respect of any virus which is not detected. AWE Plc Registered in England and Wales Registration No 02763902 AWE, Aldermaston, Reading, RG7 4PR -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener ___________________________________________________ ____________________________ The information in this email and in any attachment(s) is commercial in confidence. If you are not the named addressee(s) or if you receive this email in error then any distribution, copying or use of this communication or the information in it is strictly prohibited. Please notify us immediately by email at admin.internet(at)awe.co.uk, and then delete this message from your computer. While attachments are virus checked, AWE plc does not accept any liability in respect of any virus which is not detected. AWE Plc Registered in England and Wales Registration No 02763902 AWE, Aldermaston, Reading, RG7 4PR -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Aug 15 07:55:49 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 15 Aug 2013 07:55:49 -0500 Subject: [petsc-users] EXTERNAL: Re: -no_signal_handler flag In-Reply-To: <201308151243.r7FCh3Y6025034@msw2.awe.co.uk> References: <201308150931.r7F9VhnG024224@msw2.awe.co.uk> <201308151243.r7FCh3Y6025034@msw2.awe.co.uk> Message-ID: On Thu, Aug 15, 2013 at 7:42 AM, wrote: > ********** > > Hi Matthew,**** > > ** ** > > Thanks for the info. Would the PetscPopSignalHandler() routine be called > immediately after PetscInitialize() ? 
> Yes Matt > > > Regards,**** > > Wadud.**** > > ** ** > ------------------------------ > > *From:* Matthew Knepley [mailto:knepley at gmail.com] > *Sent:* 15 August 2013 12:13 > *To:* **Miah Wadud AWE** > *Cc:* petsc-users at mcs.anl.gov > *Subject:* EXTERNAL: Re: [petsc-users] -no_signal_handler flag**** > > ** ** > > On Thu, Aug 15, 2013 at 4:31 AM, wrote:**** > > Hello,**** > > **** > > Would it be possible to build PETSc so that the flag -no_signal_handler > does not have to specified at run time? Or is there a PETSc subroutine that > can be called to implement the -no_signal_handler feature?**** > > ** ** > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscPopSignalHandler.html > **** > > ** ** > > Matt**** > > **** > > **** > > Many thanks,**** > > **** > > *--------------------------***** > > *Wadud Miah* > *HPC, Design and Theoretical Physics** > *Direct: 0118 98 56220 > AWE, Aldermaston, ****Reading**, ** RG7 4PR******** > > **** > > **** > > ___________________________________________________ > ____________________________ The information in this email and in any > attachment(s) is commercial in confidence. If you are not the named > addressee(s) or if you receive this email in error then any distribution, > copying or use of this communication or the information in it is strictly > prohibited. Please notify us immediately by email at admin.internet(at) > awe.co.uk, and then delete this message from your computer. While > attachments are virus checked, AWE plc does not accept any liability in > respect of any virus which is not detected. AWE Plc Registered in ** > England** and ** Wales** Registration No 02763902 AWE, Aldermaston, ** ** > Reading**, **RG7 4PR**** **** > > > > **** > > ** ** > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener **** > > ___________________________________________________ > ____________________________ The information in this email and in any > attachment(s) is commercial in confidence. If you are not the named > addressee(s) or if you receive this email in error then any distribution, > copying or use of this communication or the information in it is strictly > prohibited. Please notify us immediately by email at admin.internet(at) > awe.co.uk, and then delete this message from your computer. While > attachments are virus checked, AWE plc does not accept any liability in > respect of any virus which is not detected. AWE Plc Registered in England > and Wales Registration No 02763902 AWE, Aldermaston, Reading, RG7 4PR > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From ztdepyahoo at 163.com Thu Aug 15 09:12:20 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Thu, 15 Aug 2013 22:12:20 +0800 (CST) Subject: [petsc-users] some problem with the VecCreateGhost. In-Reply-To: <878v03fbpr.fsf@mcs.anl.gov> References: <77a7d69e.9952.14080ab5eb7.Coremail.ztdepyahoo@163.com> <878v03fbpr.fsf@mcs.anl.gov> Message-ID: <6c69226a.114c5.1408251e730.Coremail.ztdepyahoo@163.com> What the function of "restore the local form", can i use VecDestroy to free the local form only; ? 2013-08-15 19:13:52?"Jed Brown" ??? >??? writes: > >> I want to know >> 1. 
What is the relationship between Vhat and LocalVhat. Is the >> LocalVhat a copy of the Vhat plus the Ghost values. I think the >> LocalVhat will consume more memeory than the Vhat, am i right? > >Logically, you should think of them as separate vectors. The global >form contains only the global entries (non-overlapping partition) and >the local form also contains the ghosted entries. If you want to >operate strictly on the global entries, you can access the global form >directly. > >Implementation-wise, the vectors share memory for the global part. This >means that if you build a Krylov space with VecGhost, you have allocated >storage for the ghosted entries for every Krylov vector, even though the >algorithm will never use them. VecGhost is premature "optimization" >("pessimization") in many cases, so consider just using a local work >vector and performing a scatter from global to local. It's probably >much cheaper than you think. > >> 2. can i direct call VecGetArrary to the Vhat if i do not operate on the ghost value. >> >> >> >> Vec Vhat; >> Vec LocalVhat; >> double* VhatVec; >> >> VecCreateGhost(PETSC_COMM_WORLD,aMesh->Nx*aMesh->Ny/Commsize,PETSC_DECIDE,aMesh->nghosts,&aMesh->ghosts[0],&Vhat); >> VecGhostGetLocalForm(Vhat,&LocalVhat); > >Update, _then_ get the local form. Remember to restore the local form >when you are done with it. > >> VecGhostUpdateBegin(Vhat,INSERT_VALUES,SCATTER_FORWARD); >> VecGhostUpdateEnd(Vhat,INSERT_VALUES,SCATTER_FORWARD); -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Thu Aug 15 09:25:15 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 15 Aug 2013 09:25:15 -0500 Subject: [petsc-users] some problem with the VecCreateGhost. In-Reply-To: <6c69226a.114c5.1408251e730.Coremail.ztdepyahoo@163.com> References: <77a7d69e.9952.14080ab5eb7.Coremail.ztdepyahoo@163.com> <878v03fbpr.fsf@mcs.anl.gov> <6c69226a.114c5.1408251e730.Coremail.ztdepyahoo@163.com> Message-ID: <87li43doac.fsf@mcs.anl.gov> ??? writes: > What the function of "restore the local form", It synchronizes state. > can i use VecDestroy to free the local form only; No, use VecGhostRestoreLocalForm. Also, do not perform operations with the global Vec before restoring the local form. -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From Shuangshuang.Jin at pnnl.gov Thu Aug 15 13:57:24 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Thu, 15 Aug 2013 11:57:24 -0700 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <877gfpm47j.fsf@mcs.anl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> Hi, Jed, I followed your suggestion and profiled the IJacobian stage, please see the related profile below: Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 4.0670e+01 11.5% 1.2019e+11 100.0% 1.082e+07 100.0% 5.757e+02 100.0% 6.386e+04 81.6% 1: My IJacobian stage: 3.1379e+02 88.5% 0.0000e+00 0.0% 1.984e+03 0.0% 1.357e-02 0.0% 1.438e+04 18.4% Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %f %M %L %R %T %f %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 1: My IJacobian stage VecSet 1797 1.0 4.7467e-02 1.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 1796 1.0 4.3967e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+03 0 0 0 0 2 0 0 0 0 12 0 MatAssemblyBegin 1796 1.0 7.2787e+00 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+03 2 0 0 0 5 2 0 0 0 25 0 MatAssemblyEnd 1796 1.0 2.1605e-01 1.1 0.00e+00 0.0 2.0e+03 7.4e+01 1.8e+03 0 0 0 0 2 0 0100100 13 0 Object Type Creations Destructions Memory Descendants' Mem. --- Event Stage 1: My IJacobian stage Vector 1798 1 1552 0 Vector Scatter 1797 1796 1156624 0 Index Set 1798 1798 1370952 0 It seems that IJacobian occupies 88.5% of the total computation time. Anything else can you interpret from the profile which can help me to accelerate the IJacobian computation? Thanks, Shuangshuang -----Original Message----- From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown Sent: Tuesday, August 13, 2013 6:50 PM To: Jin, Shuangshuang; Shri Cc: Barry Smith; petsc-users at mcs.anl.gov Subject: RE: [petsc-users] Performance of PETSc TS solver "Jin, Shuangshuang" writes: > Hi, Shri, > > From the log_summary, we can see that the TSJacobianEval/SNESJacobianEval dominates the computation time as you mentioned. > > Event Count Time (sec) Fl -------------- next part -------------- A non-text attachment was scrubbed... Name: job.out.3717091 Type: application/octet-stream Size: 21740 bytes Desc: job.out.3717091 URL: From mcgrory at aerosoftinc.com Thu Aug 15 16:07:07 2013 From: mcgrory at aerosoftinc.com (Bill McGrory) Date: Thu, 15 Aug 2013 17:07:07 -0400 Subject: [petsc-users] Narrowing down "Matrix is missing diagonal entry" errors Message-ID: <520D42FB.8080401@aerosoftinc.com> I have a large legacy code that I am in the process of porting to use PETSc. my software is a multi-block structured Navier Stokes solver. I have successfully run a number of our verification problems using the KSP solver to invert my linearized problem, and this works in parallel, with fairly complex multi-block topologies. However, when I take a certain problem, and change the load balance, (distribution of nodes) for a different set of cores, then I suddenly get the dreaded "Matrix is missing diagonal entry" errors. 
It is my understanding from the documentation, and list archives that this means the I have a pre-allocated main diagonal entry that is missing. But I can't find it simply by code review. So, I need to try to glean a little more diagnostic information from the error output. Note, that I get this error, whether I run 3.0, 3.3 or 3.4, this message simply from the 3.0 package my matrix is a block aij matrix. so when I interpret the row index for this message... this is the local row element, not the global element, correct? and it is the block row element? so I can convert eh value 88023 to a given structured subset, and the local i,j,k index for that subset, I don't need to divide that value by the blocksize? Finally, is there a way for me to query the matrix after it is assembled so that I can see whether or not I have for example, row,column entry 88023,88023? Thank you for you assistance Bill McGrory [8]PETSC ERROR: --------------------- Error Message ------------------------------------ [8]PETSC ERROR: Object is in wrong state! [8]PETSC ERROR: Matrix is missing diagonal entry in row 88023! [8]PETSC ERROR: ------------------------------------------------------------------------ [8]PETSC ERROR: Petsc Release Version 3.0.0, Patch 10, Tue Nov 24 16:38:09 CST 2009 [8]PETSC ERROR: See docs/changes/index.html for recent updates. [8]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [8]PETSC ERROR: See docs/index.html for manual pages. [8]PETSC ERROR: ------------------------------------------------------------------------ [8]PETSC ERROR: /usr/Develop/Hocus/mcgrory/GASP52/bin/ubuntu64/gasp on a linux-gnu named node33 by mcgrory Thu Aug 15 15:58:10 2013 [8]PETSC ERROR: Libraries linked from /build/buildd/petsc-3.0.0.dfsg/linux-gnu-c-opt/lib [8]PETSC ERROR: Configure run at Thu Dec 31 09:53:16 2009 [8]PETSC ERROR: Configure options --with-shared --with-debugging=0 --useThreads 0 --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1 --with-blas-lib=-lblas-3gf --with-lapack-lib=-llapackgf-3 --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-superlu=1 --with-superlu-include=/usr/include/superlu --with-superlu-lib=/usr/lib/libsuperlu.so --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-scotch=1 --with-scotch-include=/usr/include/scotch --with-scotch-lib=/usr/lib/libscotch.so [8]PETSC ERROR: ------------------------------------------------------------------------ [8]PETSC ERROR: MatILUFactorSymbolic_SeqBAIJ() line 3118 in src/mat/impls/baij/seq/baijfact2.c [8]PETSC ERROR: MatILUFactorSymbolic() line 5314 in src/mat/interface/matrix.c [8]PETSC ERROR: PCSetUp_ILU() line 293 in src/ksp/pc/impls/factor/ilu/ilu.c [8]PETSC ERROR: PCSetUp() line 794 in src/ksp/pc/interface/precon.c [8]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c [8]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 753 in src/ksp/pc/impls/bjacobi/bjacobi.c [8]PETSC ERROR: PCSetUpOnBlocks() line 827 in src/ksp/pc/interface/precon.c [8]PETSC ERROR: KSPSetUpOnBlocks() line 159 in src/ksp/ksp/interface/itfunc.c [8]PETSC ERROR: KSPSolve() line 354 in src/ksp/ksp/interface/itfunc.c -------------- next part -------------- A non-text attachment was scrubbed... 
Name: smime.p7s Type: application/pkcs7-signature Size: 5396 bytes Desc: S/MIME Cryptographic Signature URL: From ztdepyahoo at 163.com Thu Aug 15 20:34:42 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Fri, 16 Aug 2013 09:34:42 +0800 (CST) Subject: [petsc-users] How to use jacobi or Gauss-seilder iteration to solve problem Message-ID: KSPSetType(ksp,?); -------------- next part -------------- An HTML attachment was scrubbed... URL: From rupp at mcs.anl.gov Thu Aug 15 20:40:52 2013 From: rupp at mcs.anl.gov (Karl Rupp) Date: Thu, 15 Aug 2013 20:40:52 -0500 Subject: [petsc-users] How to use jacobi or Gauss-seilder iteration to solve problem In-Reply-To: References: Message-ID: <520D8324.9010002@mcs.anl.gov> Hi, > > KSPSetType(ksp,?); Jacobi and Gauss-Seidel are available as preconditioners. To use them as standalone solvers, make the linear solver run the preconditioner only via KSPPREONLY from here: http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPType.html and then set the preconditioner (e.g. PCJACOBI) via PCSetType(): http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCType.html Best regards, Karli From bsmith at mcs.anl.gov Thu Aug 15 20:41:28 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 15 Aug 2013 20:41:28 -0500 Subject: [petsc-users] How to use jacobi or Gauss-seilder iteration to solve problem In-Reply-To: References: Message-ID: KSPSetType(ksp,KSPRICHARDSON); On Aug 15, 2013, at 8:34 PM, ??? wrote: > > KSPSetType(ksp,?); > > > From bsmith at mcs.anl.gov Thu Aug 15 20:43:53 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 15 Aug 2013 20:43:53 -0500 Subject: [petsc-users] How to use jacobi or Gauss-seilder iteration to solve problem In-Reply-To: <520D8324.9010002@mcs.anl.gov> References: <520D8324.9010002@mcs.anl.gov> Message-ID: <29C6BED0-A240-49A4-9652-05A0DA0381AE@mcs.anl.gov> On Aug 15, 2013, at 8:40 PM, Karl Rupp wrote: > Hi, > > > > KSPSetType(ksp,?); > > Jacobi and Gauss-Seidel are available as preconditioners. To use them as standalone solvers, make the linear solver run the preconditioner only via KSPPREONLY from here: > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPType.html > and then set the preconditioner (e.g. PCJACOBI) via PCSetType(): > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCType.html This won't work for jacobi (it will only do one iteration) and will only work for sor if you use PCSetSORIterations() instead of -ksp_max_its Barry > > > Best regards, > Karli From rupp at mcs.anl.gov Thu Aug 15 20:45:16 2013 From: rupp at mcs.anl.gov (Karl Rupp) Date: Thu, 15 Aug 2013 20:45:16 -0500 Subject: [petsc-users] How to use jacobi or Gauss-seilder iteration to solve problem In-Reply-To: <29C6BED0-A240-49A4-9652-05A0DA0381AE@mcs.anl.gov> References: <520D8324.9010002@mcs.anl.gov> <29C6BED0-A240-49A4-9652-05A0DA0381AE@mcs.anl.gov> Message-ID: <520D842C.2080304@mcs.anl.gov> Hey, >> >>>> KSPSetType(ksp,?); >> >> Jacobi and Gauss-Seidel are available as preconditioners. To use them as standalone solvers, make the linear solver run the preconditioner only via KSPPREONLY from here: >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPType.html >> and then set the preconditioner (e.g. 
PCJACOBI) via PCSetType(): >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCType.html > > This won't work for jacobi (it will only do one iteration) and will only work for sor if you use PCSetSORIterations() instead of -ksp_max_its Hmm, should have tried it first... :-/ Thanks for the correction. Best regards, Karli From jedbrown at mcs.anl.gov Thu Aug 15 21:27:01 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 15 Aug 2013 21:27:01 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> Message-ID: <87fvuabcay.fsf@mcs.anl.gov> "Jin, Shuangshuang" writes: > Hi, Jed, > > I followed your suggestion and profiled the IJacobian stage, please see the related profile below: Cool, all of these are pretty inexpensive, so your time is probably in computation. Are all data structures distributed? Is there any work that you do redundantly or does each core only compute its local part? > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > 0: Main Stage: 4.0670e+01 11.5% 1.2019e+11 100.0% 1.082e+07 100.0% 5.757e+02 100.0% 6.386e+04 81.6% > 1: My IJacobian stage: 3.1379e+02 88.5% 0.0000e+00 0.0% 1.984e+03 0.0% 1.357e-02 0.0% 1.438e+04 18.4% > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %f %M %L %R %T %f %M %L %R Mflop/s > ------------------------------------------------------------------------------------------------------------------------ > --- Event Stage 1: My IJacobian stage > > VecSet 1797 1.0 4.7467e-02 1.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecScatterBegin 1796 1.0 4.3967e-01 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.8e+03 0 0 0 0 2 0 0 0 0 12 0 > MatAssemblyBegin 1796 1.0 7.2787e+00 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 3.6e+03 2 0 0 0 5 2 0 0 0 25 0 > MatAssemblyEnd 1796 1.0 2.1605e-01 1.1 0.00e+00 0.0 2.0e+03 7.4e+01 1.8e+03 0 0 0 0 2 0 0100100 13 0 > > Object Type Creations Destructions Memory Descendants' Mem. > --- Event Stage 1: My IJacobian stage > > Vector 1798 1 1552 0 > Vector Scatter 1797 1796 1156624 0 > Index Set 1798 1798 1370952 0 > > It seems that IJacobian occupies 88.5% of the total computation time. Anything else can you interpret from the profile which can help me to accelerate the IJacobian computation? > > Thanks, > Shuangshuang > > > > -----Original Message----- > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > Sent: Tuesday, August 13, 2013 6:50 PM > To: Jin, Shuangshuang; Shri > Cc: Barry Smith; petsc-users at mcs.anl.gov > Subject: RE: [petsc-users] Performance of PETSc TS solver > > "Jin, Shuangshuang" writes: > >> Hi, Shri, >> >> From the log_summary, we can see that the TSJacobianEval/SNESJacobianEval dominates the computation time as you mentioned. >> >> Event Count Time (sec) Fl -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From solvercorleone at gmail.com Fri Aug 16 02:11:05 2013 From: solvercorleone at gmail.com (Cong Li) Date: Fri, 16 Aug 2013 16:11:05 +0900 Subject: [petsc-users] Should I synchronize processes explicitly, for instance using MPI_Barrier? Message-ID: Hi I am a rookie to PETSc, and I am wondering about whether I should call MPI_Barrier to explicitly synchronize processes between PETSc calls. For example, a piece of code like below ierr = MatMult(A,x,Ax); CHKERRQ(ierr); ierr = VecWAXPY(r, neg_one, Ax, b);CHKERRQ(ierr); Should I add MPI_Barrier inbetween MatMult call and VecWAXPY call ? My guess is that it is unnecessary. However, I am not so confident in this guess, for I don't have much experience on using PETSc. Thanks in advance. Cong Li -------------- next part -------------- An HTML attachment was scrubbed... URL: From ztdepyahoo at 163.com Fri Aug 16 04:50:16 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Fri, 16 Aug 2013 17:50:16 +0800 (CST) Subject: [petsc-users] what is the meaning of matrix file output by matview. Message-ID: <5af9e772.20d81.140868855e9.Coremail.ztdepyahoo@163.com> Matrix Object: 1 MPI processes type: mpiaij 16 40 1 4 7 10 12 15 18 21 23 26 29 32 34 36 38 40 41 1 2 5 2 3 6 3 4 7 4 8 5 6 9 6 7 10 7 8 11 8 12 9 10 13 10 11 14 11 12 15 12 16 13 14 14 15 15 16 16 2.2916666666666665e+00 -1.1458333333333333e+00 -1.1458333333333333e+00 3.8020833333333330e+00 -1.2500000000000000e+00 -1.4062500000000000e+00 3.8020833333333330e+00 -1.1458333333333333e+00 -1.4062500000000000e+00 2.2916666666666665e+00 -1.1458333333333333e+00 3.8020833333333330e+00 -1.4062500000000000e+00 -1.2500000000000000e+00 5.9375000000000000e+00 -1.5625000000000000e+00 -1.5625000000000000e+00 5.9375000000000000e+00 -1.4062500000000000e+00 -1.5625000000000000e+00 3.8020833333333330e+00 -1.2500000000000000e+00 3.8020833333333330e+00 -1.4062500000000000e+00 -1.1458333333333333e+00 5.9375000000000000e+00 -1.5625000000000000e+00 -1.4062500000000000e+00 5.9375000000000000e+00 -1.4062500000000000e+00 -1.4062500000000000e+00 3.8020833333333330e+00 -1.1458333333333333e+00 2.2916666666666665e+00 -1.1458333333333333e+00 3.8020833333333330e+00 -1.2500000000000000e+00 3.8020833333333330e+00 -1.1458333333333333e+00 2.2916666666666665e+00 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 16 05:56:17 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 16 Aug 2013 05:56:17 -0500 Subject: [petsc-users] Should I synchronize processes explicitly, for instance using MPI_Barrier? In-Reply-To: References: Message-ID: On Fri, Aug 16, 2013 at 2:11 AM, Cong Li wrote: > Hi > > I am a rookie to PETSc, and I am wondering about whether I should call > MPI_Barrier to explicitly synchronize processes between PETSc calls. > For example, a piece of code like below > > ierr = MatMult(A,x,Ax); CHKERRQ(ierr); > ierr = VecWAXPY(r, neg_one, Ax, b);CHKERRQ(ierr); > > Should I add MPI_Barrier inbetween MatMult call and VecWAXPY call ? > No Matt > My guess is that it is unnecessary. However, I am not so confident in this > guess, for I don't have much experience on using PETSc. > > Thanks in advance. > > Cong Li > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ztdepyahoo at 163.com Fri Aug 16 06:13:06 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Fri, 16 Aug 2013 19:13:06 +0800 (CST) Subject: [petsc-users] I write the following program, it can converge with 4 processor, but can not converge with 2 , or 3 processor. Could you help me out Message-ID: /* Program usage: mpiexec ex1 [-help] [all PETSc options] */ static char help[] = "Basic vector routines.\n\n"; #include "petscksp.h" #include "petscvec.h" int main(int argc,char **argv) { int N=16; int MyRank; Mat A; PC Pc; KSP ksp; Vec b,x; PetscInitialize(&argc,&argv,(char*)0,help); MPI_Comm_rank(MPI_COMM_WORLD,&MyRank); MatCreate(PETSC_COMM_WORLD,&A); MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N); MatSetFromOptions(A); MatMPIAIJSetPreallocation(A,5,PETSC_NULL,5,PETSC_NULL); VecCreate(PETSC_COMM_WORLD,&b); VecSetSizes(b,PETSC_DECIDE,N); VecSetFromOptions(b); VecDuplicate(b,&x); int row=0; int* col3; int* col4; int* col5; PetscMalloc(3*sizeof(PetscInt),&col3); PetscMalloc(4*sizeof(PetscInt),&col4); PetscMalloc(5*sizeof(PetscInt),&col5); col3[0]=0; col3[1]=1; col3[2]=4; double value3[3]={2.2916666666666665e+00,-1.1458333333333333e+00,-1.1458333333333333e+00}; MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); row=1; col4[0]=0; col4[1]=1; col4[2]=2; col4[3]=5; double value4[4]={-1.1458333333333333e+00,3.8020833333333330e+00,-1.2500000000000000e+00 ,-1.4062500000000000e+00}; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=2; col4[0]=1; col4[1]=2; col4[2]=3; col4[3]=6; value4[0]=-1.2500000000000000e+00; value4[1]=3.8020833333333330e+00; value4[2]=-1.1458333333333333e+00 ; value4[3]=-1.4062500000000000e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=3; col3[0]=2; col3[1]=3; col3[2]=7; value3[0]=-1.1458333333333333e+00; value3[1]=2.2916666666666665e+00; value3[2]=-1.1458333333333333e+00; MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); row=4; col4[0]=0; col4[1]=4; col4[2]=5; col4[3]=8; value4[0]=-1.1458333333333333e+00; value4[1]=3.8020833333333330e+00; value4[2]=-1.4062500000000000e+00; value4[3]=-1.2500000000000000e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=5; col5[0]=1; col5[1]=4; col5[2]=5; col5[3]=6; col5[4]=9; double value5[5]; value5[0]=-1.4062500000000000e+00; value5[1]=-1.4062500000000000e+00; value5[2]=5.9375000000000000e+00; value5[3]=-1.5625000000000000e+00 ; value5[4]=-1.5625000000000000e+00 ; MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); row=6; col5[0]=2; col5[1]=5; col5[2]=6; col5[3]=7; col5[4]=10; value5[0]=-1.4062500000000000e+00; value5[1]=-1.5625000000000000e+00; value5[2]=5.9375000000000000e+00; value5[3]=-1.4062500000000000e+00 ; value5[4]=-1.5625000000000000e+00 ; MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); row=7; col4[0]=3; col4[1]=6; col4[2]=7; col4[3]=11; value4[0]=-1.1458333333333333e+00; value4[1]=-1.4062500000000000e+00; value4[2]=3.8020833333333330e+00; value4[3]=-1.2500000000000000e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=8; col4[0]=4; col4[1]=8; col4[2]=9; col4[3]=12; value4[0]=-1.2500000000000000e+00; value4[1]=3.8020833333333330e+00; value4[2]=-1.4062500000000000e+00; value4[3]=-1.1458333333333333e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=9; col5[0]=5; col5[1]=8; col5[2]=9; col5[3]=10; col5[4]=13; value5[0]=-1.5625000000000000e+00; value5[1]=-1.4062500000000000e+00; value5[2]=5.9375000000000000e+00; value5[3]=-1.5625000000000000e+00 ; value5[4]= -1.4062500000000000e+00; MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); row=10; col5[0]=6; col5[1]=9; 
col5[2]=10; col5[3]=11; col5[4]=14; value5[0]=-1.5625000000000000e+00; value5[1]=-1.5625000000000000e+00; value5[2]=5.9375000000000000e+00; value5[3]=-1.4062500000000000e+00 ; value5[4]= -1.4062500000000000e+00; MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); row=11; col4[0]=7; col4[1]=10; col4[2]=11; col4[3]=15; value4[0]=-1.2500000000000000e+00; value4[1]=-1.4062500000000000e+00; value4[2]=3.8020833333333330e+00; value4[3]=-1.1458333333333333e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=12; col3[0]=8; col3[1]=12; col3[2]=13; value3[0]=-1.1458333333333333e+00; value3[1]=2.2916666666666665e+00; value3[2]=-1.1458333333333333e+00; MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); row=13; col4[0]=9; col4[1]=12; col4[2]=13; col4[3]=14; value4[0]=-1.4062500000000000e+00; value4[1]=-1.1458333333333333e+00; value4[2]=3.8020833333333330e+00; value4[3]=-1.2500000000000000e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=14; col4[0]=10; col4[1]=13; col4[2]=14; col4[3]=15; value4[0]=-1.4062500000000000e+00; value4[1]=-1.2500000000000000e+00; value4[2]=3.8020833333333330e+00; value4[3]=-1.1458333333333333e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=15; col3[0]=11; col3[1]=14; col3[2]=15; value3[0]=-1.1458333333333333e+00; value3[1]=-1.1458333333333333e+00; value3[2]=2.2916666666666665e+00; MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); // MatView(A,PETSC_VIEWER_STDOUT_WORLD); double val[16]={-7.6233779415782715e-04, -3.0457072596705860e-04, 3.0457072596705860e-04, 7.6233779415782715e-04, -4.4764543813290442e-03, -1.6196451741044846e-03, 1.6196451741044846e-03, 4.4764543813290442e-03, -1.9333475373837013e-02, -5.4815619458573189e-03, 5.4815619458573189e-03, 1.9333475373837013e-02, -8.4153777598326651e-02, -1.2883385353962010e-02, 1.2883385353962010e-02, 8.4153777598326651e-02}; int* col16; PetscMalloc(16*sizeof(PetscInt),&col16); for(int i=0;i<16;i++) col16[i]=i; VecSetValues(b,16,col16,val,INSERT_VALUES); VecAssemblyBegin(b); VecAssemblyEnd(b); // VecView(b,PETSC_VIEWER_STDOUT_WORLD); KSPCreate(PETSC_COMM_WORLD,&ksp); KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN); KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER); KSPSetInitialGuessNonzero(ksp,PETSC_FALSE); KSPSetType(ksp,KSPBCGS); // KSPSetType(ksp,KSPLSQR); // KSPSetType(ksp,KSPFGMRES); // KSPSetType(ksp,KSPDGMRES); //KSPSetType(ksp,KSPTCQMR); // KSPSetType(ksp,KSPPREONLY); //KSPGetPC(ksp,&Pc); // PCSetType(Pc,PCJACOBI); KSPSetFromOptions(ksp); KSPSetTolerances(ksp,1.e-20,1.e-20,PETSC_DEFAULT,1000); KSPSolve(ksp,b,x); VecView(x,PETSC_VIEWER_STDOUT_WORLD); PetscFinalize(); return 0; } -------------- next part -------------- An HTML attachment was scrubbed... URL: From solvercorleone at gmail.com Fri Aug 16 06:31:58 2013 From: solvercorleone at gmail.com (Cong Li) Date: Fri, 16 Aug 2013 20:31:58 +0900 Subject: [petsc-users] Should I synchronize processes explicitly, for instance using MPI_Barrier? In-Reply-To: References: Message-ID: Hi, Matthew Thank you very much for the answer. And I met a new problem. When there some PetscScalar type data operations before the VecWAXPY call, for example PetscScalar, a,b, c; ..... ..... c=a+b; ierr = VecWAXPY(r, c, Ax, b);CHKERRQ(ierr); , should I add MPI_Barrier call right before VecWAXPY call ? 
Best regards Cong On Fri, Aug 16, 2013 at 7:56 PM, Matthew Knepley wrote: > On Fri, Aug 16, 2013 at 2:11 AM, Cong Li wrote: > >> Hi >> >> I am a rookie to PETSc, and I am wondering about whether I should call >> MPI_Barrier to explicitly synchronize processes between PETSc calls. >> For example, a piece of code like below >> >> ierr = MatMult(A,x,Ax); CHKERRQ(ierr); >> ierr = VecWAXPY(r, neg_one, Ax, b);CHKERRQ(ierr); >> >> Should I add MPI_Barrier inbetween MatMult call and VecWAXPY call ? >> > > No > > Matt > > >> My guess is that it is unnecessary. However, I am not so confident in >> this guess, for I don't have much experience on using PETSc. >> >> Thanks in advance. >> >> Cong Li >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 16 06:54:01 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 16 Aug 2013 06:54:01 -0500 Subject: [petsc-users] I write the following program, it can converge with 4 processor, but can not converge with 2 , or 3 processor. Could you help me out In-Reply-To: References: Message-ID: On Fri, Aug 16, 2013 at 6:13 AM, ??? wrote: Its common for iterative solvers to converge differently for different PC. You are using block Jacobi-ILU which is different for different numbers of processes. Matt /* Program usage: mpiexec ex1 [-help] [all PETSc options] */ > > static char help[] = "Basic vector routines.\n\n"; > > > #include "petscksp.h" > #include "petscvec.h" > > > > int main(int argc,char **argv) > { > int N=16; > > int MyRank; > > Mat A; > PC Pc; > KSP ksp; > Vec b,x; > > PetscInitialize(&argc,&argv,(char*)0,help); > MPI_Comm_rank(MPI_COMM_WORLD,&MyRank); > > MatCreate(PETSC_COMM_WORLD,&A); > MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N); > MatSetFromOptions(A); > MatMPIAIJSetPreallocation(A,5,PETSC_NULL,5,PETSC_NULL); > > VecCreate(PETSC_COMM_WORLD,&b); > VecSetSizes(b,PETSC_DECIDE,N); > VecSetFromOptions(b); > VecDuplicate(b,&x); > &n bsp; > int row=0; > int* col3; > int* col4; > int* col5; > PetscMalloc(3*sizeof(PetscInt),&col3); > PetscMalloc(4*sizeof(PetscInt),&col4); > PetscMalloc(5*sizeof(PetscInt),&col5); > > col3[0]=0; col3[1]=1; col3[2]=4; > double > value3[3]={2.2916666666666665e+00,-1.1458333333333333e+00,-1.1458333333333333e+00}; > MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); > > row=1; > col4[0]=0; col4[1]=1; col4[2]=2; col4[3]=5; > double > value4[4]={-1.1458333333333333e+00,3.8020833333333330e+00,-1.2500000000000000e+00 > ,-1.4062500000000000e+00}; > MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); > > > row=2; > col4[0]=1; col4[1]=2; col4[2]=3; col4[3]=6; > value4[0]=-1.2500000000000000e+00; > value4[1]=3.8020833333333330e+00; > value4[2]=-1.1458333333333333e+00 ; > &n bsp;value4[3]=-1.4062500000000000e+00; > MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); > > > row=3; > col3[0]=2; col3[1]=3; col3[2]=7; > value3[0]=-1.1458333333333333e+00; > value3[1]=2.2916666666666665e+00; > value3[2]=-1.1458333333333333e+00; > MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); > > > row=4; > col4[0]=0; col4[1]=4; col4[2]=5; col4[3]=8; > value4[0]=-1.1458333333333333e+00; > value4[1]=3.8020833333333330e+00; > value4[2]=-1.4062500000000000e+00; > value4[3]=-1.2500000000000000e+00; > MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); > > row=5; > col5[0]=1; col5[1]=4; col5[2]=5; 
col5[3]=6; col5[4]=9; > double value5[5]; > value5[0]=-1.4062500000000000e+00; > value5[1]=-1.4062500000000000e+00; > value5[2]=5 .9375000000000000e+00; > value5[3]=-1.5625000000000000e+00 ; > value5[4]=-1.5625000000000000e+00 ; > MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); > > > > row=6; > col5[0]=2; col5[1]=5; col5[2]=6; col5[3]=7; col5[4]=10; > value5[0]=-1.4062500000000000e+00; > value5[1]=-1.5625000000000000e+00; > value5[2]=5.9375000000000000e+00; > value5[3]=-1.4062500000000000e+00 ; > value5[4]=-1.5625000000000000e+00 ; > MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); > > row=7; > col4[0]=3; col4[1]=6; col4[2]=7; col4[3]=11; > value4[0]=-1.1458333333333333e+00; > value4[1]=-1.4062500000000000e+00; > value4[2]=3.8020833333333330e+00; > value4[3]=-1.2500000000000000e+00; > MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); > > row=8; > col4[0]=4; col4[1]=8; col4[2]=9; col4[3]=12; > value4[0]=-1.2500000000000000e+00; > value4[1]=3.8020833333333330e+00; > value4[2]=-1.4062500000000000e+00; > value4[3]=-1.1458333333333333e+00; > MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); > > > row=9; > col5[0]=5; col5[1]=8; col5[2]=9; col5[3]=10; col5[4]=13; > value5[0]=-1.5625000000000000e+00; > value5[1]=-1.4062500000000000e+00; > value5[2]=5.9375000000000000e+00; > value5[3]=-1.5625000000000000e+00 ; > value5[4]= -1.4062500000000000e+00; > MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); > > row=10; > col5[0]=6; col5[1]=9; col5[2]=10; col5[3]=11; col5[4]=14; > value5[0]=-1.5625000000000000e+00; > value5[1]=-1.5625000000000000e+00; > value5[2]=5.93750000000000 00e+00; > value5[3]=-1.4062500000000000e+00 ; > value5[4]= -1.4062500000000000e+00; > MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); > > > row=11; > col4[0]=7; col4[1]=10; col4[2]=11; col4[3]=15; > value4[0]=-1.2500000000000000e+00; > value4[1]=-1.4062500000000000e+00; > value4[2]=3.8020833333333330e+00; > value4[3]=-1.1458333333333333e+00; > MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); > > > > > row=12; > col3[0]=8; col3[1]=12; col3[2]=13; > value3[0]=-1.1458333333333333e+00; > value3[1]=2.2916666666666665e+00; > value3[2]=-1.1458333333333333e+00; > MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); > > > row=13; > col4[0]=9; col4[1]=12; col4[2]=13; col4[3]= 14; > value4[0]=-1.4062500000000000e+00; > value4[1]=-1.1458333333333333e+00; > value4[2]=3.8020833333333330e+00; > value4[3]=-1.2500000000000000e+00; > MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); > > > row=14; > col4[0]=10; col4[1]=13; col4[2]=14; col4[3]=15; > value4[0]=-1.4062500000000000e+00; > value4[1]=-1.2500000000000000e+00; > value4[2]=3.8020833333333330e+00; > value4[3]=-1.1458333333333333e+00; > MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); > > > > row=15; > col3[0]=11; col3[1]=14; col3[2]=15; > value3[0]=-1.1458333333333333e+00; > value3[1]=-1.1458333333333333e+00; > value3[2]=2.2916666666666665e+00; > MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); > > > &n bsp; > MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); > MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); > > // MatView(A,PETSC_VIEWER_STDOUT_WORLD); > > > > double val[16]={-7.6233779415782715e-04, > -3.0457072596705860e-04, > 3.0457072596705860e-04, > 7.6233779415782715e-04, > -4.4764543813290442e-03, > -1.6196451741044846e-03, > 1.6196451741044846e-03, > 4.4764543813290442e-03, > -1.9333475373837013e-02, > -5.4815619458573189e-03, > 5.4815619458573189e-03, > 1.9333475373837013e-02, > -8.4153777598326651e-02, > -1.2883385353962010e-02, > 1.2883385353962010e-02, > 
8.4153777598326651e-02}; > > int* col16; > PetscMalloc(16*sizeof(PetscInt),&col16); > for(int i=0;i<16;i++) > col16[i]=i; > > VecSetValues(b,16,col16,val,INSERT_VALUES); > VecAssemblyBegin(b); > VecAssemblyEnd(b); > // > &nbs p; VecView(b,PETSC_VIEWER_STDOUT_WORLD); > > > KSPCreate(PETSC_COMM_WORLD,&ksp); > KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN); > KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER); > KSPSetInitialGuessNonzero(ksp,PETSC_FALSE); > > KSPSetType(ksp,KSPBCGS); > // KSPSetType(ksp,KSPLSQR); > // KSPSetType(ksp,KSPFGMRES); > // KSPSetType(ksp,KSPDGMRES); > //KSPSetType(ksp,KSPTCQMR); > // KSPSetType(ksp,KSPPREONLY); > > //KSPGetPC(ksp,&Pc); > // PCSetType(Pc,PCJACOBI); > KSPSetFromOptions(ksp); > KSPSetTolerances(ksp,1.e-20,1.e-20,PETSC_DEFAULT,1000); > KSPSolve(ksp,b,x); > > > VecView(x,PETSC_VIEWER_STDOUT_WORLD); > > > PetscFinalize(); > return 0; > } > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 16 06:54:30 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 16 Aug 2013 06:54:30 -0500 Subject: [petsc-users] Should I synchronize processes explicitly, for instance using MPI_Barrier? In-Reply-To: References: Message-ID: On Fri, Aug 16, 2013 at 6:31 AM, Cong Li wrote: > Hi, Matthew > Thank you very much for the answer. > > And I met a new problem. > When there some PetscScalar type data operations before the VecWAXPY call, > for example > > PetscScalar, a,b, c; > ..... > ..... > c=a+b; > ierr = VecWAXPY(r, c, Ax, b);CHKERRQ(ierr); > > , should I add MPI_Barrier call right before VecWAXPY call ? > Never put in a barrier. Matt > Best regards > > Cong > > > On Fri, Aug 16, 2013 at 7:56 PM, Matthew Knepley wrote: > >> On Fri, Aug 16, 2013 at 2:11 AM, Cong Li wrote: >> >>> Hi >>> >>> I am a rookie to PETSc, and I am wondering about whether I should call >>> MPI_Barrier to explicitly synchronize processes between PETSc calls. >>> For example, a piece of code like below >>> >>> ierr = MatMult(A,x,Ax); CHKERRQ(ierr); >>> ierr = VecWAXPY(r, neg_one, Ax, b);CHKERRQ(ierr); >>> >>> Should I add MPI_Barrier inbetween MatMult call and VecWAXPY call ? >>> >> >> No >> >> Matt >> >> >>> My guess is that it is unnecessary. However, I am not so confident in >>> this guess, for I don't have much experience on using PETSc. >>> >>> Thanks in advance. >>> >>> Cong Li >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From ztdepyahoo at 163.com Fri Aug 16 07:32:49 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Fri, 16 Aug 2013 20:32:49 +0800 (CST) Subject: [petsc-users] I write the following program, it can converge with 4 processor, but can not converge with 2 , or 3 processor. Could you help me out In-Reply-To: References: Message-ID: <609ec1a3.22c90.140871d25a5.Coremail.ztdepyahoo@163.com> thank you very much! could you please suggest me a robust preconditioner which is independent of the number of processor. ? 
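A note on experimenting with this, independent of Matt's answer below: because the listing calls KSPSetFromOptions(), candidate solvers and preconditioners can be compared at run time without recompiling. A minimal sketch (the binary name ./ex1 is a placeholder):

    mpiexec -n 2 ./ex1 -ksp_type bcgs -pc_type jacobi -ksp_converged_reason
    mpiexec -n 3 ./ex1 -ksp_type gmres -pc_type asm -sub_pc_type ilu -ksp_converged_reason
    mpiexec -n 3 ./ex1 -ksp_type preonly -pc_type redundant -ksp_converged_reason

Point Jacobi and PCREDUNDANT (a redundant direct solve on every process) are among the few standard choices whose preconditioner is identical for any number of processes, though they trade robustness or memory for that property. Note also that the tolerances requested in the listing, rtol = atol = 1.e-20, are below double precision; a KSP can essentially never satisfy them, so the solve will run to the iteration limit no matter which preconditioner is used.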
2013-08-16 19:54:01?"Matthew Knepley" ??? On Fri, Aug 16, 2013 at 6:13 AM, ??? wrote: Its common for iterative solvers to converge differently for different PC. You are using block Jacobi-ILU which is different for different numbers of processes. Matt /* Program usage: mpiexec ex1 [-help] [all PETSc options] */ static char help[] = "Basic vector routines.\n\n"; #include "petscksp.h" #include "petscvec.h" int main(int argc,char **argv) { int N=16; int MyRank; Mat A; PC Pc; KSP ksp; Vec b,x; PetscInitialize(&argc,&argv,(char*)0,help); MPI_Comm_rank(MPI_COMM_WORLD,&MyRank); MatCreate(PETSC_COMM_WORLD,&A); MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N); MatSetFromOptions(A); MatMPIAIJSetPreallocation(A,5,PETSC_NULL,5,PETSC_NULL); VecCreate(PETSC_COMM_WORLD,&b); VecSetSizes(b,PETSC_DECIDE,N); VecSetFromOptions(b); VecDuplicate(b,&x); &n bsp; int row=0; int* col3; int* col4; int* col5; PetscMalloc(3*sizeof(PetscInt),&col3); PetscMalloc(4*sizeof(PetscInt),&col4); PetscMalloc(5*sizeof(PetscInt),&col5); col3[0]=0; col3[1]=1; col3[2]=4; double value3[3]={2.2916666666666665e+00,-1.1458333333333333e+00,-1.1458333333333333e+00}; MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); row=1; col4[0]=0; col4[1]=1; col4[2]=2; col4[3]=5; double value4[4]={-1.1458333333333333e+00,3.8020833333333330e+00,-1.2500000000000000e+00 ,-1.4062500000000000e+00}; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=2; col4[0]=1; col4[1]=2; col4[2]=3; col4[3]=6; value4[0]=-1.2500000000000000e+00; value4[1]=3.8020833333333330e+00; value4[2]=-1.1458333333333333e+00 ; &n bsp;value4[3]=-1.4062500000000000e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=3; col3[0]=2; col3[1]=3; col3[2]=7; value3[0]=-1.1458333333333333e+00; value3[1]=2.2916666666666665e+00; value3[2]=-1.1458333333333333e+00; MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); row=4; col4[0]=0; col4[1]=4; col4[2]=5; col4[3]=8; value4[0]=-1.1458333333333333e+00; value4[1]=3.8020833333333330e+00; value4[2]=-1.4062500000000000e+00; value4[3]=-1.2500000000000000e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=5; col5[0]=1; col5[1]=4; col5[2]=5; col5[3]=6; col5[4]=9; double value5[5]; value5[0]=-1.4062500000000000e+00; value5[1]=-1.4062500000000000e+00; value5[2]=5 .9375000000000000e+00; value5[3]=-1.5625000000000000e+00 ; value5[4]=-1.5625000000000000e+00 ; MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); row=6; col5[0]=2; col5[1]=5; col5[2]=6; col5[3]=7; col5[4]=10; value5[0]=-1.4062500000000000e+00; value5[1]=-1.5625000000000000e+00; value5[2]=5.9375000000000000e+00; value5[3]=-1.4062500000000000e+00 ; value5[4]=-1.5625000000000000e+00 ; MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); row=7; col4[0]=3; col4[1]=6; col4[2]=7; col4[3]=11; value4[0]=-1.1458333333333333e+00; value4[1]=-1.4062500000000000e+00; value4[2]=3.8020833333333330e+00; value4[3]=-1.2500000000000000e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=8; col4[0]=4; col4[1]=8; col4[2]=9; col4[3]=12; value4[0]=-1.2500000000000000e+00; value4[1]=3.8020833333333330e+00; value4[2]=-1.4062500000000000e+00; value4[3]=-1.1458333333333333e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=9; col5[0]=5; col5[1]=8; col5[2]=9; col5[3]=10; col5[4]=13; value5[0]=-1.5625000000000000e+00; value5[1]=-1.4062500000000000e+00; value5[2]=5.9375000000000000e+00; value5[3]=-1.5625000000000000e+00 ; value5[4]= -1.4062500000000000e+00; MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); row=10; col5[0]=6; col5[1]=9; col5[2]=10; col5[3]=11; col5[4]=14; 
value5[0]=-1.5625000000000000e+00; value5[1]=-1.5625000000000000e+00; value5[2]=5.93750000000000 00e+00; value5[3]=-1.4062500000000000e+00 ; value5[4]= -1.4062500000000000e+00; MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); row=11; col4[0]=7; col4[1]=10; col4[2]=11; col4[3]=15; value4[0]=-1.2500000000000000e+00; value4[1]=-1.4062500000000000e+00; value4[2]=3.8020833333333330e+00; value4[3]=-1.1458333333333333e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=12; col3[0]=8; col3[1]=12; col3[2]=13; value3[0]=-1.1458333333333333e+00; value3[1]=2.2916666666666665e+00; value3[2]=-1.1458333333333333e+00; MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); row=13; col4[0]=9; col4[1]=12; col4[2]=13; col4[3]= 14; value4[0]=-1.4062500000000000e+00; value4[1]=-1.1458333333333333e+00; value4[2]=3.8020833333333330e+00; value4[3]=-1.2500000000000000e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=14; col4[0]=10; col4[1]=13; col4[2]=14; col4[3]=15; value4[0]=-1.4062500000000000e+00; value4[1]=-1.2500000000000000e+00; value4[2]=3.8020833333333330e+00; value4[3]=-1.1458333333333333e+00; MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); row=15; col3[0]=11; col3[1]=14; col3[2]=15; value3[0]=-1.1458333333333333e+00; value3[1]=-1.1458333333333333e+00; value3[2]=2.2916666666666665e+00; MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); &n bsp; MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); // MatView(A,PETSC_VIEWER_STDOUT_WORLD); double val[16]={-7.6233779415782715e-04, -3.0457072596705860e-04, 3.0457072596705860e-04, 7.6233779415782715e-04, -4.4764543813290442e-03, -1.6196451741044846e-03, 1.6196451741044846e-03, 4.4764543813290442e-03, -1.9333475373837013e-02, -5.4815619458573189e-03, 5.4815619458573189e-03, 1.9333475373837013e-02, -8.4153777598326651e-02, -1.2883385353962010e-02, 1.2883385353962010e-02, 8.4153777598326651e-02}; int* col16; PetscMalloc(16*sizeof(PetscInt),&col16); for(int i=0;i<16;i++) col16[i]=i; VecSetValues(b,16,col16,val,INSERT_VALUES); VecAssemblyBegin(b); VecAssemblyEnd(b); // &nbs p; VecView(b,PETSC_VIEWER_STDOUT_WORLD); KSPCreate(PETSC_COMM_WORLD,&ksp); KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN); KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER); KSPSetInitialGuessNonzero(ksp,PETSC_FALSE); KSPSetType(ksp,KSPBCGS); // KSPSetType(ksp,KSPLSQR); // KSPSetType(ksp,KSPFGMRES); // KSPSetType(ksp,KSPDGMRES); //KSPSetType(ksp,KSPTCQMR); // KSPSetType(ksp,KSPPREONLY); //KSPGetPC(ksp,&Pc); // PCSetType(Pc,PCJACOBI); KSPSetFromOptions(ksp); KSPSetTolerances(ksp,1.e-20,1.e-20,PETSC_DEFAULT,1000); KSPSolve(ksp,b,x); VecView(x,PETSC_VIEWER_STDOUT_WORLD); PetscFinalize(); return 0; } -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 16 07:34:00 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 16 Aug 2013 07:34:00 -0500 Subject: [petsc-users] I write the following program, it can converge with 4 processor, but can not converge with 2 , or 3 processor. Could you help me out In-Reply-To: <609ec1a3.22c90.140871d25a5.Coremail.ztdepyahoo@163.com> References: <609ec1a3.22c90.140871d25a5.Coremail.ztdepyahoo@163.com> Message-ID: On Fri, Aug 16, 2013 at 7:32 AM, ??? wrote: > thank you very much! could you please suggest me a robust preconditioner > which is independent of the number of processor. 
> For generic problems, they do not exist. I suggest looking in the literature for your specific problem, and then trying to construct the best few choices in PETSc. Matt > > ? 2013-08-16 19:54:01?"Matthew Knepley" ??? > > On Fri, Aug 16, 2013 at 6:13 AM, ??? wrote: > > Its common for iterative solvers to converge differently for different PC. > You are using block Jacobi-ILU > which is different for different numbers of processes. > > Matt > > /* Program usage: mpiexec ex1 [-help] [all PETSc options] */ >> >> static char help[] = "Basic vector routines.\n\n"; >> >> >> #include "petscksp.h" >> #include "petscvec.h" >> >> >> >> int main(int argc,char **argv) >> { >> int N=16; >> >> int MyRank; >> >> Mat A; >> PC Pc; >> KSP ksp; >> Vec b,x; >> >> PetscInitialize(&argc,&argv,(char*)0,help); >> MPI_Comm_rank(MPI_COMM_WORLD,&MyRank); >> >> MatCreate(PETSC_COMM_WORLD,&A); >> MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N); >> MatSetFromOptions(A); >> MatMPIAIJSetPreallocation(A,5,PETSC_NULL,5,PETSC_NULL); >> >> VecCreate(PETSC_COMM_WORLD,&b); >> VecSetSizes(b,PETSC_DECIDE,N); >> VecSetFromOptions(b); >> VecDuplicate(b,&x); >> &n bsp; >> int row=0; >> int* col3; >> int* col4; >> int* col5; >> PetscMalloc(3*sizeof(PetscInt),&col3); >> PetscMalloc(4*sizeof(PetscInt),&col4); >> PetscMalloc(5*sizeof(PetscInt),&col5); >> >> col3[0]=0; col3[1]=1; col3[2]=4; >> double >> value3[3]={2.2916666666666665e+00,-1.1458333333333333e+00,-1.1458333333333333e+00}; >> MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); >> >> row=1; >> col4[0]=0; col4[1]=1; col4[2]=2; col4[3]=5; >> double >> value4[4]={-1.1458333333333333e+00,3.8020833333333330e+00,-1.2500000000000000e+00 >> ,-1.4062500000000000e+00}; >> MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); >> >> >> row=2; >> col4[0]=1; col4[1]=2; col4[2]=3; col4[3]=6; >> value4[0]=-1.2500000000000000e+00; >> value4[1]=3.8020833333333330e+00; >> value4[2]=-1.1458333333333333e+00 ; >> &n bsp;value4[3]=-1.4062500000000000e+00; >> MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); >> >> >> row=3; >> col3[0]=2; col3[1]=3; col3[2]=7; >> value3[0]=-1.1458333333333333e+00; >> value3[1]=2.2916666666666665e+00; >> value3[2]=-1.1458333333333333e+00; >> MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); >> >> >> row=4; >> col4[0]=0; col4[1]=4; col4[2]=5; col4[3]=8; >> value4[0]=-1.1458333333333333e+00; >> value4[1]=3.8020833333333330e+00; >> value4[2]=-1.4062500000000000e+00; >> value4[3]=-1.2500000000000000e+00; >> MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); >> >> row=5; >> col5[0]=1; col5[1]=4; col5[2]=5; col5[3]=6; col5[4]=9; >> double value5[5]; >> value5[0]=-1.4062500000000000e+00; >> value5[1]=-1.4062500000000000e+00; >> value5[2]=5 .9375000000000000e+00; >> value5[3]=-1.5625000000000000e+00 ; >> value5[4]=-1.5625000000000000e+00 ; >> MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); >> >> >> >> row=6; >> col5[0]=2; col5[1]=5; col5[2]=6; col5[3]=7; col5[4]=10; >> value5[0]=-1.4062500000000000e+00; >> value5[1]=-1.5625000000000000e+00; >> value5[2]=5.9375000000000000e+00; >> value5[3]=-1.4062500000000000e+00 ; >> value5[4]=-1.5625000000000000e+00 ; >> MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); >> >> row=7; >> col4[0]=3; col4[1]=6; col4[2]=7; col4[3]=11; >> value4[0]=-1.1458333333333333e+00; >> value4[1]=-1.4062500000000000e+00; >> value4[2]=3.8020833333333330e+00; >> value4[3]=-1.2500000000000000e+00; >> MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); >> >> row=8; >> col4[0]=4; col4[1]=8; col4[2]=9; col4[3]=12; >> value4[0]=-1.2500000000000000e+00; >> 
value4[1]=3.8020833333333330e+00; >> value4[2]=-1.4062500000000000e+00; >> value4[3]=-1.1458333333333333e+00; >> MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); >> >> >> row=9; >> col5[0]=5; col5[1]=8; col5[2]=9; col5[3]=10; col5[4]=13; >> value5[0]=-1.5625000000000000e+00; >> value5[1]=-1.4062500000000000e+00; >> value5[2]=5.9375000000000000e+00; >> value5[3]=-1.5625000000000000e+00 ; >> value5[4]= -1.4062500000000000e+00; >> MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); >> >> row=10; >> col5[0]=6; col5[1]=9; col5[2]=10; col5[3]=11; col5[4]=14; >> value5[0]=-1.5625000000000000e+00; >> value5[1]=-1.5625000000000000e+00; >> value5[2]=5.93750000000000 00e+00; >> value5[3]=-1.4062500000000000e+00 ; >> value5[4]= -1.4062500000000000e+00; >> MatSetValues(A,1,&row,5,col5,value5,INSERT_VALUES); >> >> >> row=11; >> col4[0]=7; col4[1]=10; col4[2]=11; col4[3]=15; >> value4[0]=-1.2500000000000000e+00; >> value4[1]=-1.4062500000000000e+00; >> value4[2]=3.8020833333333330e+00; >> value4[3]=-1.1458333333333333e+00; >> MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); >> >> >> >> >> row=12; >> col3[0]=8; col3[1]=12; col3[2]=13; >> value3[0]=-1.1458333333333333e+00; >> value3[1]=2.2916666666666665e+00; >> value3[2]=-1.1458333333333333e+00; >> MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); >> >> >> row=13; >> col4[0]=9; col4[1]=12; col4[2]=13; col4[3]= 14; >> value4[0]=-1.4062500000000000e+00; >> value4[1]=-1.1458333333333333e+00; >> value4[2]=3.8020833333333330e+00; >> value4[3]=-1.2500000000000000e+00; >> MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); >> >> >> row=14; >> col4[0]=10; col4[1]=13; col4[2]=14; col4[3]=15; >> value4[0]=-1.4062500000000000e+00; >> value4[1]=-1.2500000000000000e+00; >> value4[2]=3.8020833333333330e+00; >> value4[3]=-1.1458333333333333e+00; >> MatSetValues(A,1,&row,4,col4,value4,INSERT_VALUES); >> >> >> >> row=15; >> col3[0]=11; col3[1]=14; col3[2]=15; >> value3[0]=-1.1458333333333333e+00; >> value3[1]=-1.1458333333333333e+00; >> value3[2]=2.2916666666666665e+00; >> MatSetValues(A,1,&row,3,col3,value3,INSERT_VALUES); >> >> >> &n bsp; >> MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); >> MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); >> >> // MatView(A,PETSC_VIEWER_STDOUT_WORLD); >> >> >> >> double val[16]={-7.6233779415782715e-04, >> -3.0457072596705860e-04, >> 3.0457072596705860e-04, >> 7.6233779415782715e-04, >> -4.4764543813290442e-03, >> -1.6196451741044846e-03, >> 1.6196451741044846e-03, >> 4.4764543813290442e-03, >> -1.9333475373837013e-02, >> -5.4815619458573189e-03, >> 5.4815619458573189e-03, >> 1.9333475373837013e-02, >> -8.4153777598326651e-02, >> -1.2883385353962010e-02, >> 1.2883385353962010e-02, >> 8.4153777598326651e-02}; >> >> int* col16; >> PetscMalloc(16*sizeof(PetscInt),&col16); >> for(int i=0;i<16;i++) >> col16[i]=i; >> >> VecSetValues(b,16,col16,val,INSERT_VALUES); >> VecAssemblyBegin(b); >> VecAssemblyEnd(b); >> // >> &nbs p; VecView(b,PETSC_VIEWER_STDOUT_WORLD); >> >> >> KSPCreate(PETSC_COMM_WORLD,&ksp); >> KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN); >> KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER); >> KSPSetInitialGuessNonzero(ksp,PETSC_FALSE); >> >> KSPSetType(ksp,KSPBCGS); >> // KSPSetType(ksp,KSPLSQR); >> // KSPSetType(ksp,KSPFGMRES); >> // KSPSetType(ksp,KSPDGMRES); >> //KSPSetType(ksp,KSPTCQMR); >> // KSPSetType(ksp,KSPPREONLY); >> >> //KSPGetPC(ksp,&Pc); >> // PCSetType(Pc,PCJACOBI); >> KSPSetFromOptions(ksp); >> KSPSetTolerances(ksp,1.e-20,1.e-20,PETSC_DEFAULT,1000); >> KSPSolve(ksp,b,x); >> >> >> VecView(x,PETSC_VIEWER_STDOUT_WORLD); >> >> 
>> PetscFinalize(); >> return 0; >> } >> >> >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri Aug 16 08:18:39 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 16 Aug 2013 08:18:39 -0500 Subject: [petsc-users] Should I synchronize processes explicitly, for instance using MPI_Barrier? In-Reply-To: References: Message-ID: <8738q9bwpc.fsf@mcs.anl.gov> Matthew Knepley writes: > Never put in a barrier. Barriers serve no functional or correctness purpose in a properly-written pure-MPI code. They are sometimes useful for debugging and sometimes necessary if you use despicable hacks like communicating through a side channel such as the file system. If you're not putting yourself into that unenviable position, don't use barriers. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From jedbrown at mcs.anl.gov Fri Aug 16 08:21:18 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 16 Aug 2013 08:21:18 -0500 Subject: [petsc-users] what is the meaning of matrix file output by matview. In-Reply-To: <5af9e772.20d81.140868855e9.Coremail.ztdepyahoo@163.com> References: <5af9e772.20d81.140868855e9.Coremail.ztdepyahoo@163.com> Message-ID: <87zjshai0h.fsf@mcs.anl.gov> It looks like you changed the output format. ??? writes: > Matrix Object: 1 MPI processes > type: mpiaij > 16 40 Matrix dimensions. > 1 4 7 10 12 15 > 18 21 23 26 29 32 > 34 36 38 40 41 CSR format: offsets of row starts are indices into the arrays below, containing column indices and values of nonzero entries. > 1 2 5 > 2 3 6 > 3 4 7 > 4 8 > 5 6 9 > 6 7 10 > 7 8 11 > 8 12 > 9 10 13 > 10 11 14 > 11 12 15 > 12 16 > 13 14 > 14 15 > 15 16 > 16 > > 2.2916666666666665e+00 -1.1458333333333333e+00 -1.1458333333333333e+00 > 3.8020833333333330e+00 -1.2500000000000000e+00 -1.4062500000000000e+00 > 3.8020833333333330e+00 -1.1458333333333333e+00 -1.4062500000000000e+00 > 2.2916666666666665e+00 -1.1458333333333333e+00 > 3.8020833333333330e+00 -1.4062500000000000e+00 -1.2500000000000000e+00 > 5.9375000000000000e+00 -1.5625000000000000e+00 -1.5625000000000000e+00 > 5.9375000000000000e+00 -1.4062500000000000e+00 -1.5625000000000000e+00 > 3.8020833333333330e+00 -1.2500000000000000e+00 > 3.8020833333333330e+00 -1.4062500000000000e+00 -1.1458333333333333e+00 > 5.9375000000000000e+00 -1.5625000000000000e+00 -1.4062500000000000e+00 > 5.9375000000000000e+00 -1.4062500000000000e+00 -1.4062500000000000e+00 > 3.8020833333333330e+00 -1.1458333333333333e+00 > 2.2916666666666665e+00 -1.1458333333333333e+00 > 3.8020833333333330e+00 -1.2500000000000000e+00 > 3.8020833333333330e+00 -1.1458333333333333e+00 > 2.2916666666666665e+00 -------------- next part -------------- A non-text attachment was scrubbed... 
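To make that decoding concrete (a worked read of the numbers above): the header 16 40 gives the matrix dimension and the number of stored entries. The 17 offsets are 1-based row starts, so row 1 occupies positions 1 through 3 of the column and value arrays, i.e. columns 1 2 5 with values 2.2917, -1.1458, -1.1458 -- exactly row 1 of the matrix assembled in the program earlier in this thread. Only 40 of that program's 64 assembled entries appear here, and (64 + 16)/2 = 40, so the dump apparently contains just the diagonal and upper triangle of this structurally symmetric matrix.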
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From mcgrory at aerosoftinc.com Fri Aug 16 10:09:24 2013 From: mcgrory at aerosoftinc.com (Bill McGrory) Date: Fri, 16 Aug 2013 11:09:24 -0400 Subject: [petsc-users] Narrowing down "Matrix is missing diagonal entry" errors Message-ID: <520E40A4.9080503@aerosoftinc.com> More information pertaining to the referenced post. I am using default parameters for my KSP solver, so my routines are pretty simple I don't pass any options in through the command line, so this is all default stuff here. I create, and fill A and b, and then call MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); VecAssemblyBegin(b); VecAssemblyEnd(b); VecSet(x,0.); // Solve A x = b KSPCreate(PETSC_COMM_WORLD,&ksp); KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN); KSPSetFromOptions(ksp); KSPSolve(ksp,b,x); The missing diagonals, I see are in the preconditioner, not the original Matrix, so I thought I would check my original. When I make a call to MatMissingDiagonal, after assembling my matrix, I get the error telling me that MatMissingDiagonal is not supported for a mpibaij matrix. Do I have any alternative for querying my assembled matrix? Thanks again Bill -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 5396 bytes Desc: S/MIME Cryptographic Signature URL: From knepley at gmail.com Fri Aug 16 10:34:42 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 16 Aug 2013 10:34:42 -0500 Subject: [petsc-users] Narrowing down "Matrix is missing diagonal entry" errors In-Reply-To: <520E40A4.9080503@aerosoftinc.com> References: <520E40A4.9080503@aerosoftinc.com> Message-ID: On Fri, Aug 16, 2013 at 10:09 AM, Bill McGrory wrote: > More information pertaining to the referenced post. > > I am using default parameters for my KSP solver, so my routines are pretty > simple I don't pass any options in through the command line, so this is all > default stuff here. > > I create, and fill A and b, and then call > MatAssemblyBegin(A,MAT_FINAL_**ASSEMBLY); > > MatAssemblyEnd(A,MAT_FINAL_**ASSEMBLY); > > VecAssemblyBegin(b); > VecAssemblyEnd(b); > > VecSet(x,0.); > > // Solve A x = b > > KSPCreate(PETSC_COMM_WORLD,&**ksp); > > KSPSetOperators(ksp,A,A,**DIFFERENT_NONZERO_PATTERN); > > KSPSetFromOptions(ksp); > > KSPSolve(ksp,b,x); > > The missing diagonals, I see are in the preconditioner, not the original > Matrix, so I thought I would check my original. > >From the code above, this makes no sense. You do not have a separate preconditioner matrix. When I make a call to MatMissingDiagonal, after assembling my matrix, I get > the error telling me that MatMissingDiagonal is not supported for a mpibaij > matrix. > > Do I have any alternative for querying my assembled matrix? > You can always use MatGetValues(). Matt > Thanks again > Bill > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
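For reference, a minimal sketch of such a diagonal check with MatGetValues(), run after MatAssemblyEnd() (A and ierr as in the fragment above; note MatGetValues() returns 0.0 both for an absent location and for an explicitly stored zero, so this flags zero diagonals rather than distinguishing the two cases):

    PetscInt    rstart, rend, i;
    PetscScalar d;
    ierr = MatGetOwnershipRange(A, &rstart, &rend); CHKERRQ(ierr);
    for (i = rstart; i < rend; i++) {
      /* each process may only query rows it owns */
      ierr = MatGetValues(A, 1, &i, 1, &i, &d); CHKERRQ(ierr);
      if (d == 0.0) {
        ierr = PetscPrintf(PETSC_COMM_SELF, "[diag check] row %D has zero or absent diagonal\n", i); CHKERRQ(ierr);
      }
    }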
URL: From mcgrory at aerosoftinc.com Fri Aug 16 13:07:08 2013 From: mcgrory at aerosoftinc.com (Bill McGrory) Date: Fri, 16 Aug 2013 14:07:08 -0400 Subject: [petsc-users] Narrowing down "Matrix is missing diagonal entry" errors In-Reply-To: References: <520E40A4.9080503@aerosoftinc.com> Message-ID: <520E6A4C.1090005@aerosoftinc.com> Thanks for the reply Matthew Your suggestion to use MatGetValues helped me to figure out my problem. The legacy part of my software decided I didn't have enough memory to solve the problem in one shot, so it split it into two pieces, unbeknownst to me. So my A matrix was being partially loaded (hence the unfilled diagonals) On 08/16/2013 11:34 AM, Matthew Knepley wrote: > On Fri, Aug 16, 2013 at 10:09 AM, Bill McGrory > > wrote: > > More information pertaining to the referenced post. > > I am using default parameters for my KSP solver, so my routines > are pretty simple I don't pass any options in through the command > line, so this is all default stuff here. > > I create, and fill A and b, and then call > MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY); > > MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY); > > VecAssemblyBegin(b); > VecAssemblyEnd(b); > > VecSet(x,0.); > > // Solve A x = b > > KSPCreate(PETSC_COMM_WORLD,&ksp); > > KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN); > > KSPSetFromOptions(ksp); > > KSPSolve(ksp,b,x); > > The missing diagonals, I see are in the preconditioner, not the > original Matrix, so I thought I would check my original. > > > From the code above, this makes no sense. You do not have a separate > preconditioner matrix. > > When I make a call to MatMissingDiagonal, after assembling my > matrix, I get the error telling me that MatMissingDiagonal is not > supported for a mpibaij matrix. > > Do I have any alternative for querying my assembled matrix? > > > You can always use MatGetValues(). > > Matt > > Thanks again > Bill > > > > > Very true statement in my case below. Ah, but to know what you are taking for granted. > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 5396 bytes Desc: S/MIME Cryptographic Signature URL: From Shuangshuang.Jin at pnnl.gov Fri Aug 16 16:01:50 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Fri, 16 Aug 2013 14:01:50 -0700 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <87fvuabcay.fsf@mcs.anl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> Hello, Jed, in my IJacobian subroutine, I defined a PetscScalar J[4*n][4*n], and filled in the values for this J matrix by MatSetValues(). 245 seconds out of the total 351 seconds in the DAE TS solving part are due to this J matrix computation. For that J matrix, half of them are constants values which doesn't change in each iteration. However, since my J is created inside each IJacobian() call, I couldn't reuse it. 
If that part of work belongs to redundant computation, I would like to know if there's a way to set up the Jacobian matrix outside of the IJacobian() subroutine, so that I can keep the constant part of values in J for all the iterations but only updates the changing values which depends on X? Thanks, Shuangshuang -----Original Message----- From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown Sent: Thursday, August 15, 2013 7:27 PM To: Jin, Shuangshuang Cc: petsc-users at mcs.anl.gov Subject: RE: [petsc-users] Performance of PETSc TS solver "Jin, Shuangshuang" writes: > Hi, Jed, > > I followed your suggestion and profiled the IJacobian stage, please see the related profile below: Cool, all of these are pretty inexpensive, so your time is probably in compu From bsmith at mcs.anl.gov Fri Aug 16 16:07:56 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 16 Aug 2013 16:07:56 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> Message-ID: <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> On Aug 16, 2013, at 4:01 PM, "Jin, Shuangshuang" wrote: > Hello, Jed, in my IJacobian subroutine, I defined a PetscScalar J[4*n][4*n], and filled in the values for this J matrix by MatSetValues(). What is n? It should not be taking anywhere this much time. How sparse is the matrix? Do you preallocate the nonzero structure? Do you reuse the same matrix for each time step? > > 245 seconds out of the total 351 seconds in the DAE TS solving part are due to this J matrix computation. > > For that J matrix, half of them are constants values which doesn't change in each iteration. However, since my J is created inside each IJacobian() call, I couldn't reuse it. If that part of work belongs to redundant computation, I would like to know if there's a way to set up the Jacobian matrix outside of the IJacobian() subroutine, so that I can keep the constant part of values in J for all the iterations but only updates the changing values which depends on X? MatStoreValues() and MatRetrieveValues() but you can only call this after you have assembled the matrix with the correct nonzero structure. So you need to put the constants values in, put zeros in all the locations with non constant values (that are not permeant zeros), call MatAssemblyBegin/End() then call MatStoreValues() then for each computation of the Jacobian you first call MatRetrieveValues() and then put in the non constant values. 
>
> Thanks,
> Shuangshuang
>
> -----Original Message-----
> From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown
> Sent: Thursday, August 15, 2013 7:27 PM
> To: Jin, Shuangshuang
> Cc: petsc-users at mcs.anl.gov
> Subject: RE: [petsc-users] Performance of PETSc TS solver
>
> "Jin, Shuangshuang" writes:
>
>> Hi, Jed,
>>
>> I followed your suggestion and profiled the IJacobian stage, please see the related profile below:
>
> Cool, all of these are pretty inexpensive, so your time is probably in compu

From abhyshr at mcs.anl.gov Fri Aug 16 16:29:33 2013
From: abhyshr at mcs.anl.gov (Shri)
Date: Fri, 16 Aug 2013 16:29:33 -0500
Subject: [petsc-users] Performance of PETSc TS solver
In-Reply-To: <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov>
Message-ID:

Is it possible for you to share the IJacobian code? That would help us to understand the issue better and faster.

Thanks,
Shri

On Aug 16, 2013, at 4:07 PM, Barry Smith wrote:

>
> On Aug 16, 2013, at 4:01 PM, "Jin, Shuangshuang" wrote:
>
>> Hello, Jed, in my IJacobian subroutine, I defined a PetscScalar J[4*n][4*n], and filled in the values for this J matrix by MatSetValues().
>
> What is n?
>
> It should not be taking anywhere near this much time. How sparse is the matrix? Do you preallocate the nonzero structure? Do you reuse the same matrix for each time step?
>>
>> 245 seconds out of the total 351 seconds in the DAE TS solving part are due to this J matrix computation.
>>
>> For that J matrix, half of the entries are constant values which don't change in each iteration. However, since my J is created inside each IJacobian() call, I couldn't reuse it. If that part of the work is redundant computation, I would like to know if there's a way to set up the Jacobian matrix outside of the IJacobian() subroutine, so that I can keep the constant part of the values in J for all the iterations and only update the changing values which depend on X?
>
> MatStoreValues() and MatRetrieveValues(), but you can only call these after you have assembled the matrix with the correct nonzero structure. So you need to put the constant values in, put zeros in all the locations with non-constant values (that are not permanent zeros), call MatAssemblyBegin/End(), then call MatStoreValues(); then, for each computation of the Jacobian, you first call MatRetrieveValues() and then put in the non-constant values.
Then call MatAssemblyBegin/End() > > Barry > >> >> Thanks, >> Shuangshuang >> >> -----Original Message----- >> From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown >> Sent: Thursday, August 15, 2013 7:27 PM >> To: Jin, Shuangshuang >> Cc: petsc-users at mcs.anl.gov >> Subject: RE: [petsc-users] Performance of PETSc TS solver >> >> "Jin, Shuangshuang" writes: >> >>> Hi, Jed, >>> >>> I followed your suggestion and profiled the IJacobian stage, please see the related profile below: >> >> Cool, all of these are pretty inexpensive, so your time is probably in compu > From Shuangshuang.Jin at pnnl.gov Fri Aug 16 17:14:09 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Fri, 16 Aug 2013 15:14:09 -0700 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> Hello, Barry and Shri, thanks for your helps. To answer your question, n is 288. Usually there're greater than 1/8 nonzeros in the matrix, so it's pretty dense. We didn't preallocate the J[4*n][4*n] matrix, it was created in each IJacobian() call. I would like to create it in the main function, and pass it to the IJacobian() function to reuse it for each time step. I guess in that way it can save much time and memory usage? I have this piece of code in the main function: ierr = MatCreate(PETSC_COMM_WORLD, &J); CHKERRQ(ierr); // J: Jacobian matrix ierr = MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 4*n, 4*n); CHKERRQ(ierr); ierr = MatSetFromOptions(J); CHKERRQ(ierr); ierr = MatSetUp(J); CHKERRQ(ierr); ierr = TSSetIJacobian(ts, J, J, (TSIJacobian) IJacobian, &user); CHKERRQ(ierr); Shall I add the constants values for the J matrix here somewhere to use what you mentioned " MatStoreValues() and MatRetrieveValues()"? Sorry I cannot post the Jacobian matrix equations here for nondisclosure policy. But I can paste the framework of our IJacobian function with the equations skipped for trouble shooting: PetscErrorCode Simulation::IJacobian(TS ts, PetscReal t, Vec X, Vec Xdot, PetscReal a, Mat *A, Mat *B, MatStructure *flag, Userctx *ctx) { PetscLogStage stage = ctx->stage; PetscLogStagePush(stage); PetscErrorCode ierr; PetscInt n = ctx->n; PetscScalar *x; PetscScalar J[4*n][4*n]; PetscInt rowcol[4*n]; PetscScalar val[n]; PetscInt i, j; DM da; PetscInt xstart, xlen; int me; double t0, t1; PetscFunctionBeginUser; ierr = TSGetDM(ts, &da); CHKERRQ(ierr); // Get pointers to vector data MPI_Comm_rank(PETSC_COMM_WORLD, &me); scatterMyVec(X, &x); CHKERRQ(ierr); // Get local grid boundaries ierr = DMDAGetCorners(da, &xstart, NULL, NULL, &xlen, NULL, NULL); CHKERRQ(ierr); //////////////////////////////////////////////////////////////////////////////////////// // This proves to be the most time-consuming block in the computation: // Assign values to J matrix for the first 2*n rows (constant values) ... 
(skipped) // Assign values to J matrix for the following 2*n rows (depends on X values) for (i = 0; i < n; i++) { for (j = 0; j < n; j++) { ...(skipped) } //////////////////////////////////////////////////////////////////////////////////////// for (i = 0; i < 4*n; i++) { rowcol[i] = i; } // Compute function over the locally owned part of the grid for (i = xstart; i < xstart+xlen; i++) { ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], INSERT_VALUES); CHKERRQ(ierr); } ierr = DMDAVecRestoreArray(da, X, &x); CHKERRQ(ierr); ierr = MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); ierr = MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); if (*A != *B) { ierr = MatAssemblyBegin(*B, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); ierr = MatAssemblyEnd(*B, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); } *flag = SAME_NONZERO_PATTERN; PetscLogStagePop(); PetscFunctionReturn(0); } Hope this code can show you a better picture of our problem here. Thanks, Shuangshuang -----Original Message----- From: Barry Smith [mailto:bsmith at mcs.anl.gov] Sent: Friday, August 16, 2013 2:08 PM To: Jin, Shuangshuang Cc: Jed Brown; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Performance of PETSc TS solver On Aug 16, 2013, at 4:01 PM, "Jin, Shuangshuang" wrote: > Hello, Jed, in my IJacobian subroutine, I defined a PetscScalar J[4*n][4*n], and filled in the values for this J matrix by MatSetValues(). What is n? It should not be taking anywhere this much time. How sparse is the matrix? Do you preallocate the nonzero structure? Do you reuse the same matrix for each time step? > > 245 seconds out of the total 351 seconds in the DAE TS solving part are due to this J matrix computation. > > For that J matrix, half of them are constants values which doesn't change in each iteration. However, since my J is created inside each IJacobian() call, I couldn't reuse it. If that part of work belongs to redundant computation, I would like to know if there's a way to set up the Jacobian matrix outside of the IJacobian() subroutine, so that I can keep the constant part of values in J for all the iterations but only updates the changing values which depends on X? MatStoreValues() and MatRetrieveValues() but you can only call this after you have assembled the matrix with the correct nonzero structure. So you need to put the constants values in, put zeros in all the locations with non constant values (that are not permeant zeros), call MatAssemblyBegin/End() then call MatStoreValues() then for each computation of the Jacobian you first call MatRetrieveValues() and then put in the non constant values. Then call MatAssemblyBegin/End() Barry > > Thanks, > Shuangshuang > > -----Original Message----- > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > Sent: Thursday, August 15, 2013 7:27 PM > To: Jin, Shuangshuang > Cc: petsc-users at mcs.anl.gov > Subject: RE: [petsc-users] Performance of PETSc TS solver > > "Jin, Shuangshuang" writes: > >> Hi, Jed, >> >> I followed your suggestion and profiled the IJacobian stage, please see the related profile below: > > Cool, all of these are pretty inexpensive, so your time is probably in compu From fangxingjun0319 at gmail.com Fri Aug 16 18:37:05 2013 From: fangxingjun0319 at gmail.com (Frank) Date: Fri, 16 Aug 2013 18:37:05 -0500 Subject: [petsc-users] FORTRAN 90 with PETSc Message-ID: <520EB7A1.5060505@gmail.com> Hi, I am using PETSc to iterate a problem, that is to say I call KSPSolve repeatedly. 
Firstly, I write all the PETSc components in one subroutine, including "MatCreate", "VecCreateMPI", etc. Everything works fine. Then, I want to only initialize ksp once outside the loop, and the matrix and rhs is changed within the loop repeatedly. Here are my problem: 1. I tried to use COMMON to transfer the following variables. I include "petsc.var" in the solver subroutine. It cannot be compiled. "petsc.var" Vec x,b Mat A KSP ksp PC pc COMMON /MYPETSC/x, b, A,ksp,pc 2. I defined the following in the main program: PROGRAM MAIN #include #include #include #include #include Vec x,b Mat A KSP ksp PC pc ...... CALL INIT_PETSC(ksp,pc,A,x,b) ...... CALL LOOP(ksp,pc,A,x,b) END PROGRAM !--------------------------------------------------- SUBROUTINE LOOP(ksp,pc,A,x,b) Vec x,b Mat A KSP ksp PC pc ...... CALL SOLVE(ksp,pc,A,x,b) ....... END SUBROUTINE !--------------------------------------------------- SUBROUTINE SOLVE(ksp,pc,A,x,b) Vec x,b Mat A KSP ksp PC pc ...... CALL (ksp, b,x,ierr) END SUBROUTINE It can be compiled, but ksp does not iterate. Could you please explain to me the reason and solution for this problem. Thank you very much. From bsmith at mcs.anl.gov Fri Aug 16 18:38:15 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 16 Aug 2013 18:38:15 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> Message-ID: What percentage of the matrix entries are non-zero? If it more than, say 20 -30 percent nonzero then you should just use a dense matrix format. But then I see you are using a DMDA which implies it comes from some kind of mesh? Is there coupling more than nearest neighbor in the matrix. Are there 4 by 4 blocks in the matrix? You could use BAIJ matrices with a block size of 4. Barry On Aug 16, 2013, at 5:14 PM, "Jin, Shuangshuang" wrote: > Hello, Barry and Shri, thanks for your helps. > > To answer your question, n is 288. Usually there're greater than 1/8 nonzeros in the matrix, so it's pretty dense. > > We didn't preallocate the J[4*n][4*n] matrix, it was created in each IJacobian() call. I would like to create it in the main function, and pass it to the IJacobian() function to reuse it for each time step. I guess in that way it can save much time and memory usage? > > I have this piece of code in the main function: > > ierr = MatCreate(PETSC_COMM_WORLD, &J); CHKERRQ(ierr); // J: Jacobian matrix > ierr = MatSetSizes(J, PETSC_DECIDE, PETSC_DECIDE, 4*n, 4*n); CHKERRQ(ierr); > ierr = MatSetFromOptions(J); CHKERRQ(ierr); > ierr = MatSetUp(J); CHKERRQ(ierr); > > ierr = TSSetIJacobian(ts, J, J, (TSIJacobian) IJacobian, &user); CHKERRQ(ierr); > > Shall I add the constants values for the J matrix here somewhere to use what you mentioned " MatStoreValues() and MatRetrieveValues()"? > > Sorry I cannot post the Jacobian matrix equations here for nondisclosure policy. 
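(A sketch of the kind of preallocated creation being suggested here -- the 4 is the block size, and the per-block-row block counts 5 and 2 are placeholders that should be derived from the actual coupling pattern, not measured values:)

    Mat J;
    ierr = MatCreateBAIJ(PETSC_COMM_WORLD, 4,
                         PETSC_DECIDE, PETSC_DECIDE, 4*n, 4*n,
                         5, PETSC_NULL,   /* nonzero 4x4 blocks per block row, local part */
                         2, PETSC_NULL,   /* nonzero 4x4 blocks per block row, off-process part */
                         &J); CHKERRQ(ierr);

With a correct preallocation, MatSetValues() no longer reallocates storage during assembly, which is usually the dominant cost when no preallocation is given.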
But I can paste the framework of our IJacobian function with the equations skipped for trouble shooting: > > PetscErrorCode Simulation::IJacobian(TS ts, PetscReal t, Vec X, Vec Xdot, PetscReal a, Mat *A, Mat *B, MatStructure *flag, Userctx *ctx) > { > PetscLogStage stage = ctx->stage; > PetscLogStagePush(stage); > > PetscErrorCode ierr; > PetscInt n = ctx->n; > PetscScalar *x; > PetscScalar J[4*n][4*n]; > > PetscInt rowcol[4*n]; > PetscScalar val[n]; > > PetscInt i, j; > > DM da; > PetscInt xstart, xlen; > > int me; > double t0, t1; > > PetscFunctionBeginUser; > > ierr = TSGetDM(ts, &da); CHKERRQ(ierr); > > // Get pointers to vector data > MPI_Comm_rank(PETSC_COMM_WORLD, &me); > scatterMyVec(X, &x); CHKERRQ(ierr); > > // Get local grid boundaries > ierr = DMDAGetCorners(da, &xstart, NULL, NULL, &xlen, NULL, NULL); CHKERRQ(ierr); > > //////////////////////////////////////////////////////////////////////////////////////// > // This proves to be the most time-consuming block in the computation: > // Assign values to J matrix for the first 2*n rows (constant values) > ... (skipped) > > // Assign values to J matrix for the following 2*n rows (depends on X values) > for (i = 0; i < n; i++) { > for (j = 0; j < n; j++) { > ...(skipped) > } > //////////////////////////////////////////////////////////////////////////////////////// > > for (i = 0; i < 4*n; i++) { > rowcol[i] = i; > } > > // Compute function over the locally owned part of the grid > for (i = xstart; i < xstart+xlen; i++) { > ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], INSERT_VALUES); CHKERRQ(ierr); > } > > ierr = DMDAVecRestoreArray(da, X, &x); CHKERRQ(ierr); > > ierr = MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > ierr = MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > if (*A != *B) { > ierr = MatAssemblyBegin(*B, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > ierr = MatAssemblyEnd(*B, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > } > *flag = SAME_NONZERO_PATTERN; > > PetscLogStagePop(); > PetscFunctionReturn(0); > } > > Hope this code can show you a better picture of our problem here. > > Thanks, > Shuangshuang > > > -----Original Message----- > From: Barry Smith [mailto:bsmith at mcs.anl.gov] > Sent: Friday, August 16, 2013 2:08 PM > To: Jin, Shuangshuang > Cc: Jed Brown; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Performance of PETSc TS solver > > > On Aug 16, 2013, at 4:01 PM, "Jin, Shuangshuang" wrote: > >> Hello, Jed, in my IJacobian subroutine, I defined a PetscScalar J[4*n][4*n], and filled in the values for this J matrix by MatSetValues(). > > What is n? > > It should not be taking anywhere this much time. How sparse is the matrix? Do you preallocate the nonzero structure? Do you reuse the same matrix for each time step? >> >> 245 seconds out of the total 351 seconds in the DAE TS solving part are due to this J matrix computation. >> >> For that J matrix, half of them are constants values which doesn't change in each iteration. However, since my J is created inside each IJacobian() call, I couldn't reuse it. If that part of work belongs to redundant computation, I would like to know if there's a way to set up the Jacobian matrix outside of the IJacobian() subroutine, so that I can keep the constant part of values in J for all the iterations but only updates the changing values which depends on X? > > MatStoreValues() and MatRetrieveValues() but you can only call this after you have assembled the matrix with the correct nonzero structure. 
So you need to put the constants values in, put zeros in all the locations with non constant values (that are not permeant zeros), call MatAssemblyBegin/End() then call MatStoreValues() then for each computation of the Jacobian you first call MatRetrieveValues() and then put in the non constant values. Then call MatAssemblyBegin/End() > > Barry > >> >> Thanks, >> Shuangshuang >> >> -----Original Message----- >> From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown >> Sent: Thursday, August 15, 2013 7:27 PM >> To: Jin, Shuangshuang >> Cc: petsc-users at mcs.anl.gov >> Subject: RE: [petsc-users] Performance of PETSc TS solver >> >> "Jin, Shuangshuang" writes: >> >>> Hi, Jed, >>> >>> I followed your suggestion and profiled the IJacobian stage, please see the related profile below: >> >> Cool, all of these are pretty inexpensive, so your time is probably in compu > From jedbrown at mcs.anl.gov Fri Aug 16 19:00:23 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 16 Aug 2013 19:00:23 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> Message-ID: <87d2pd89uw.fsf@mcs.anl.gov> "Jin, Shuangshuang" writes: > //////////////////////////////////////////////////////////////////////////////////////// > // This proves to be the most time-consuming block in the computation: > // Assign values to J matrix for the first 2*n rows (constant values) > ... (skipped) > > // Assign values to J matrix for the following 2*n rows (depends on X values) > for (i = 0; i < n; i++) { > for (j = 0; j < n; j++) { > ...(skipped) This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen? > } > //////////////////////////////////////////////////////////////////////////////////////// > > for (i = 0; i < 4*n; i++) { > rowcol[i] = i; > } > > // Compute function over the locally owned part of the grid > for (i = xstart; i < xstart+xlen; i++) { > ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], INSERT_VALUES); CHKERRQ(ierr); This is seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? You need to _distribute_ the work of computing the matrix entries if you want to see a speedup. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From knepley at gmail.com Fri Aug 16 21:17:02 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 16 Aug 2013 21:17:02 -0500 Subject: [petsc-users] FORTRAN 90 with PETSc In-Reply-To: <520EB7A1.5060505@gmail.com> References: <520EB7A1.5060505@gmail.com> Message-ID: On Fri, Aug 16, 2013 at 6:37 PM, Frank wrote: > Hi, > > I am using PETSc to iterate a problem, that is to say I call KSPSolve > repeatedly. > Firstly, I write all the PETSc components in one subroutine, including > "MatCreate", "VecCreateMPI", etc. Everything works fine. > Then, I want to only initialize ksp once outside the loop, and the matrix > and rhs is changed within the loop repeatedly. 
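(That reuse pattern is the intended one; a minimal sketch of such a loop, in C for brevity -- the Fortran calls mirror these one for one -- assuming the nonzero pattern of A does not change between iterations:)

    for (i = 0; i < nsteps; i++) {
      /* refill the entries of A and b here, then: */
      ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
      ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
      /* tell the KSP the values changed so the preconditioner is rebuilt */
      ierr = KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN); CHKERRQ(ierr);
      ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
    }

If KSPSetOperators() is not called again after the matrix values change, the preconditioner built for the first system may silently keep being applied.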
Here is my problem:
>
> 1. I tried to use COMMON to transfer the following variables. I include
> "petsc.var" in the solver subroutine. It cannot be compiled.
> "petsc.var"
> Vec x,b
> Mat A
> KSP ksp
> PC pc
> COMMON /MYPETSC/ x, b, A, ksp, pc
>
> 2. I defined the following in the main program:
> PROGRAM MAIN
> #include <finclude/petscsys.h>
> #include <finclude/petscvec.h>
> #include <finclude/petscmat.h>
> #include <finclude/petscksp.h>
> #include <finclude/petscpc.h>
> Vec x,b
> Mat A
> KSP ksp
> PC pc
> ......
> CALL INIT_PETSC(ksp,pc,A,x,b)
> ......
> CALL LOOP(ksp,pc,A,x,b)
>
> END PROGRAM
> !----------------------------------------------------
> SUBROUTINE LOOP(ksp,pc,A,x,b)
> Vec x,b
> Mat A
> KSP ksp
> PC pc
> ......
> CALL SOLVE(ksp,pc,A,x,b)
> .......
> END SUBROUTINE
> !----------------------------------------------------
> SUBROUTINE SOLVE(ksp,pc,A,x,b)
> Vec x,b
> Mat A
> KSP ksp
> PC pc
>
> ......
> CALL KSPSolve(ksp, b, x, ierr)
> END SUBROUTINE
>
> It can be compiled, but ksp does not iterate.
>
> Could you please explain to me the reason and solution for this problem.

I do not understand "does not iterate". Is there an error message?

Thanks,

Matt

> Thank you very much.

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From suyan0 at gmail.com Sat Aug 17 01:40:07 2013
From: suyan0 at gmail.com (Su Yan)
Date: Sat, 17 Aug 2013 01:40:07 -0500
Subject: [petsc-users] SNES Line Search Monitor
Message-ID:

Hi, I ran into something really weird when I tried to solve a nonlinear equation with Newton's method and line search. Specifically, I used SNESSetType(snesMbr, SNESNEWTONLS); and PetscOptionsSetValue("-snes_linesearch_type", "l2");

When I execute my program abc.bin with the following command:

./abc.bin -snes_monitor -snes_linesearch_monitor

I got the following output:

0 SNES Function norm 1.457697974866e+07
Line search: lambdas = [1, 0.5, 0], fnorms = [669102, 7.35102e+06, 1.4577e+07]
Line search terminated: lambda = 1.00553, fnorms = 652606
1 SNES Function norm 6.526060362905e+05
Line search: lambdas = [1, 0.5, 0], fnorms = [3406.6, 326873, 652606]
Line search terminated: lambda = 1.00171, fnorms = 2801.6
2 SNES Function norm 2.801596249480e+03
Line search: lambdas = [1, 0.5, 0], fnorms = [2.51242, 1401.2, 2801.6]
Line search terminated: lambda = 1.00029, fnorms = 2.09292
3 SNES Function norm 2.092918540169e+00
Line search: lambdas = [1, 0.5, 0], fnorms = [0.000123295, 1.04646, 2.09292]
Line search terminated: lambda = 1, fnorms = 0.000122588
4 SNES Function norm 1.225883678418e-04

Converged Reason: FNORM_RELATIVE

The nonlinear problem converged normally with a relative f_norm set as 1E-7.
The only difference was that whether I used "-snes_linesearch_monitor" or not. In my understanding, this runtime option only turns on the screen print. So why did it make such a big difference? Is there anything special with this option turned on? Hope someone could help me out. Thanks a lot. Regards, Su -------------- next part -------------- An HTML attachment was scrubbed... URL: From solvercorleone at gmail.com Sat Aug 17 06:49:52 2013 From: solvercorleone at gmail.com (Cong Li) Date: Sat, 17 Aug 2013 20:49:52 +0900 Subject: [petsc-users] Should I synchronize processes explicitly, for instance using MPI_Barrier? In-Reply-To: <8738q9bwpc.fsf@mcs.anl.gov> References: <8738q9bwpc.fsf@mcs.anl.gov> Message-ID: hi Matthew, Jed Thank you very much for the answer and advise. Cong On Fri, Aug 16, 2013 at 10:18 PM, Jed Brown wrote: > Matthew Knepley writes: > > Never put in a barrier. > > Barriers serve no functional or correctness purpose in a > properly-written pure-MPI code. They are sometimes useful for debugging > and sometimes necessary if you use despicable hacks like communicating > through a side channel such as the file system. If you're not putting > yourself into that unenviable position, don't use barriers. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From brune at mcs.anl.gov Sat Aug 17 08:49:19 2013 From: brune at mcs.anl.gov (Peter Brune) Date: Sat, 17 Aug 2013 08:49:19 -0500 Subject: [petsc-users] SNES Line Search Monitor In-Reply-To: References: Message-ID: On Sat, Aug 17, 2013 at 1:40 AM, Su Yan wrote: > Hi, I ran into something really weird when I tried to solve a nonlinear > equation with Newton method and line search. Specifically, I used > SNESSetType(snesMbr, SNESNEWTONLS); and > PetscOptionsSetValue("-snes_linesearch_type", "l2"); > > Which version of PETSc are you using? > When I execute my program abc.bin with the following command: > > ./abc.bin -snes_monitor -snes_linesearch_monitor > > I got the following output: > > 0 SNES Function norm 1.457697974866e+07 > Line search: lambdas = [1, 0.5, 0], fnorms = [669102, 7.35102e+06, > 1.4577e+07] > Line search terminated: lambda = 1.00553, fnorms = 652606 > 1 SNES Function norm 6.526060362905e+05 > Line search: lambdas = [1, 0.5, 0], fnorms = [3406.6, 326873, > 652606] > Line search terminated: lambda = 1.00171, fnorms = 2801.6 > 2 SNES Function norm 2.801596249480e+03 > Line search: lambdas = [1, 0.5, 0], fnorms = [2.51242, 1401.2, > 2801.6] > Line search terminated: lambda = 1.00029, fnorms = 2.09292 > 3 SNES Function norm 2.092918540169e+00 > Line search: lambdas = [1, 0.5, 0], fnorms = [0.000123295, 1.04646, > 2.09292] > Line search terminated: lambda = 1, fnorms = 0.000122588 > 4 SNES Function norm 1.225883678418e-04 > > Converged Reason: FNORM_RELATIVE > > The nonlinear problem converged normally with a relative f_norm set as > 1E-7. 
> > However, if I execute exactly the same program, but with a slightly > different runtime command: > > ./abc.bin -snes_monitor > > I got the following output: > > 0 SNES Function norm 1.457697974975e+07 > 1 SNES Function norm 6.526060348917e+05 > 2 SNES Function norm 2.801608208510e+03 > 3 SNES Function norm 2.450488738084e+03 > 4 SNES Function norm 3.269507987119e+02 > 5 SNES Function norm 3.016606325384e+02 > 6 SNES Function norm 2.463851989463e+02 > 7 SNES Function norm 1.546266418976e+02 > 8 SNES Function norm 1.492518400407e+02 > 9 SNES Function norm 1.477122410995e+02 > 10 SNES Function norm 1.503359418680e+02 > 11 SNES Function norm 1.504759910776e+02 > 12 SNES Function norm 1.417592634863e+02 > 13 SNES Function norm 3.047096130411e+05 > > Is this problem reproducible? Note that even your 0th norm is very slightly different, and the first three norms are quite similar. > and the solver diverged. > > The only difference was that whether I used "-snes_linesearch_monitor" or > not. > > In my understanding, this runtime option only turns on the screen print. > So why did it make such a big difference? Is there anything special with > this option turned on? Hope someone could help me out. > This is correct. It should not influence the solution at all, and all it does is enable printing. Also, the l2 line search takes significantly more work for well-behaved Newton convergence than bt, as bt automatically accepts the full step. l2 is mostly meant for cases where the step is automatically ill-scaled. - Peter > > Thanks a lot. > > Regards, > Su > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From juhaj at iki.fi Sat Aug 17 18:37:22 2013 From: juhaj at iki.fi (Juha =?ISO-8859-1?Q?J=E4ykk=E4?=) Date: Sun, 18 Aug 2013 01:37:22 +0200 Subject: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30 Message-ID: <2569883.TWAWmgZMUo@rigel> Hi list! I wonder when and why (to make writing faster, I guess) this (and the corresponding line in src/vec/vec/impls/mpi/pdvec.c) was added: petsc-3.3-p6/src/dm/impls/da/gr2.c:360: status = H5Pset_chunk(chunkspace, dim, chunkDims); CHKERRQ(status); I know that 3.3.6 is "old", but 3.4.2 performs the same call, so I'm not any better off with that. This line causes problems, at least on a Cray XC30 machine, with HDF5 library telling me that "#010: H5Dchunk.c line 443 in H5D_chunk_construct(): chunk size must be < 4GB." I know chunked IO is good idea, but apparently PETSc uses disallowed chunksizes. Can I change the chunk size used by PETSc somehow or am I reduced to commenting out the two offending lines? (Which I already did and it solves the issue, but presumably leads to inferior performance.) Cheers, Juha -- ----------------------------------------------- | Juha J?ykk?, juhaj at iki.fi | | http://koti.kapsi.fi/~juhaj/ | ----------------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 836 bytes Desc: This is a digitally signed message part. URL: From suyan0 at gmail.com Sat Aug 17 18:46:13 2013 From: suyan0 at gmail.com (Su Yan) Date: Sat, 17 Aug 2013 18:46:13 -0500 Subject: [petsc-users] SNES Line Search Monitor In-Reply-To: References: Message-ID: I am using PETSc 3.4.1. This problem is reproducible. I tested several other cases, almost every case has the similar problem. 
Take this one as an example: with ./abc.bin -snes_monitor -snes_linesearch_monitor 0 SNES Function norm 1.470975405802e+06 Line search: lambdas = [1, 0.5, 0], fnorms = [7051.02, 735338, 1.47098e+06] Line search terminated: lambda = 0.999762, fnorms = 7026.38 1 SNES Function norm 7.026377609722e+03 Line search: lambdas = [1, 0.5, 0], fnorms = [189.711, 3514.55, 7026.38] Line search terminated: lambda = 0.999295, fnorms = 189.783 2 SNES Function norm 1.897827226353e+02 Line search: lambdas = [1, 0.5, 0], fnorms = [32.3811, 100.302, 189.783] Line search terminated: lambda = 1.01592, fnorms = 32.5125 3 SNES Function norm 3.251248928384e+01 Line search: lambdas = [1, 0.5, 0], fnorms = [0.00236829, 16.2564, 32.5125] Line search terminated: lambda = 1.00001, fnorms = 0.00233087 4 SNES Function norm 2.330867616400e-03 with ./abc.bin -snes_monitor 0 SNES Function norm 1.470975405802e+06 1 SNES Function norm 7.026377609170e+03 2 SNES Function norm 1.897827231670e+02 3 SNES Function norm 3.251248934919e+01 4 SNES Function norm 2.171496483661e-02 Significant difference can be observed. The equation I am solving is quite ill-conditioned. Without L2 line search it is hard to converge in some cases. Still try to figure out the reason. Thanks, Su On Sat, Aug 17, 2013 at 8:49 AM, Peter Brune wrote: > > > > On Sat, Aug 17, 2013 at 1:40 AM, Su Yan wrote: > >> Hi, I ran into something really weird when I tried to solve a nonlinear >> equation with Newton method and line search. Specifically, I used >> SNESSetType(snesMbr, SNESNEWTONLS); and >> PetscOptionsSetValue("-snes_linesearch_type", "l2"); >> >> > Which version of PETSc are you using? > > >> When I execute my program abc.bin with the following command: >> >> ./abc.bin -snes_monitor -snes_linesearch_monitor >> >> I got the following output: >> >> 0 SNES Function norm 1.457697974866e+07 >> Line search: lambdas = [1, 0.5, 0], fnorms = [669102, 7.35102e+06, >> 1.4577e+07] >> Line search terminated: lambda = 1.00553, fnorms = 652606 >> 1 SNES Function norm 6.526060362905e+05 >> Line search: lambdas = [1, 0.5, 0], fnorms = [3406.6, 326873, >> 652606] >> Line search terminated: lambda = 1.00171, fnorms = 2801.6 >> 2 SNES Function norm 2.801596249480e+03 >> Line search: lambdas = [1, 0.5, 0], fnorms = [2.51242, 1401.2, >> 2801.6] >> Line search terminated: lambda = 1.00029, fnorms = 2.09292 >> 3 SNES Function norm 2.092918540169e+00 >> Line search: lambdas = [1, 0.5, 0], fnorms = [0.000123295, >> 1.04646, 2.09292] >> Line search terminated: lambda = 1, fnorms = 0.000122588 >> 4 SNES Function norm 1.225883678418e-04 >> >> Converged Reason: FNORM_RELATIVE >> >> The nonlinear problem converged normally with a relative f_norm set as >> 1E-7. >> >> However, if I execute exactly the same program, but with a slightly >> different runtime command: >> >> ./abc.bin -snes_monitor >> >> I got the following output: >> >> 0 SNES Function norm 1.457697974975e+07 >> 1 SNES Function norm 6.526060348917e+05 >> 2 SNES Function norm 2.801608208510e+03 >> 3 SNES Function norm 2.450488738084e+03 >> 4 SNES Function norm 3.269507987119e+02 >> 5 SNES Function norm 3.016606325384e+02 >> 6 SNES Function norm 2.463851989463e+02 >> 7 SNES Function norm 1.546266418976e+02 >> 8 SNES Function norm 1.492518400407e+02 >> 9 SNES Function norm 1.477122410995e+02 >> 10 SNES Function norm 1.503359418680e+02 >> 11 SNES Function norm 1.504759910776e+02 >> 12 SNES Function norm 1.417592634863e+02 >> 13 SNES Function norm 3.047096130411e+05 >> >> > Is this problem reproducible? 
Note that even your 0th norm is very slightly different, and the first three norms are quite similar.

> and the solver diverged.
>
> The only difference was whether I used "-snes_linesearch_monitor" or not.
>
> In my understanding, this runtime option only turns on the screen print. So why did it make such a big difference? Is there anything special with this option turned on? Hope someone could help me out.

This is correct. It should not influence the solution at all, and all it does is enable printing. Also, the l2 line search takes significantly more work for well-behaved Newton convergence than bt, as bt automatically accepts the full step. l2 is mostly meant for cases where the step is automatically ill-scaled.

- Peter

> Thanks a lot.
>
> Regards,
> Su

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jedbrown at mcs.anl.gov Sat Aug 17 20:07:19 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Sat, 17 Aug 2013 20:07:19 -0500
Subject: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30
In-Reply-To: <2569883.TWAWmgZMUo@rigel>
References: <2569883.TWAWmgZMUo@rigel>
Message-ID: <87ioz36c3c.fsf@mcs.anl.gov>

Juha Jäykkä writes:

> Hi list!
>
> I wonder when and why (to make writing faster, I guess) this (and the corresponding line in src/vec/vec/impls/mpi/pdvec.c) was added:
>
> petsc-3.3-p6/src/dm/impls/da/gr2.c:360: status = H5Pset_chunk(chunkspace, dim, chunkDims); CHKERRQ(status);
>
> I know that 3.3.6 is "old", but 3.4.2 performs the same call, so I'm not any better off with that.
>
> This line causes problems, at least on a Cray XC30 machine, with the HDF5 library telling me that "#010: H5Dchunk.c line 443 in H5D_chunk_construct(): chunk size must be < 4GB."
>
> I know chunked IO is a good idea, but apparently PETSc uses disallowed chunk sizes.

It doesn't make any sense to use a chunk size that is as large as the entire spatial domain. As I understand it, H5Pset_chunk has the restriction that the chunk size cannot be larger than the data, so to be able to write small data files, PETSc has to set small chunk sizes for small data.

I think this issue can be fixed by changing

chunkDims[dim] = dims[dim];

to something like

chunkDims[dim] = PetscMin(dims[dim],dim_for_norminal_chunk_size);

Does anyone have data on what chunk sizes would make a good default? 10 MiB?

-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL:

From solvercorleone at gmail.com Sun Aug 18 03:00:44 2013
From: solvercorleone at gmail.com (Cong Li)
Date: Sun, 18 Aug 2013 17:00:44 +0900
Subject: [petsc-users] How can get the inverse matrix of P(transpose)AP
Message-ID:

Hi

I met a problem when trying to get an inverse matrix.

The problem is like this: I have a matrix P (an m x k matrix, with k << m) and A (an m x m large SPD matrix). Now I want to get the inverse matrix of P(transpose)*A*P.

Since P(transpose)*A*P is only a k x k matrix, and PETSc only supports sequential direct solvers, I want to store P(transpose)*A*P in MATSEQDENSE type. And the questions are
1. given P and A are PETSc parallel matrices, for instance MATMPIAIJ, how can I get the sequential dense matrix P(transpose)*A*P?
2. If I need to multiply P(transpose)*A*P with another parallel dense matrix, how can I do it?

Thanks a lot.

Cong

From juhaj at iki.fi Sun Aug 18 05:13:50 2013
From: juhaj at iki.fi (Juha =?ISO-8859-1?Q?J=E4ykk=E4?=)
Date: Sun, 18 Aug 2013 12:13:50 +0200
Subject: Re: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30
In-Reply-To: <87ioz36c3c.fsf@mcs.anl.gov>
References: <2569883.TWAWmgZMUo@rigel> <87ioz36c3c.fsf@mcs.anl.gov>
Message-ID: <13652175.rAY5a1HGtb@rigel>

> It doesn't make any sense to use a chunk size that is as large as the entire spatial domain. As I understand it, H5Pset_chunk has the

Indeed.
And even if the data was 1 TB (in my problematic case it was just 15 GB), a chunk size even approaching 4 GB is unlikely to be optimal: HDF5 will always read at least a chunk, so if you want to do anything with a small subset of the data, say just 10 MB, you'll end up reading 4 GB. > restriction that the chunk size cannot be larger than the data, so to be > able to write small data files, PETSc has to set small chunk sizes for > small data. For small files, chunking is probably not going to change performance in any significant manner, so one option could be to simply not chunk small files at all and then chunk big files "optimally" ? whatever that means. HDFgroup seems to think that "the chunk size be approximately equal to the average expected size of the data block needed by the application." (http://www.hdfgroup.org/training/HDFtraining/UsersGuide/Perform.fm2.html) For more chunking stuff: In the case of PETSc I think that means not the WHOLE application, but one MPI rank (or perhaps one SMP host running a mixture of MPI ranks and OpenMP threads), which is probably always going to be < 4 GB (except perhaps in the mixture case). > I think this issue can be fixed by changing > > chunkDims[dim] = dims[dim]; > > to something like > > chunkDims[dim] = PetscMin(dims[dim],dim_for_norminal_chunk_size); I'll see how that affects the performance in my case: turning chunking completely off works too, but I would not expect that to excel in performance. > Does anyone have data on what chunk sizes would make a good default? 10 > MiB? See above, but note also that there can at most be 64k chunks in the file, so fixing the chunk size to 10 MiB means limiting file size to 640 GiB. My suggestion is to give PETSc a little more logic here, something like this: if sizeof(data) > 4GiB * 64k: no chunking # impossible to chunk! elif sizeof(data) < small_file_limit: no chunking # probably best for speed elif current rank's data size < 4 GB: chunk using current ranks data size else divide current rank's data size by 2**(number of dimensions) until < 4 GB and then use that chunk size. It is obvious what dimensions the chunks would be in that case. I hope I covered all contingencies there. Cheers, Juha -- ----------------------------------------------- | Juha J?ykk?, juhaj at iki.fi | | http://koti.kapsi.fi/~juhaj/ | ----------------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 836 bytes Desc: This is a digitally signed message part. URL: From solvercorleone at gmail.com Sun Aug 18 05:28:39 2013 From: solvercorleone at gmail.com (Cong Li) Date: Sun, 18 Aug 2013 19:28:39 +0900 Subject: [petsc-users] How can I convert parall matrix to MATSEQDENSE Message-ID: Hi, all Could someone tell me how I can convert a parallel MATDENSE (or MATMPIAIJ) matrix to a MATSEQDENSE matrix? I need this converted sequential matrix for direct solver to get its inverse matrix. Thanks very much. Cong -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Aug 18 05:59:50 2013 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 18 Aug 2013 05:59:50 -0500 Subject: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30 In-Reply-To: <87ioz36c3c.fsf@mcs.anl.gov> References: <2569883.TWAWmgZMUo@rigel> <87ioz36c3c.fsf@mcs.anl.gov> Message-ID: On Sat, Aug 17, 2013 at 8:07 PM, Jed Brown wrote: > Juha J?ykk? writes: > > > Hi list! 
> > > > I wonder when and why (to make writing faster, I guess) this (and the > > corresponding line in src/vec/vec/impls/mpi/pdvec.c) was added: > > > > petsc-3.3-p6/src/dm/impls/da/gr2.c:360: status = > H5Pset_chunk(chunkspace, > > dim, chunkDims); CHKERRQ(status); > > > > I know that 3.3.6 is "old", but 3.4.2 performs the same call, so I'm not > any > > better off with that. > > > > This line causes problems, at least on a Cray XC30 machine, with HDF5 > library > > telling me that "#010: H5Dchunk.c line 443 in H5D_chunk_construct(): > chunk > > size must be < 4GB." > > > > I know chunked IO is good idea, but apparently PETSc uses disallowed > > chunksizes. > > It doesn't make any sense to use a chunk size that is as large as the > entire spatial domain. As I understand it, H5Pset_chunk has the > restriction that the chunk size cannot be larger than the data, so to be > able to write small data files, PETSc has to set small chunk sizes for > small data. > > I think this issue can be fixed by changing > > chunkDims[dim] = dims[dim]; > > to something like > > chunkDims[dim] = PetscMin(dims[dim],dim_for_norminal_chunk_size); > > Does anyone have data on what chunk sizes would make a good default? 10 > MiB? > This was not a performance optimization. IIRC, I did this so we could leave the time domain of unspecified length, and write a chunk of values at each timestep. Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Sun Aug 18 08:10:19 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sun, 18 Aug 2013 08:10:19 -0500 Subject: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30 In-Reply-To: <13652175.rAY5a1HGtb@rigel> References: <2569883.TWAWmgZMUo@rigel> <87ioz36c3c.fsf@mcs.anl.gov> <13652175.rAY5a1HGtb@rigel> Message-ID: <8738q75emc.fsf@mcs.anl.gov> Juha J?ykk? writes: > For small files, chunking is probably not going to change performance in any > significant manner, so one option could be to simply not chunk small files at > all This is effectively what is done now, considering that HDF5 needs chunking to be enabled to use H5S_UNLIMITED. > and then chunk big files "optimally" ? whatever that means. HDFgroup > seems to think that "the chunk size be approximately equal to the > average expected size of the data block needed by the application." > (http://www.hdfgroup.org/training/HDFtraining/UsersGuide/Perform.fm2.html) > For more chunking stuff: > > In the case of PETSc I think that means not the WHOLE application, but one MPI > rank (or perhaps one SMP host running a mixture of MPI ranks and OpenMP > threads), which is probably always going to be < 4 GB (except perhaps in the > mixture case). Output uses a collective write, so the granularity of the IO node is probably more relevant for writing (e.g., BG/Q would have one IO node per 128 compute nodes), but almost any chunk size should perform similarly. It would make a lot more difference for something like visualization where subsets of the data are read, typically with independent IO. > turning chunking completely off works too Are you sure? Did you try writing a second time step? The documentation says that H5S_UNLIMITED requires chunking. 
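A concrete sketch of the PetscMin()-style cap discussed in this thread (illustrative only: the helper name, the 10 MiB target, and the halving strategy are assumptions, not PETSc source):

#include <hdf5.h>

/* Illustrative only: cap the chunk dimensions so the chunk stays below a
   nominal byte target before calling H5Pset_chunk(). 'CapChunkDims' and
   'target_bytes' are invented names for this sketch. */
static void CapChunkDims(int dim, const hsize_t dims[], hsize_t chunkDims[],
                         size_t elem_bytes, hsize_t target_bytes)
{
  hsize_t total = elem_bytes;
  int     d;

  for (d = 0; d < dim; d++) {
    chunkDims[d] = dims[d];                 /* start from the full extent */
    total       *= dims[d];
  }
  /* Repeatedly halve the largest dimension until the chunk fits the target. */
  while (total > target_bytes) {
    int dmax = 0;
    for (d = 1; d < dim; d++) if (chunkDims[d] > chunkDims[dmax]) dmax = d;
    if (chunkDims[dmax] == 1) break;        /* cannot shrink any further */
    total /= chunkDims[dmax];
    chunkDims[dmax] = (chunkDims[dmax] + 1) / 2;
    total *= chunkDims[dmax];
  }
}

H5Pset_chunk(chunkspace, dim, chunkDims) would then be called with the capped dimensions; a target of around 10 MiB would also stay comfortably within the 64k-chunk limit raised just below.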
> See above, but note also that there can at most be 64k chunks in the file, so fixing the chunk size to 10 MiB means limiting file size to 640 GiB.

Thanks for noticing this limit. This might come from the 64k limit on attribute sizes.

> My suggestion is to give PETSc a little more logic here, something like this:
>
> if sizeof(data) > 4GiB * 64k: no chunking # impossible to chunk!
> elif sizeof(data) < small_file_limit: no chunking # probably best for speed
> elif current rank's data size < 4 GB: chunk using current rank's data size

Chunk size needs to be collective. We could compute an average size from each subdomain, but can't just use the subdomain size.

> else divide current rank's data size by 2**(number of dimensions) until < 4 GB and then use that chunk size.

We might want the chunk size to be smaller than 4GiB anyway to avoid out-of-memory problems for readers and writers.

I think the chunk size (or maximum chunk size) should be settable by the user.

-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL:

From jedbrown at mcs.anl.gov Sun Aug 18 08:12:51 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Sun, 18 Aug 2013 08:12:51 -0500
Subject: Re: [petsc-users] How can I convert parall matrix to MATSEQDENSE
In-Reply-To:
References:
Message-ID: <87zjsf3zxo.fsf@mcs.anl.gov>

Cong Li writes:

> Hi, all
>
> Could someone tell me how I can convert a parallel MATDENSE (or MATMPIAIJ) matrix to a MATSEQDENSE matrix?

Get the array and use MPI_Gather. With MATMPIAIJ, you almost certainly should not compute an explicit inverse.

> I need this converted sequential matrix for direct solver to get its inverse matrix.

You can do that with Elemental without needing to serialize the matrix inversion. Also, why do you need an explicit inverse?

-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL:

From jedbrown at mcs.anl.gov Sun Aug 18 08:15:53 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Sun, 18 Aug 2013 08:15:53 -0500
Subject: Re: [petsc-users] How can get the inverse matrix of P(transpose)AP
In-Reply-To:
References:
Message-ID: <87wqnj3zsm.fsf@mcs.anl.gov>

Cong Li writes:

> Hi
>
> I met a problem when trying to get an inverse matrix.
>
> The problem is like this: I have a matrix P (an m x k matrix, with k << m) and A (an m x m large SPD matrix). Now I want to get the inverse matrix of P(transpose)*A*P.

Can you back up and explain at a high level what you're trying to accomplish?

> Since P(transpose)*A*P is only a k x k matrix, and PETSc only supports sequential direct solvers, I want to store P(transpose)*A*P in MATSEQDENSE type.
> And the questions are
> 1. given P and A are PETSc parallel matrices, for instance MATMPIAIJ, how can I get the sequential dense matrix P(transpose)*A*P?
> 2. If I need to multiply P(transpose)*A*P with another parallel dense matrix, how can I do it?
>
> Thanks a lot.
>
> Cong

From prbrune at gmail.com Sun Aug 18 08:28:30 2013
From: prbrune at gmail.com (Peter Brune)
Date: Sun, 18 Aug 2013 08:28:30 -0500
Subject: Re: [petsc-users] SNES Line Search Monitor
In-Reply-To:
References:
Message-ID:

On Sat, Aug 17, 2013 at 6:46 PM, Su Yan wrote:

> I am using PETSc 3.4.1. This problem is reproducible.
> This is not reproduction of the problem. Does it happen with a simple test problem that you can allow us to debug? Additionally, does your problem have the same convergence history every time that it is run with the same options? If not, then the -snes_linesearch_monitor is most likely a misdiagnosis of some other difficulty. - Peter I tested several other cases, almost every case has the similar problem. > Take this one as an example: > > with ./abc.bin -snes_monitor -snes_linesearch_monitor > > 0 SNES Function norm 1.470975405802e+06 > Line search: lambdas = [1, 0.5, 0], fnorms = [7051.02, 735338, > 1.47098e+06] > Line search terminated: lambda = 0.999762, fnorms = 7026.38 > 1 SNES Function norm 7.026377609722e+03 > Line search: lambdas = [1, 0.5, 0], fnorms = [189.711, 3514.55, > 7026.38] > Line search terminated: lambda = 0.999295, fnorms = 189.783 > 2 SNES Function norm 1.897827226353e+02 > Line search: lambdas = [1, 0.5, 0], fnorms = [32.3811, 100.302, > 189.783] > Line search terminated: lambda = 1.01592, fnorms = 32.5125 > 3 SNES Function norm 3.251248928384e+01 > Line search: lambdas = [1, 0.5, 0], fnorms = [0.00236829, 16.2564, > 32.5125] > Line search terminated: lambda = 1.00001, fnorms = 0.00233087 > 4 SNES Function norm 2.330867616400e-03 > > with ./abc.bin -snes_monitor > > 0 SNES Function norm 1.470975405802e+06 > 1 SNES Function norm 7.026377609170e+03 > 2 SNES Function norm 1.897827231670e+02 > 3 SNES Function norm 3.251248934919e+01 > 4 SNES Function norm 2.171496483661e-02 > > Significant difference can be observed. The equation I am solving is quite > ill-conditioned. Without L2 line search it is hard to converge in some > cases. Still try to figure out the reason. > > Thanks, > Su > > On Sat, Aug 17, 2013 at 8:49 AM, Peter Brune wrote: > >> >> >> >> On Sat, Aug 17, 2013 at 1:40 AM, Su Yan wrote: >> >>> Hi, I ran into something really weird when I tried to solve a nonlinear >>> equation with Newton method and line search. Specifically, I used >>> SNESSetType(snesMbr, SNESNEWTONLS); and >>> PetscOptionsSetValue("-snes_linesearch_type", "l2"); >>> >>> >> Which version of PETSc are you using? >> >> >>> When I execute my program abc.bin with the following command: >>> >>> ./abc.bin -snes_monitor -snes_linesearch_monitor >>> >>> I got the following output: >>> >>> 0 SNES Function norm 1.457697974866e+07 >>> Line search: lambdas = [1, 0.5, 0], fnorms = [669102, >>> 7.35102e+06, 1.4577e+07] >>> Line search terminated: lambda = 1.00553, fnorms = 652606 >>> 1 SNES Function norm 6.526060362905e+05 >>> Line search: lambdas = [1, 0.5, 0], fnorms = [3406.6, 326873, >>> 652606] >>> Line search terminated: lambda = 1.00171, fnorms = 2801.6 >>> 2 SNES Function norm 2.801596249480e+03 >>> Line search: lambdas = [1, 0.5, 0], fnorms = [2.51242, 1401.2, >>> 2801.6] >>> Line search terminated: lambda = 1.00029, fnorms = 2.09292 >>> 3 SNES Function norm 2.092918540169e+00 >>> Line search: lambdas = [1, 0.5, 0], fnorms = [0.000123295, >>> 1.04646, 2.09292] >>> Line search terminated: lambda = 1, fnorms = 0.000122588 >>> 4 SNES Function norm 1.225883678418e-04 >>> >>> Converged Reason: FNORM_RELATIVE >>> >>> The nonlinear problem converged normally with a relative f_norm set as >>> 1E-7. 
>>> >>> However, if I execute exactly the same program, but with a slightly >>> different runtime command: >>> >>> ./abc.bin -snes_monitor >>> >>> I got the following output: >>> >>> 0 SNES Function norm 1.457697974975e+07 >>> 1 SNES Function norm 6.526060348917e+05 >>> 2 SNES Function norm 2.801608208510e+03 >>> 3 SNES Function norm 2.450488738084e+03 >>> 4 SNES Function norm 3.269507987119e+02 >>> 5 SNES Function norm 3.016606325384e+02 >>> 6 SNES Function norm 2.463851989463e+02 >>> 7 SNES Function norm 1.546266418976e+02 >>> 8 SNES Function norm 1.492518400407e+02 >>> 9 SNES Function norm 1.477122410995e+02 >>> 10 SNES Function norm 1.503359418680e+02 >>> 11 SNES Function norm 1.504759910776e+02 >>> 12 SNES Function norm 1.417592634863e+02 >>> 13 SNES Function norm 3.047096130411e+05 >>> >>> >> Is this problem reproducible? Note that even your 0th norm is very >> slightly different, and the first three norms are quite similar. >> >> >>> and the solver diverged. >>> >>> The only difference was that whether I used "-snes_linesearch_monitor" >>> or not. >>> >>> In my understanding, this runtime option only turns on the screen print. >>> So why did it make such a big difference? Is there anything special with >>> this option turned on? Hope someone could help me out. >>> >> >> This is correct. It should not influence the solution at all, and all it >> does is enable printing. Also, the l2 line search takes significantly more >> work for well-behaved Newton convergence than bt, as bt automatically >> accepts the full step. l2 is mostly meant for cases where the step is >> automatically ill-scaled. >> >> - Peter >> >> >>> >>> Thanks a lot. >>> >>> Regards, >>> Su >>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Aug 18 09:03:12 2013 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 18 Aug 2013 09:03:12 -0500 Subject: [petsc-users] SNES Line Search Monitor In-Reply-To: References: Message-ID: On Sun, Aug 18, 2013 at 8:28 AM, Peter Brune wrote: > > > > On Sat, Aug 17, 2013 at 6:46 PM, Su Yan wrote: > >> I am using PETSc 3.4.1. This problem is reproducible. >> > > This is not reproduction of the problem. Does it happen with a simple > test problem that you can allow us to debug? > Please repeat the runs with -snes_view. That way we can see what line search is used in each run. Matt > Additionally, does your problem have the same convergence history every > time that it is run with the same options? If not, then the > -snes_linesearch_monitor is most likely a misdiagnosis of some other > difficulty. > > - Peter > > I tested several other cases, almost every case has the similar problem. 
>> Take this one as an example: >> >> with ./abc.bin -snes_monitor -snes_linesearch_monitor >> >> 0 SNES Function norm 1.470975405802e+06 >> Line search: lambdas = [1, 0.5, 0], fnorms = [7051.02, 735338, >> 1.47098e+06] >> Line search terminated: lambda = 0.999762, fnorms = 7026.38 >> 1 SNES Function norm 7.026377609722e+03 >> Line search: lambdas = [1, 0.5, 0], fnorms = [189.711, 3514.55, >> 7026.38] >> Line search terminated: lambda = 0.999295, fnorms = 189.783 >> 2 SNES Function norm 1.897827226353e+02 >> Line search: lambdas = [1, 0.5, 0], fnorms = [32.3811, 100.302, >> 189.783] >> Line search terminated: lambda = 1.01592, fnorms = 32.5125 >> 3 SNES Function norm 3.251248928384e+01 >> Line search: lambdas = [1, 0.5, 0], fnorms = [0.00236829, 16.2564, >> 32.5125] >> Line search terminated: lambda = 1.00001, fnorms = 0.00233087 >> 4 SNES Function norm 2.330867616400e-03 >> >> with ./abc.bin -snes_monitor >> >> 0 SNES Function norm 1.470975405802e+06 >> 1 SNES Function norm 7.026377609170e+03 >> 2 SNES Function norm 1.897827231670e+02 >> 3 SNES Function norm 3.251248934919e+01 >> 4 SNES Function norm 2.171496483661e-02 >> >> Significant difference can be observed. The equation I am solving is >> quite ill-conditioned. Without L2 line search it is hard to converge in >> some cases. Still try to figure out the reason. >> >> Thanks, >> Su >> >> On Sat, Aug 17, 2013 at 8:49 AM, Peter Brune wrote: >> >>> >>> >>> >>> On Sat, Aug 17, 2013 at 1:40 AM, Su Yan wrote: >>> >>>> Hi, I ran into something really weird when I tried to solve a nonlinear >>>> equation with Newton method and line search. Specifically, I used >>>> SNESSetType(snesMbr, SNESNEWTONLS); and >>>> PetscOptionsSetValue("-snes_linesearch_type", "l2"); >>>> >>>> >>> Which version of PETSc are you using? >>> >>> >>>> When I execute my program abc.bin with the following command: >>>> >>>> ./abc.bin -snes_monitor -snes_linesearch_monitor >>>> >>>> I got the following output: >>>> >>>> 0 SNES Function norm 1.457697974866e+07 >>>> Line search: lambdas = [1, 0.5, 0], fnorms = [669102, >>>> 7.35102e+06, 1.4577e+07] >>>> Line search terminated: lambda = 1.00553, fnorms = 652606 >>>> 1 SNES Function norm 6.526060362905e+05 >>>> Line search: lambdas = [1, 0.5, 0], fnorms = [3406.6, 326873, >>>> 652606] >>>> Line search terminated: lambda = 1.00171, fnorms = 2801.6 >>>> 2 SNES Function norm 2.801596249480e+03 >>>> Line search: lambdas = [1, 0.5, 0], fnorms = [2.51242, 1401.2, >>>> 2801.6] >>>> Line search terminated: lambda = 1.00029, fnorms = 2.09292 >>>> 3 SNES Function norm 2.092918540169e+00 >>>> Line search: lambdas = [1, 0.5, 0], fnorms = [0.000123295, >>>> 1.04646, 2.09292] >>>> Line search terminated: lambda = 1, fnorms = 0.000122588 >>>> 4 SNES Function norm 1.225883678418e-04 >>>> >>>> Converged Reason: FNORM_RELATIVE >>>> >>>> The nonlinear problem converged normally with a relative f_norm set as >>>> 1E-7. 
>>>> However, if I execute exactly the same program, but with a slightly different runtime command:
>>>>
>>>> ./abc.bin -snes_monitor
>>>>
>>>> I got the following output:
>>>>
>>>> 0 SNES Function norm 1.457697974975e+07
>>>> 1 SNES Function norm 6.526060348917e+05
>>>> 2 SNES Function norm 2.801608208510e+03
>>>> 3 SNES Function norm 2.450488738084e+03
>>>> 4 SNES Function norm 3.269507987119e+02
>>>> 5 SNES Function norm 3.016606325384e+02
>>>> 6 SNES Function norm 2.463851989463e+02
>>>> 7 SNES Function norm 1.546266418976e+02
>>>> 8 SNES Function norm 1.492518400407e+02
>>>> 9 SNES Function norm 1.477122410995e+02
>>>> 10 SNES Function norm 1.503359418680e+02
>>>> 11 SNES Function norm 1.504759910776e+02
>>>> 12 SNES Function norm 1.417592634863e+02
>>>> 13 SNES Function norm 3.047096130411e+05
>>>
>>> Is this problem reproducible? Note that even your 0th norm is very slightly different, and the first three norms are quite similar.
>>>
>>>> and the solver diverged.
>>>>
>>>> The only difference was whether I used "-snes_linesearch_monitor" or not.
>>>>
>>>> In my understanding, this runtime option only turns on the screen print. So why did it make such a big difference? Is there anything special with this option turned on? Hope someone could help me out.
>>>
>>> This is correct. It should not influence the solution at all, and all it does is enable printing. Also, the l2 line search takes significantly more work for well-behaved Newton convergence than bt, as bt automatically accepts the full step. l2 is mostly meant for cases where the step is automatically ill-scaled.
>>>
>>> - Peter
>>>
>>>> Thanks a lot.
>>>>
>>>> Regards,
>>>> Su

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From solvercorleone at gmail.com Sun Aug 18 10:07:10 2013
From: solvercorleone at gmail.com (Cong Li)
Date: Mon, 19 Aug 2013 00:07:10 +0900
Subject: Re: [petsc-users] How can I convert parall matrix to MATSEQDENSE
In-Reply-To: <87zjsf3zxo.fsf@mcs.anl.gov>
References: <87zjsf3zxo.fsf@mcs.anl.gov>
Message-ID:

Thanks for the answer.

Do you mean get each process's local data and save it in a C array locally, then let the program call MPI_Gather to combine the arrays on one process?

Actually I am trying to implement the block CG method for my research. It is like this: AX=B, where X and B are m x k matrices and A is an m x m large sparse matrix.

Given initial guess X0, R0=B-AX
P=R0
for i=1,2,...do
gamma(i)=inv(Pi(transpose)APi)Pi(transpose)Ri
X(i+1) = Xi +P(i) gamma(i)
.....
.....psi(i) = - inv(Pi(transpose)APi)PiA(transpose)R(i+1)
.....

Since the program stores Pi(transpose)APi in MATMPIAIJ (because I am using the MatPtAP call and A is MATMPIAIJ, I have to save the result in MATMPIAIJ for compatibility), I need to convert Pi(transpose)APi to MATSEQDENSE to get its inverse.

By the way, I also think Elemental is a good choice, but I don't know how to call its functions from PETSc. Could you show me an example (data type conversion from MATMPIAIJ to the Elemental type, and how I can call Elemental functions in PETSc code)?
Thanks so much

Cong

On Sun, Aug 18, 2013 at 10:12 PM, Jed Brown wrote:

> Cong Li writes:
>
> > Hi, all
> >
> > Could someone tell me how I can convert a parallel MATDENSE (or MATMPIAIJ) matrix to a MATSEQDENSE matrix?
>
> Get the array and use MPI_Gather. With MATMPIAIJ, you almost certainly should not compute an explicit inverse.
>
> > I need this converted sequential matrix for direct solver to get its inverse matrix.
>
> You can do that with Elemental without needing to serialize the matrix inversion. Also, why do you need an explicit inverse?

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From solvercorleone at gmail.com Sun Aug 18 10:20:02 2013
From: solvercorleone at gmail.com (Cong Li)
Date: Mon, 19 Aug 2013 00:20:02 +0900
Subject: [petsc-users] Fwd: How can get the inverse matrix of P(transpose)AP
In-Reply-To:
References: <87wqnj3zsm.fsf@mcs.anl.gov>
Message-ID:

---------- Forwarded message ----------
From: Cong Li
Date: Mon, Aug 19, 2013 at 12:19 AM
Subject: Re: [petsc-users] How can get the inverse matrix of P(transpose)AP
To: Jed Brown

Thanks for the reply.

The question I am asking here is closely related to the question I asked in "How can I convert parall matrix to MATSEQDENSE".

Actually I am trying to implement the block CG (conjugate gradient) method for my research. It is like this: AX=B, where X and B are m x k matrices and A is an m x m large sparse matrix.

Given initial guess X0, R0=B-AX
P=R0
for i=1,2,...do
gamma(i)=inv(Pi(transpose)APi)Pi(transpose)Ri
X(i+1) = Xi +P(i) gamma(i)
.....
.....psi(i) = - inv(Pi(transpose)APi)PiA(transpose)R(i+1)
.....

So I need to get the inverse of Pi(transpose)APi.

Cong

On Sun, Aug 18, 2013 at 10:15 PM, Jed Brown wrote:

> Cong Li writes:
>
> > Hi
> >
> > I met a problem when trying to get an inverse matrix.
> >
> > The problem is like this: I have a matrix P (an m x k matrix, with k << m) and A (an m x m large SPD matrix). Now I want to get the inverse matrix of P(transpose)*A*P.
>
> Can you back up and explain at a high level what you're trying to accomplish?
>
> > Since P(transpose)*A*P is only a k x k matrix, and PETSc only supports sequential direct solvers, I want to store P(transpose)*A*P in MATSEQDENSE type.
> > And the questions are
> > 1. given P and A are PETSc parallel matrices, for instance MATMPIAIJ, how can I get the sequential dense matrix P(transpose)*A*P?
> > 2. If I need to multiply P(transpose)*A*P with another parallel dense matrix, how can I do it?
> >
> > Thanks a lot.
> >
> > Cong

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jedbrown at mcs.anl.gov Sun Aug 18 12:55:16 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Sun, 18 Aug 2013 12:55:16 -0500
Subject: Re: [petsc-users] How can I convert parall matrix to MATSEQDENSE
In-Reply-To:
References: <87zjsf3zxo.fsf@mcs.anl.gov>
Message-ID: <8761v251ff.fsf@mcs.anl.gov>

Cong Li writes:

> Thanks for the answer.
>
> Do you mean get each process's local data and save it in a C array locally, then let the program call MPI_Gather to combine the arrays on one process?
>
> Actually I am trying to implement the block CG method for my research.

Okay, then you don't want the P^T A P matrix to ever be MPIAIJ. You can compute it by doing the local product and MPI_Allreduce'ing so that everyone has the k*k matrix.

> It is like this: AX=B, where X and B are m x k matrices and A is an m x m large sparse matrix.
>
> Given initial guess X0, R0=B-AX
> P=R0
> for i=1,2,...do
> gamma(i)=inv(Pi(transpose)APi)Pi(transpose)Ri

This should be a solve, not "inv".

> X(i+1) = Xi +P(i) gamma(i)
> .....
> .....psi(i) = - inv(Pi(transpose)APi)PiA(transpose)R(i+1)
> .....
>
> Since the program stores Pi(transpose)APi in MATMPIAIJ (because I am using the MatPtAP call and A is MATMPIAIJ, I have to save the result in MATMPIAIJ for compatibility), I need to convert Pi(transpose)APi to MATSEQDENSE to get its inverse.

I'm afraid PETSc does not have a native dense format that stores redundantly, but that's what you want. (In Elemental notation, this is a [*,*] distribution.) This could be added to PETSc and we can advise if you are interested in doing that, but if you want to do the minimal work to implement your method, you should do the product (A P), then get out the arrays and do the local part followed by an MPI_Allreduce.

> By the way, I also think Elemental is a good choice, but I don't know how to call its functions from PETSc. Could you show me an example (data type conversion from MATMPIAIJ to the Elemental type, and how I can call Elemental functions in PETSc code)?

Your problem doesn't really benefit from Elemental since you really just need local operations. The 2d distributions that PETSc's Elemental support uses are not what you want for your purposes.

-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL:

From juhaj at iki.fi Sun Aug 18 14:16:45 2013
From: juhaj at iki.fi (Juha =?ISO-8859-1?Q?J=E4ykk=E4?=)
Date: Sun, 18 Aug 2013 21:16:45 +0200
Subject: Re: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30
In-Reply-To: <8738q75emc.fsf@mcs.anl.gov>
References: <2569883.TWAWmgZMUo@rigel> <13652175.rAY5a1HGtb@rigel> <8738q75emc.fsf@mcs.anl.gov>
Message-ID: <9648631.6FT9W9zWr3@rigel>

> similarly. It would make a lot more difference for something like visualization where subsets of the data are read, typically with independent IO.

True. Fortunately, h5repack can change the chunking afterwards if necessary. Of course, for truly huge files that's going to take ages.

> > turning chunking completely off works too
> Are you sure? Did you try writing a second time step? The documentation says that H5S_UNLIMITED requires chunking.

Yes. No. I just solve an equation, so I'm not interested in saving steps. Sometimes when I do, I save them into separate files. I tried to save them as time-steps sometime around 3.1 or 3.2, but at that point there were only two possibilities: Read and Truncate, so adding another timestep was impossible. From what I see, there now is a FILE_MODE_APPEND possibility, but it won't be worth it for me.

> > See above, but note also that there can at most be 64k chunks in the file, so fixing the chunk size to 10 MiB means limiting file size to 640 GiB.
> Thanks for noticing this limit. This might come from the 64k limit on attribute sizes.

You're welcome.

> Chunk size needs to be collective. We could compute an average size from each subdomain, but can't just use the subdomain size.

Right. The general idea should still work the same way.

> We might want the chunk size to be smaller than 4GiB anyway to avoid out-of-memory problems for readers and writers.
I decided against making the OOM point because I came to the conclusion 4 GiB is not going to restrict any system where one would contemplate solving problems of such size: any cluster or "true" supercomputer will have so many cores / node that even a modest 1 GiB / core will give > 4 GiB total. But I agree anyway: there's no point in insisting on such huge chunk size unless there is a huge performance benefit to gain. And even then it might well... > I think the chunk size (or maximum chunk size) should be settable by the > user. to be explicitly set by the user by this option, so that first-time-users and other less savvy users do not get hurt by out-of-memory errors if they are unlucky enough to not have more memory on their HPC system. Cheers, Juha -- ----------------------------------------------- | Juha J?ykk?, juhaj at iki.fi | | http://koti.kapsi.fi/~juhaj/ | ----------------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 836 bytes Desc: This is a digitally signed message part. URL: From jedbrown at mcs.anl.gov Sun Aug 18 14:31:05 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sun, 18 Aug 2013 14:31:05 -0500 Subject: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30 In-Reply-To: <9648631.6FT9W9zWr3@rigel> References: <2569883.TWAWmgZMUo@rigel> <13652175.rAY5a1HGtb@rigel> <8738q75emc.fsf@mcs.anl.gov> <9648631.6FT9W9zWr3@rigel> Message-ID: <8738q64wzq.fsf@mcs.anl.gov> Juha J?ykk? writes: > I decided against making the OOM point because I came to the conclusion 4 GiB > is not going to restrict any system where one would contemplate solving > problems of such size: any cluster or "true" supercomputer will have so many > cores / node that even a modest 1 GiB / core will give > 4 GiB total. I was thinking of reader workflows such as visualization in which each core might independently ask to read a small part of the file. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From juhaj at iki.fi Sun Aug 18 14:36:41 2013 From: juhaj at iki.fi (Juha =?ISO-8859-1?Q?J=E4ykk=E4?=) Date: Sun, 18 Aug 2013 21:36:41 +0200 Subject: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30 In-Reply-To: <8738q64wzq.fsf@mcs.anl.gov> References: <2569883.TWAWmgZMUo@rigel> <9648631.6FT9W9zWr3@rigel> <8738q64wzq.fsf@mcs.anl.gov> Message-ID: <2444325.YdqAnjTcGB@rigel> > I was thinking of reader workflows such as visualization in which each > core might independently ask to read a small part of the file. Uh, there a 4 GiB chunk would be a disaster indeed! No matter if you have the memory or not, reading 4 GiB of data to access, say 10 MiB is not the winner. Cheers, -Juha -- ----------------------------------------------- | Juha J?ykk?, juhaj at iki.fi | | http://koti.kapsi.fi/~juhaj/ | ----------------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 836 bytes Desc: This is a digitally signed message part. URL: From gnw20 at cam.ac.uk Mon Aug 19 07:25:07 2013 From: gnw20 at cam.ac.uk (Garth N. 
Wells) Date: Mon, 19 Aug 2013 13:25:07 +0100 Subject: [petsc-users] Attach approximate null space to a fieldsplit block Message-ID: I'm using PCFIELDSPLIT for a mixed problem, with fields indicated via an index set (using PCFieldSplitSetIS). I'd like to attach an approximate null space to the A00 block of the system. Is there a way to attach the null space to one block? Garth From jedbrown at mcs.anl.gov Mon Aug 19 07:40:03 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 19 Aug 2013 07:40:03 -0500 Subject: [petsc-users] Attach approximate null space to a fieldsplit block In-Reply-To: References: Message-ID: <87ppt926sc.fsf@mcs.anl.gov> "Garth N. Wells" writes: > I'm using PCFIELDSPLIT for a mixed problem, with fields indicated via > an index set (using PCFieldSplitSetIS). I'd like to attach an > approximate null space to the A00 block of the system. Is there a way > to attach the null space to one block? With MatNest, you can use MatSetNearNullSpace on sub-blocks and it will be used automatically. This is also the low-memory way to use PCFieldSplit. Note that if you use MatSetValuesLocal, you can use the NEST format with identical code in assembly; only matrix creation is different, see src/snes/example/tutorials/ex28.c for an example that works with both AIJ and NEST formats. Otherwise, you should be able to PCSetUp, then get out the sub-solvers PCFieldSplitGetSubKSP, pull out their matrices, and call MatSetNearNullSpace. Those submatrices should not be overwritten as long as you use SAME_NONZERO_PATTERN, so the null space would still work in future iterations. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From knepley at gmail.com Mon Aug 19 08:03:34 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 19 Aug 2013 08:03:34 -0500 Subject: [petsc-users] Attach approximate null space to a fieldsplit block In-Reply-To: <87ppt926sc.fsf@mcs.anl.gov> References: <87ppt926sc.fsf@mcs.anl.gov> Message-ID: On Mon, Aug 19, 2013 at 7:40 AM, Jed Brown wrote: > "Garth N. Wells" writes: > > > I'm using PCFIELDSPLIT for a mixed problem, with fields indicated via > > an index set (using PCFieldSplitSetIS). I'd like to attach an > > approximate null space to the A00 block of the system. Is there a way > > to attach the null space to one block? > There is a better way. You can attach the null space to the IS that forms the split PetscObjectAttach((PetscObject) is0, "nullspace", nullspace); and PCFIELDSPLIT will pull it out and attach it to the preconditioner for block 0. Matt > With MatNest, you can use MatSetNearNullSpace on sub-blocks and it will > be used automatically. This is also the low-memory way to use > PCFieldSplit. Note that if you use MatSetValuesLocal, you can use the > NEST format with identical code in assembly; only matrix creation is > different, see src/snes/example/tutorials/ex28.c for an example that > works with both AIJ and NEST formats. > > Otherwise, you should be able to PCSetUp, then get out the sub-solvers > PCFieldSplitGetSubKSP, pull out their matrices, and call > MatSetNearNullSpace. Those submatrices should not be overwritten as > long as you use SAME_NONZERO_PATTERN, so the null space would still work > in future iterations. > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bsmith at mcs.anl.gov Mon Aug 19 09:01:30 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Mon, 19 Aug 2013 09:01:30 -0500
Subject: Re: [petsc-users] Attach approximate null space to a fieldsplit block
In-Reply-To:
References: <87ppt926sc.fsf@mcs.anl.gov>
Message-ID: <4AAF4D39-9AB8-448C-BFAB-1EA2749F97CF@mcs.anl.gov>

To prevent confusion: it is PetscObjectCompose() not attach.

On Aug 19, 2013, at 8:03 AM, Matthew Knepley wrote:

> On Mon, Aug 19, 2013 at 7:40 AM, Jed Brown wrote:
> "Garth N. Wells" writes:
>
> > I'm using PCFIELDSPLIT for a mixed problem, with fields indicated via
> > an index set (using PCFieldSplitSetIS). I'd like to attach an
> > approximate null space to the A00 block of the system. Is there a way
> > to attach the null space to one block?
>
> There is a better way. You can attach the null space to the IS that forms the split
>
> PetscObjectAttach((PetscObject) is0, "nullspace", nullspace);
>
> and PCFIELDSPLIT will pull it out and attach it to the preconditioner for block 0.
>
> Matt
>
> With MatNest, you can use MatSetNearNullSpace on sub-blocks and it will
> be used automatically. This is also the low-memory way to use
> PCFieldSplit. Note that if you use MatSetValuesLocal, you can use the
> NEST format with identical code in assembly; only matrix creation is
> different, see src/snes/example/tutorials/ex28.c for an example that
> works with both AIJ and NEST formats.
>
> Otherwise, you should be able to PCSetUp, then get out the sub-solvers
> PCFieldSplitGetSubKSP, pull out their matrices, and call
> MatSetNearNullSpace. Those submatrices should not be overwritten as
> long as you use SAME_NONZERO_PATTERN, so the null space would still work
> in future iterations.
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
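A minimal sketch combining Matt's suggestion with Barry's corrected call; the routine name and the assumption that the near-null space vectors are already built are mine, while the "nullspace" key is as given in the thread:

#include <petscksp.h>

/* Sketch: compose a near-null space on the IS that defines the A00 split,
   under the "nullspace" key that PCFIELDSPLIT looks up. 'is0' is the IS
   passed to PCFieldSplitSetIS(); 'modes' are built by the application
   (e.g. rigid body modes). */
PetscErrorCode AttachSplitNullSpace(IS is0, PetscInt n, Vec modes[])
{
  MatNullSpace   nullsp;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_FALSE, n, modes, &nullsp);CHKERRQ(ierr);
  ierr = PetscObjectCompose((PetscObject) is0, "nullspace", (PetscObject) nullsp);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr); /* is0 now holds a reference */
  PetscFunctionReturn(0);
}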
From hgbk2008 at gmail.com Tue Aug 20 07:09:20 2013
From: hgbk2008 at gmail.com (Hoang Giang Bui)
Date: Tue, 20 Aug 2013 14:09:20 +0200
Subject: [petsc-users] c++ wrapper for petsc
Message-ID: <52135C70.1070709@gmail.com>

Hi PETSc developers,

I wonder if there is a good, optimized C++ wrapper for PETSc. Currently I write my own wrapper, and its assembly speed is not very good. Some libraries (deal.II, FEniCS) have PETSc wrappers, but I don't want to link to the whole library just to use the wrapper. Do you have any advice?

BR
Bui

From knepley at gmail.com Tue Aug 20 07:10:54 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 20 Aug 2013 07:10:54 -0500
Subject: Re: [petsc-users] c++ wrapper for petsc
In-Reply-To: <52135C70.1070709@gmail.com>
References: <52135C70.1070709@gmail.com>
Message-ID:

On Tue, Aug 20, 2013 at 7:09 AM, Hoang Giang Bui wrote:

> Hi PETSc developers,
>
> I wonder if there is a good, optimized C++ wrapper for PETSc. Currently I write my own wrapper, and its assembly speed is not very good. Some libraries (deal.II, FEniCS) have PETSc wrappers, but I don't want to link to the whole library just to use the wrapper. Do you have any advice?

Why do you need a wrapper? PETSc is callable from C++ just fine.

Thanks,

Matt

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bisheshkh at gmail.com Tue Aug 20 07:19:41 2013
From: bisheshkh at gmail.com (Bishesh Khanal)
Date: Tue, 20 Aug 2013 14:19:41 +0200
Subject: [petsc-users] better way of setting dirichlet boundary conditions
Message-ID:

Hi all,

In solving problems such as Laplacian/Poisson equations with Dirichlet boundary conditions using finite difference methods, I explicitly set the required values on the diagonal of the boundary rows of the system matrix and in the corresponding rhs vector. I.e. typically my matrix building loop would be like this (e.g. in 2D problems, using DMDA):

FOR (i=0 to xn-1, j = 0 to yn-1)
set row.i = i, row.j = j
IF (i = 0 or xn-1) or (j = 0 or yn-1)
set diagonal value of matrix A to 1 in current row.
ELSE
normal interior points: set the values accordingly
ENDIF
ENDFOR

Is there another preferred method instead of doing this? I saw functions such as MatZeroRows() when following the answer in the FAQ regarding this at:
http://www.mcs.anl.gov/petsc/documentation/faq.html#redistribute
but I did not understand what it is trying to say in the last sentence of the answer "An alternative approach is ... into the load".

Thanks,
Bishesh

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
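For readers following this thread, a minimal sketch of the MatZeroRows() route mentioned above, under the assumption that the boundary row indices and prescribed values have already been gathered; the routine name and its parameters (nb, brows, bvals) are invented for illustration:

#include <petscmat.h>

/* Sketch: impose Dirichlet conditions after assembly by zeroing the
   boundary rows, placing 1.0 on the diagonal, and letting PETSc fix up
   the right-hand side via the optional x and b arguments. */
PetscErrorCode ApplyDirichlet(Mat A, Vec b, Vec x, PetscInt nb,
                              const PetscInt brows[], const PetscScalar bvals[])
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* Put the prescribed values into x and b so that 1 * x_i = b_i holds. */
  ierr = VecSetValues(x, nb, brows, bvals, INSERT_VALUES);CHKERRQ(ierr);
  ierr = VecSetValues(b, nb, brows, bvals, INSERT_VALUES);CHKERRQ(ierr);
  ierr = VecAssemblyBegin(x);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(x);CHKERRQ(ierr);
  ierr = VecAssemblyBegin(b);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(b);CHKERRQ(ierr);
  ierr = MatZeroRows(A, nb, brows, 1.0, x, b);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}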
From olivier.bonnefon at avignon.inra.fr  Tue Aug 20 07:50:28 2013
From: olivier.bonnefon at avignon.inra.fr (Olivier Bonnefon)
Date: Tue, 20 Aug 2013 14:50:28 +0200
Subject: [petsc-users] import a mesh.
Message-ID: <52136614.50503@avignon.inra.fr>

Hello,

I have modified the ex12.c example for a diffusion-like problem
(FEM + unstructured mesh + SNES). The next step is to replace the square
mesh with my own mesh: I want to import my mesh from a simple file and
then distribute it with DMPlexDistribute. Is that the right way to do it?
Is it possible to import a mesh into PETSc? I didn't find the answer in
the documentation. Is there an example?

Thanks
Olivier Bonnefon

-- 
Olivier Bonnefon
INRA PACA-Avignon, Unité BioSP
Tel: +33 (0)4 32 72 21 58

From knepley at gmail.com  Tue Aug 20 08:06:00 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 20 Aug 2013 08:06:00 -0500
Subject: [petsc-users] better way of setting dirichlet boundary conditions

On Tue, Aug 20, 2013 at 7:19 AM, Bishesh Khanal wrote:

> Hi all,
> When solving problems such as Laplace/Poisson equations with Dirichlet
> boundary conditions using finite difference methods, I explicitly set the
> required values on the diagonal of the boundary rows of the system matrix
> and in the corresponding rhs vector.
> [...]
> Is there another preferred method instead of doing this?

Since those values are fixed, you do not really have to solve for them.
You can eliminate them from your system entirely. Imagine you take the
matrix you produce, plug the known values into X, act on them with the
part of the matrix that hits them, A_ID X, and move that to the RHS; then
eliminate the rows for the Dirichlet values.
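In block form, with I the interior unknowns and D the Dirichlet values,
that is just

    [ A_II  A_ID ] [ x_I ]   [ b_I ]
    [ A_DI  A_DD ] [ x_D ] = [ b_D ]

and since x_D is known, the interior rows reduce to

    A_II x_I = b_I - A_ID x_D

while the D rows drop out of the solve.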
   Matt

> Thanks,
> Bishesh

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Tue Aug 20 08:08:20 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 20 Aug 2013 08:08:20 -0500
Subject: [petsc-users] import a mesh.
In-Reply-To: <52136614.50503@avignon.inra.fr>
References: <52136614.50503@avignon.inra.fr>

On Tue, Aug 20, 2013 at 7:50 AM, Olivier Bonnefon <
olivier.bonnefon at avignon.inra.fr> wrote:

> Is it possible to import a mesh into PETSc? I didn't find the answer in
> the documentation. Is there an example?

There are a few formats supported:

  1) Cell-vertex: DMPlexCreateFromCellList()

  2) Exodus: DMPlexCreateExodus()

  3) CGNS: DMPlexCreateCGNS()

   Thanks,

      Matt

From olivier.bonnefon at avignon.inra.fr  Tue Aug 20 10:16:48 2013
From: olivier.bonnefon at avignon.inra.fr (Olivier Bonnefon)
Date: Tue, 20 Aug 2013 17:16:48 +0200
Subject: [petsc-users] import a mesh.
Message-ID: <52138860.8020900@avignon.inra.fr>

Hello,

Thank you, I'm using DMPlexCreateFromCellList(), and it works except for
the boundary conditions. I'm using Dirichlet boundary conditions, and the
corresponding degrees of freedom are not substituted (like they are when
I use DMPlexCreateBoxMesh). So, am I missing a step to define the
boundary?

Olivier B

From knepley at gmail.com  Tue Aug 20 10:45:58 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 20 Aug 2013 10:45:58 -0500
Subject: [petsc-users] import a mesh.
In-Reply-To: <52138860.8020900@avignon.inra.fr>
References: <52136614.50503@avignon.inra.fr> <52138860.8020900@avignon.inra.fr>

On Tue, Aug 20, 2013 at 10:16 AM, Olivier Bonnefon <
olivier.bonnefon at avignon.inra.fr> wrote:

> So, am I missing a step to define the boundary?

Yes. For the other formats, the reader translates boundary markers to
DMLabels. With the cell list you will have to label your boundary
yourself. If you want it to work exactly as in the example, for every
point p on your boundary,

  DMPlexSetLabelValue(dm, "marker", p, 1);

You can avoid the string lookup by using

  DMPlexCreateLabel(dm, "marker");
  DMPlexGetLabel(dm, "marker", &label);
  DMLabelSetValue(label, p, 1);

Obviously, if you have multiple BCs, you can use a label for each condition.
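Putting that together, a minimal sketch (untested; nb and bndPoints[],
the mesh point numbers of your boundary vertices, are assumed to come
from your mesh reader):

  DMLabel  label;
  PetscInt i;

  ierr = DMPlexCreateLabel(dm, "marker");CHKERRQ(ierr);
  ierr = DMPlexGetLabel(dm, "marker", &label);CHKERRQ(ierr);
  for (i = 0; i < nb; ++i) {
    /* value 1 is what the box-mesh example uses, so ex12 picks it up */
    ierr = DMLabelSetValue(label, bndPoints[i], 1);CHKERRQ(ierr);
  }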
   Thanks,

      Matt

> Olivier B
> [earlier messages in this thread trimmed]

From jefonseca at gmail.com  Tue Aug 20 11:51:00 2013
From: jefonseca at gmail.com (Jim Fonseca)
Date: Tue, 20 Aug 2013 11:51:00 -0500
Subject: [petsc-users] mixed precision
In-Reply-To: <52092BAA.5030205@mcs.anl.gov>
References: <52092BAA.5030205@mcs.anl.gov>

Okay, thank you for the guidance.
Jim

On Mon, Aug 12, 2013 at 1:38 PM, Karl Rupp wrote:

> Hi Jim,
>
> in addition to what Matt already said, keep in mind that you usually
> won't see a two-fold performance gain in iterative solvers anyway, as the
> various integers used for storing the nonzeros in the sparse matrix don't
> change their size. I once played with an implementation of a
> non-preconditioned mixed-precision CG solver, and I only obtained about a
> 40 percent overall performance gain for well-conditioned systems. For
> less well-conditioned systems you may not get any better overall
> performance at all (or worse, fail to converge).
>
> Best regards,
> Karli
>
> On 08/12/2013 12:32 PM, Matthew Knepley wrote:
>
>> On Mon, Aug 12, 2013 at 12:24 PM, Jim Fonseca wrote:
>>
>> Hi,
>> We are curious about the mixed-precision capabilities in NEMO5. I
>> see that there is a newish configure option to allow single
>> precision for the linear solve. Other than that, I found this old post:
>> https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2012-August/014842.html
>>
>> Is there any other information on how to take advantage of this
>> capability?
>>
>> Mixed precision is hard, and especially hard in PETSc because the C
>> type system is limited. However, it also needs to be embedded in an
>> algorithm that can take advantage of it. I would always start out with
>> a clear motivation:
>>
>>   - What would mixed precision accomplish in your code?
>>
>>   - What is the most possible benefit you would see?
>>
>> and decide if that is worth a large time investment.
>> Thanks,
>> Jim

-- 
Jim Fonseca, PhD
Research Scientist
Network for Computational Nanotechnology
Purdue University
765-496-6495
www.jimfonseca.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From gnw20 at cam.ac.uk  Tue Aug 20 12:50:00 2013
From: gnw20 at cam.ac.uk (Garth N. Wells)
Date: Tue, 20 Aug 2013 18:50:00 +0100
Subject: [petsc-users] Attach approximate null space to a fieldsplit block
In-Reply-To: 
References: <87ppt926sc.fsf@mcs.anl.gov>

On 19 August 2013 14:03, Matthew Knepley wrote:
>
> There is a better way. You can attach the null space to the IS that forms
> the split
>
>    PetscObjectAttach((PetscObject) is0, "nullspace", nullspace);
>
> and PCFIELDSPLIT will pull it out and attach it to the preconditioner for
> block 0.

I tried this and it didn't seem to work - the solver didn't converge at all.

>> With MatNest, you can use MatSetNearNullSpace on sub-blocks and it will
>> be used automatically. This is also the low-memory way to use
>> PCFieldSplit.

We're working on support for this.

>> Note that if you use MatSetValuesLocal, you can use the
>> NEST format with identical code in assembly; only matrix creation is
>> different, see src/snes/examples/tutorials/ex28.c for an example that
>> works with both AIJ and NEST formats.

Is there a reason why MatSetValues can't be used? We rely on PETSc
caching and then communicating off-process entries.

>> Otherwise, you should be able to PCSetUp, then get out the sub-solvers
>> with PCFieldSplitGetSubKSP, pull out their matrices, and call
>> MatSetNearNullSpace. Those submatrices should not be overwritten as
>> long as you use SAME_NONZERO_PATTERN, so the null space would still
>> work in future iterations.

I'm using the above and it's working great - thanks.

Garth

From jedbrown at mcs.anl.gov  Tue Aug 20 13:00:14 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 20 Aug 2013 13:00:14 -0500
Subject: [petsc-users] Attach approximate null space to a fieldsplit block
Message-ID: <87zjscnsy9.fsf@mcs.anl.gov>

"Garth N. Wells" writes:

> Is there a reason why MatSetValues can't be used? We rely on PETSc
> caching and then communicating off-process entries.

MatSetValuesLocal() just takes indices described in a local ordering
(defined by their mapping to global indices, see
MatSetLocalToGlobalMapping), but you can still set global indices.
A global-to-global map is expensive/non-scalable, so we can't efficiently
support it when assembling sub-blocks that correspond to general index
sets. The MatSetValuesLocal interface also makes a number of other domain
decomposition methods accessible.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 

From knepley at gmail.com  Tue Aug 20 13:14:32 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 20 Aug 2013 13:14:32 -0500
Subject: [petsc-users] Attach approximate null space to a fieldsplit block

On Tue, Aug 20, 2013 at 12:50 PM, Garth N. Wells wrote:

> I tried this and it didn't seem to work - the solver didn't converge at
> all.

Is this helpless week :) Run with -ksp_view so we can see what actually
happened. Here is the code that gets out the null space:

https://bitbucket.org/petsc/petsc/src/d45619dec29bfb59cf96225a84e0a74106da50ca/src/ksp/pc/impls/fieldsplit/fieldsplit.c?at=master#cl-531

You can break there and see if it gets it.

  Thanks,

     Matt

From gnw20 at cam.ac.uk  Tue Aug 20 13:41:04 2013
From: gnw20 at cam.ac.uk (Garth N. Wells)
Date: Tue, 20 Aug 2013 19:41:04 +0100
Subject: [petsc-users] Attach approximate null space to a fieldsplit block

On 20 August 2013 19:14, Matthew Knepley wrote:
> Is this helpless week :) Run with -ksp_view so we can see what actually
> happened. Here is the code that gets out the null space:
> [...]
> You can break there and see if it gets it.

The string should be 'nearnullspace' in place of 'nullspace' as above, i.e.

    PetscObjectCompose((PetscObject) is0, "nearnullspace",
                       (PetscObject) nullspace);

rather than

    PetscObjectCompose((PetscObject) is0, "nullspace",
                       (PetscObject) nullspace);

With the former it works as expected.

Garth

From knepley at gmail.com  Tue Aug 20 13:54:27 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 20 Aug 2013 13:54:27 -0500
Subject: [petsc-users] Attach approximate null space to a fieldsplit block

On Tue, Aug 20, 2013 at 1:41 PM, Garth N. Wells wrote:

> The string should be 'nearnullspace' in place of 'nullspace' as above.
> With the former it works as expected.
I did not realize you wanted the AMG starter. Glad it works.

   Matt

From jedbrown at mcs.anl.gov  Tue Aug 20 13:57:47 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 20 Aug 2013 13:57:47 -0500
Subject: [petsc-users] Attach approximate null space to a fieldsplit block
Message-ID: <87r4donqac.fsf@mcs.anl.gov>

Matthew Knepley writes:

> I did not realize you wanted the AMG starter. Glad it works.

"I'd like to attach an approximate null space to the A00 block of the
system."

and my instructions were for MatSetNearNullSpace, and you then
recommended "a better way".

Looks like just mistyping or miscommunication.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 

From knepley at gmail.com  Tue Aug 20 13:59:16 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 20 Aug 2013 13:59:16 -0500
Subject: [petsc-users] Attach approximate null space to a fieldsplit block
In-Reply-To: <87r4donqac.fsf@mcs.anl.gov>
References: <87ppt926sc.fsf@mcs.anl.gov> <87r4donqac.fsf@mcs.anl.gov>

On Tue, Aug 20, 2013 at 1:57 PM, Jed Brown wrote:

> Looks like just mistyping or miscommunication.

I only read "Is there a way to attach the null space to one block?"
because I am lazy.

    Matt

From s_g at berkeley.edu  Tue Aug 20 19:21:13 2013
From: s_g at berkeley.edu (Sanjay GOVINDJEE)
Date: Tue, 20 Aug 2013 17:21:13 -0700
Subject: [petsc-users] Using cmake to generate run lines

I have recently converted our project to use cmake to generate its
makefiles. I was wondering if anyone has an example of how to have cmake
add the "run lines" to the makefile that it generates, i.e. I want cmake
to add lines like

runex1:
	-@${MPIEXEC} -n 1 ./ex1 -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always

and other variants to the makefile that it produces. Note that I am
already using Jed Brown's FindPETSc.cmake.

-sanjay

From jedbrown at mcs.anl.gov  Tue Aug 20 19:37:11 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 20 Aug 2013 19:37:11 -0500
Subject: [petsc-users] Using cmake to generate run lines

You can do something similar with CTest. See those docs and perhaps the
examples in https://github.com/jedbrown/dohp.
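For instance, something along these lines in your CMakeLists.txt (untested
sketch; MPIEXEC and MPIEXEC_NUMPROC_FLAG come from FindMPI):

  enable_testing()
  add_test(NAME runex1
           COMMAND ${MPIEXEC} ${MPIEXEC_NUMPROC_FLAG} 1
                   $<TARGET_FILE:ex1> -ksp_monitor_short
                   -ksp_gmres_cgs_refinement_type refine_always)

and then run the tests with "ctest" (or "make test") instead of a
hand-written run line.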
On Aug 20, 2013 8:21 PM, "Sanjay GOVINDJEE" wrote:

> I have recently converted our project to use cmake to generate its
> makefiles. I was wondering if anyone has an example of how to have cmake
> add the "run lines" to the makefile that it generates.
> [...]

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From obonnefon at paca.inra.fr  Wed Aug 21 04:28:34 2013
From: obonnefon at paca.inra.fr (obonnefon at paca.inra.fr)
Date: Wed, 21 Aug 2013 11:28:34 +0200
Subject: [petsc-users] import a mesh.
References: <52136614.50503@avignon.inra.fr> <52138860.8020900@avignon.inra.fr>
Message-ID: <20130821112834.Horde.aPZMASrAflIpug5z8L0mpA1@webmail.paca.inra.fr>

Hello,

Ok, I did it:

    ierr = DMPlexCreateFromCellList(comm,dim,8,9,3,0,obcells,2,obvertex,dm);CHKERRQ(ierr);
    for (i = 0; i < ...; i++) {
      ierr = DMPlexSetLabelValue(*dm, "marker", obboundary[i], 1);CHKERRQ(ierr);
    }

but it is not efficient: the degrees of freedom are still present in the
linear system. In the source of DMPlexCreateSquareBoundary, the label is
set for the edges and for the points. Do I have to do something like this?

Thanks,
Olivier B

Quoting Matthew Knepley :

> Yes. For the other formats, the reader translates boundary markers to
> DMLabels. With the cell list you will have to label your boundary
> yourself. If you want it to work exactly as in the example, for every
> point p on your boundary,
>
>   DMPlexSetLabelValue(dm, "marker", p, 1);
> [...]

From bisheshkh at gmail.com  Wed Aug 21 04:59:18 2013
From: bisheshkh at gmail.com (Bishesh Khanal)
Date: Wed, 21 Aug 2013 11:59:18 +0200
Subject: [petsc-users] better way of setting dirichlet boundary conditions

On Tue, Aug 20, 2013 at 3:06 PM, Matthew Knepley wrote:

> Since those values are fixed, you do not really have to solve for them.
> You can eliminate them from your system entirely. Imagine you take the
> matrix you produce, plug the known values into X, act on them with the
> part of the matrix that hits them, A_ID X, and move that to the RHS;
> then eliminate the rows for the Dirichlet values.
Now I understand the concept, thanks! So how do I efficiently do this with
PETSc functions when I am using a DMDA which contains the boundary points
too? Conceptually the steps would be the following, I think, but which
PETSc functions would let me do this efficiently, for example without
explicitly creating the new matrix A1 below, and instead informing KSP
about it?

1) First create the big system matrix (from the DMDA) including the
   identity rows for the Dirichlet points, and the corresponding rhs;
   say Ax = b.
2) Initialize x with zeros, then set the desired Dirichlet values on the
   corresponding boundary points of x.
3) Create a new matrix A1 with zeros everywhere except the row,col
   positions corresponding to the Dirichlet points, where I put -1.
4) Get b1 by multiplying A1 with x.
5) Update the rhs with b = b + b1.
6) Now update A by removing its rows and columns that correspond to the
   Dirichlet points, and remove the corresponding rows of b and x.
7) Solve Ax = b.

From obonnefon at paca.inra.fr  Wed Aug 21 05:25:09 2013
From: obonnefon at paca.inra.fr (obonnefon at paca.inra.fr)
Date: Wed, 21 Aug 2013 12:25:09 +0200
Subject: [petsc-users] import a mesh.
In-Reply-To: <20130821112834.Horde.aPZMASrAflIpug5z8L0mpA1@webmail.paca.inra.fr>
Message-ID: <20130821122509.Horde.EtMzn0X5MJKMeqqtKjJOUg8@webmail.paca.inra.fr>

Hello,

I found my bug: the indices of the points are offset in the function
DMPlexCreateFromCellList. Using the correct indices in
DMPlexSetLabelValue leads to the expected behaviour.

Thank you.
Olivier B
>>> So, I do have missing a step to define the boundary ? >> >> ? >> Yes. For the other formats, it translates boundary markers to >> DMLabels. With the CellList you will have to label your boundary. >> If you want it to work exactly as in the example, for every point >> p on your boundary, >> ? >> ? DMPlexSetLabelValue(dm, "marker", p, 1); >> ? >> You can avoid the string lookup by using >> ? >> ? DMPlexCreateLabel(dm, "marker"); >> ? DMPlexGetLabel(dm, "marker", &label); >> ? DMLabelSetValue(label, p, 1); >> ? >> Obviously, if you have multiple BC, you can use a label for each >> condition. >> ? >> ? Thanks, >> ? >> ? ? ?Matt >> ? >> >>> Olivier B >>> >>> On 08/20/2013 03:08 PM, Matthew Knepley wrote: >>> >>>> On Tue, Aug 20, 2013 at 7:50 AM, Olivier Bonnefon >>>> wrote: >>>> >>>>> Hello, >>>>> >>>>> I have modify the ex12.c example for a like diffusive problem >>>>> (FEM+unstructured mesh+SNES). >>>>> The next step, is to replace the square mesh by my own mesh. >>>>> I want import my mesh from a simple file, and after distributed it >>>>> with DMPlexDistribute. >>>>> Is it the good way ? >>>>> Is it possible to import a mesh in petsc ? I didn't find the answer >>>>> in the documentation. Is there an example ? >>>> >>>> ? >>>> There are a few formats supported: >>>> ? >>>> ? 1) Cell-vertex: DMPlexCreateFromCellList() >>>> ? >>>> ? 2) Exodus: DMPlexCreateExodus() >>>> ? >>>> ? 3) CGNS: DMPlexCreateCGNS() >>>> ? >>>> ? Thanks, >>>> ? >>>> ? ? ?Matt >>>> ? >>>> >>>>> Thanks >>>>> Olivier Bonnefon >>>>> >>>>> -- >>>>> Olivier Bonnefon >>>>> INRA PACA-Avignon, Unit? BioSP >>>>> Tel: +33 (0)4 32 72 21 58[1] >>>>> ? >>>> >>>> ? >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which >>>> their experiments lead. >>>> -- Norbert Wiener >>> >>> -- Olivier Bonnefon INRA PACA-Avignon, Unit? BioSP Tel: +33 (0)4 32 >>> 72 21 58[1] >> >> ? >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which >> their experiments lead. >> -- Norbert Wiener > > > ? Links: ------ [1] tel:%2B33%20%280%294%2032%2072%2021%2058 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Aug 21 06:57:24 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 21 Aug 2013 06:57:24 -0500 Subject: [petsc-users] better way of setting dirichlet boundary conditions In-Reply-To: References: Message-ID: On Wed, Aug 21, 2013 at 4:59 AM, Bishesh Khanal wrote: > > > > On Tue, Aug 20, 2013 at 3:06 PM, Matthew Knepley wrote: > >> On Tue, Aug 20, 2013 at 7:19 AM, Bishesh Khanal wrote: >> >>> Hi all, >>> In solving problems such as laplacian/poisson equations with dirichlet >>> boundary conditions with finite difference methods, I set explicity the >>> required values to the diagonal of the boundary rows of the system matrix, >>> and the corresponding rhs vector. >>> i.e. typically my matrix building loop would be like: >>> >>> e.g. in 2d problems, using DMDA: >>> >>> FOR (i=0 to xn-1, j = 0 to yn-1) >>> set row.i = i, row. j = j >>> IF (i = 0 or xn-1) or (j = 0 or yn-1) >>> set diagonal value of matrix A to 1 in current row. >>> ELSE >>> normal interior points: set the values accordingly >>> ENDIF >>> ENDFOR >>> >>> Is there another preferred method instead of doing this ? 
I saw >>> functions such as MatZeroRows() >>> when following the answer in the FAQ regarding this at: >>> http://www.mcs.anl.gov/petsc/documentation/faq.html#redistribute >>> >>> but I did not understand what it is trying to say in the last sentence >>> of the answer "An alternative approach is ... into the load" >>> >> >> Since those values are fixed, you do not really have to solve for them. >> You can eliminate them from your >> system entirely. Imagine you take the matrix you produce, plug in the >> values to X, act with the part of the >> matrix that hits them A_ID X, and move that to the RHS, then eliminate >> the row for Dirichlet values. >> > > Now I understand the concept, thanks! So how do I efficiently do this with > petsc functions when I am using DMDA which contains the boundary points > too? Conceptually the steps would be the following, I think, but which > petsc functions would enable me to do this efficiently, for example, > without explicitly creating the new matrix A1 in the following and instead > informing KSP about it ? > 1) First create the big system matrix (from DM da) including the identity > rows for Dirichlet points and corresponding rhs, Lets say Ax = b. > 2) Initialize x with zero, then set the desired Dirichlet values on > corresponding boundary points of x. > 3) Create a new matrix, A1 with zeros everywhere except the row,col > positions corresponding to Dirchlet points where put -1. > 4) Get b1 by multiplying A1 with x. > 5) Update rhs with b = b + b1. > 6) Now update A by removing its rows and columns that correspond to the > Dirichlet points, and remove corresponding rows of b and x. > 7) Solve Ax=b > This is generally not a good thing to do with FD. Matt >> Matt >> >> Thanks, >>> Bishesh >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bisheshkh at gmail.com Wed Aug 21 07:12:56 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Wed, 21 Aug 2013 14:12:56 +0200 Subject: [petsc-users] better way of setting dirichlet boundary conditions In-Reply-To: References: Message-ID: On Wed, Aug 21, 2013 at 1:57 PM, Matthew Knepley wrote: > On Wed, Aug 21, 2013 at 4:59 AM, Bishesh Khanal wrote: > >> >> >> >> On Tue, Aug 20, 2013 at 3:06 PM, Matthew Knepley wrote: >> >>> On Tue, Aug 20, 2013 at 7:19 AM, Bishesh Khanal wrote: >>> >>>> Hi all, >>>> In solving problems such as laplacian/poisson equations with dirichlet >>>> boundary conditions with finite difference methods, I set explicity the >>>> required values to the diagonal of the boundary rows of the system matrix, >>>> and the corresponding rhs vector. >>>> i.e. typically my matrix building loop would be like: >>>> >>>> e.g. in 2d problems, using DMDA: >>>> >>>> FOR (i=0 to xn-1, j = 0 to yn-1) >>>> set row.i = i, row. j = j >>>> IF (i = 0 or xn-1) or (j = 0 or yn-1) >>>> set diagonal value of matrix A to 1 in current row. >>>> ELSE >>>> normal interior points: set the values accordingly >>>> ENDIF >>>> ENDFOR >>>> >>>> Is there another preferred method instead of doing this ? 
Do you mean that with FD it is better to solve the bigger matrix with the
identity rows for the Dirichlet points, instead of excluding them (i.e.
the way I illustrated in the pseudocode in my very first email)? And why
is it not a good thing? I thought that excluding the rows and columns of
the Dirichlet points would let us preserve the symmetry of the matrix for
symmetric problems.

From knepley at gmail.com  Wed Aug 21 07:32:25 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 21 Aug 2013 07:32:25 -0500
Subject: [petsc-users] import a mesh.
In-Reply-To: <20130821122509.Horde.EtMzn0X5MJKMeqqtKjJOUg8@webmail.paca.inra.fr>

On Wed, Aug 21, 2013 at 5:25 AM, <obonnefon at paca.inra.fr> wrote:

> I found my bug: the indices of the points are offset in the function
> DMPlexCreateFromCellList. Using the correct indices in
> DMPlexSetLabelValue leads to the expected behaviour.
Yes, here is how I organize point numbers by default:

  [0,        numCells):               cells
  [nC,       nC+numVertices):         vertices
  [nC+nV,    nC+nV+numFaces):         faces
  [nC+nV+nF, nC+nV+nF+numEdges):      edges

The reason I do this is to preserve the representation of cell-vertex
meshes inside of a mesh with faces and edges added.
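So with a cell list, labeling a boundary vertex means adding the cell
offset first, e.g. (sketch using the names from your code; the loop bound
is whatever the length of your boundary list is):

  for (i = 0; i < nBoundary; i++) {
    /* vertices are numbered after the cells, hence the +obNbCells */
    ierr = DMPlexSetLabelValue(*dm, "marker", obBoundary[i] + obNbCells, 1);CHKERRQ(ierr);
  }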
   Thanks,

      Matt

> Thank you.
> Olivier B
> [...]

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Wed Aug 21 07:37:48 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 21 Aug 2013 07:37:48 -0500
Subject: [petsc-users] better way of setting dirichlet boundary conditions

On Wed, Aug 21, 2013 at 7:12 AM, Bishesh Khanal wrote:

> Do you mean that with FD it is better to solve the bigger matrix with
> the identity rows for the Dirichlet points, instead of excluding them?
> And why is it not a good thing? I thought that excluding the rows and
> columns of the Dirichlet points would let us preserve the symmetry of
> the matrix for symmetric problems.

If you want symmetry, you can do MatZeroRowsColumns().
I said it is generally not a good idea because it is more complicated to
eliminate them in the FD case. You can definitely do what you propose.

With FEM, it makes more sense. You eliminate the constrained variables
from the system, but keep their values in the local vector. Then, when
you do an element integral, you get the correct answer including the
boundary conditions, and everything is natural.

   Matt

From obonnefon at paca.inra.fr  Wed Aug 21 09:19:25 2013
From: obonnefon at paca.inra.fr (obonnefon at paca.inra.fr)
Date: Wed, 21 Aug 2013 16:19:25 +0200
Subject: [petsc-users] import mesh.
Message-ID: <20130821161925.Horde.W6MV7avnyv79YFtG-QYsPw3@webmail.paca.inra.fr>

Hello,

I need your help again. In ex12.c, I have replaced the line

    ierr = DMPlexCreateBoxMesh(comm, dim, interpolate, dm);CHKERRQ(ierr);

by

    ierr = DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
    for (i = 0; i < ...; i++) {
      ierr = DMPlexSetLabelValue(*dm, "marker", obBoundary[i]+obNbCells, 1);CHKERRQ(ierr);
    }

The result works with one process (mpirun -np 1 ./ex12), but not with 2
(mpirun -np 2 ./ex12); it crashes during the partitioning (I'm using
chaco). Do you have any clue about this?

Thanks.
Olivier B

Following, the error message:

$ mpirun -np 2 ./ex12
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[1]PETSC ERROR: [1] PetscSFCreateEmbeddedSF line 863 /home/olivierb/SOFT/petsc-3.4.2/src/vec/is/sf/interface/sf.c
[1]PETSC ERROR: [1] PetscSFDistributeSection line 1755 /home/olivierb/SOFT/petsc-3.4.2/src/vec/is/utils/vsectionis.c
[1]PETSC ERROR: [1] DMPlexDistribute line 2771 /home/olivierb/SOFT/petsc-3.4.2/src/dm/impls/plex/plex.c
[1]PETSC ERROR: [1] CreateMesh line 371 "unknowndirectory/"/home/olivierb/TP_PETSC/ex12.c
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Signal received!
[1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
[1]PETSC ERROR: ./ex12 on a arch-linux2-c-debug named pcbiom38 by olivierb Wed Aug 21 16:02:08 2013
[1]PETSC ERROR: Configure options --with-debugging=1 --download-fiat --download-scientificpython --download-generator --download-triangle --download-ctetgen --download-chaco --prefix=/home/olivierb/BUILD/DEBUG/PETSC
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD with errorcode 59.
--------------------------------------------------------------------------

From jedbrown at mcs.anl.gov  Wed Aug 21 10:23:33 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Wed, 21 Aug 2013 11:23:33 -0400
Subject: [petsc-users] better way of setting dirichlet boundary conditions
Message-ID: <87ob8rkqyy.fsf@mcs.anl.gov>

Matthew Knepley writes:

> If you want symmetry, you can do MatZeroRowsColumns(). I said it is
> generally not a good idea because it is more complicated to eliminate
> them in the FD case.

I don't agree.

> You can definitely do what you propose. With FEM, it makes more sense.
> You eliminate the constrained variables from the system, but keep their
> values in the local vector. Then, when you do an element integral, you
> get the correct answer including the boundary conditions, and
> everything is natural.

Assuming you are working with a nonlinear problem in defect correction
form (e.g., Newton), this simple procedure works fine for FD and FE, to
evaluate the residual F(U) and the "Jacobian" or Picard matrix J(U):

  Scatter UGlobal to ULocal
  Write correct Dirichlet values into ULocal
  Evaluate local residual FLocal(ULocal) and scatter to global if applicable
  Set Dirichlet nodes of FGlobal to UGlobal - UDesired
  Assemble J at ULocal, ignoring Dirichlet rows and columns
  Insert 1 on diagonal of Dirichlet rows and columns
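For a 2D Laplacian on a DMDA, the residual half of that recipe might look
like the following untested sketch (uD() and rhs() are placeholders for
your boundary data and forcing; it assumes a unit square, and folds the
first two steps together by writing the Dirichlet residual as U - UDesired
directly):

  PetscErrorCode FormResidual(DM da, Vec Uglobal, Vec Ulocal, Vec Fglobal)
  {
    PetscScalar    **u, **F;
    PetscInt       i, j, xs, ys, xm, ym, mx, my;
    PetscReal      h;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = DMDAGetInfo(da, NULL, &mx, &my, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL);CHKERRQ(ierr);
    h    = 1.0/(mx-1);                      /* grid spacing on the unit square */
    ierr = DMGlobalToLocalBegin(da, Uglobal, INSERT_VALUES, Ulocal);CHKERRQ(ierr);
    ierr = DMGlobalToLocalEnd(da, Uglobal, INSERT_VALUES, Ulocal);CHKERRQ(ierr);
    ierr = DMDAVecGetArray(da, Ulocal, &u);CHKERRQ(ierr);
    ierr = DMDAVecGetArray(da, Fglobal, &F);CHKERRQ(ierr);
    ierr = DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);CHKERRQ(ierr);
    for (j = ys; j < ys+ym; j++) {
      for (i = xs; i < xs+xm; i++) {
        if (i == 0 || i == mx-1 || j == 0 || j == my-1) {
          F[j][i] = u[j][i] - uD(i, j);     /* Dirichlet node: U - UDesired */
        } else {                            /* standard 5-point stencil */
          F[j][i] = 4.0*u[j][i] - u[j][i-1] - u[j][i+1] - u[j-1][i] - u[j+1][i]
                  - h*h*rhs(i, j);
        }
      }
    }
    ierr = DMDAVecRestoreArray(da, Fglobal, &F);CHKERRQ(ierr);
    ierr = DMDAVecRestoreArray(da, Ulocal, &u);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

The Jacobian side then just puts 1 on the diagonal of the boundary rows
and columns, as in the last two steps above.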
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 

From potaman at outlook.com  Wed Aug 21 13:42:30 2013
From: potaman at outlook.com (subramanya sadasiva)
Date: Wed, 21 Aug 2013 14:42:30 -0400
Subject: [petsc-users] What does this error mean?

I have a phase-field code implemented using libmesh 0.9.2 and petsc-3.4.2
using the petscdm based nonlinear solver. This code works on OS X 10.8.4.
However, when I tried to build and run this code on a Linux machine, I get
the following error:

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data!
[0]PETSC ERROR: No mesh blocks found.!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.

Does anybody know what the likely cause of this error is?

Thanks
Subramanya
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jwpeterson at gmail.com  Wed Aug 21 13:48:22 2013
From: jwpeterson at gmail.com (John Peterson)
Date: Wed, 21 Aug 2013 12:48:22 -0600
Subject: [petsc-users] [Libmesh-users] What does this error mean?

On Wed, Aug 21, 2013 at 12:42 PM, subramanya sadasiva wrote:

> Does anybody know what the likely cause of this error is?

Looks like it comes from petscdmlibmesh.C line 182. I'd try running in dbg
mode if you haven't yet, and see if that comes up with anything useful.
This only seems like it would happen if you have no active elements on a
given processor. Are you running in parallel on a very small mesh? Other
than that, maybe Dmitry has an idea.

--
John

From knepley at gmail.com  Wed Aug 21 13:58:31 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 21 Aug 2013 13:58:31 -0500
Subject: [petsc-users] import mesh.
In-Reply-To: <20130821161925.Horde.W6MV7avnyv79YFtG-QYsPw3@webmail.paca.inra.fr>

On Wed, Aug 21, 2013 at 9:19 AM, <obonnefon at paca.inra.fr> wrote:

> The result works with one process (mpirun -np 1 ./ex12), but not with 2
> (mpirun -np 2 ./ex12); it crashes during the partitioning (I'm using
> chaco). Do you have any clue about this?
> [error log trimmed]

You probably want:

  if (!rank) {
    ierr = DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
    for (i = 0; i < ...; i++) {
      ierr = DMPlexSetLabelValue(*dm, "marker", obBoundary[i]+obNbCells, 1);CHKERRQ(ierr);
    }
  } else {
    ierr = DMCreate(comm, dm);CHKERRQ(ierr);
    ierr = DMSetType(*dm, DMPLEX);CHKERRQ(ierr);
    ierr = DMPlexSetDimension(*dm, dim);CHKERRQ(ierr);
  }

so that you have an empty mesh on proc 1. It should not crash, but I think
it's getting confused because you have a weirdly specified mesh on all
procs.

   Matt

From potaman at outlook.com  Wed Aug 21 14:01:22 2013
From: potaman at outlook.com (subramanya sadasiva)
Date: Wed, 21 Aug 2013 15:01:22 -0400
Subject: [petsc-users] [Libmesh-users] What does this error mean?

Hi John,
Running in dbg doesn't give me any additional information.

Thanks,
Subramanya

From bisheshkh at gmail.com  Wed Aug 21 17:05:59 2013
From: bisheshkh at gmail.com (Bishesh Khanal)
Date: Thu, 22 Aug 2013 00:05:59 +0200
Subject: [petsc-users] PETSC ERROR: Logging has not been enabled

Dear all,
My program runs fine when using just one processor, and valgrind shows no
errors either, but when using more than one processor I get the following
errors:

[0]PETSC ERROR: PetscOptionsInsertFile() line 461 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c
[0]PETSC ERROR: PetscOptionsInsert() line 623 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c
[0]PETSC ERROR: PetscInitialize() line 769 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/pinit.c
PETSC ERROR: Logging has not been enabled.
You might have forgotten to call PetscInitialize().
Matt > > The result works with one process (mpirun -np 1 ./ex12), but not with 2 > (mpirun -np 2 ./ex12), it crashes during the partitioning (I'm using chaco): > > Do you have any clue about this? > > Thanks. > > Olivier B > > Following, the error message: > > $ mpirun -np 2 ./ex12 > [1]PETSC ERROR: > ----------------------------------------------------------------------- > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [1]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS > X to find memory corruption errors > [1]PETSC ERROR: likely location of problem given in stack below > [1]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > [1]PETSC ERROR: INSTEAD the line number of the start of the function > [1]PETSC ERROR: is given. > [1]PETSC ERROR: [1] PetscSFCreateEmbeddedSF line 863 > /home/olivierb/SOFT/petsc-3.4.2/src/vec/is/sf/interface/sf.c > [1]PETSC ERROR: [1] PetscSFDistributeSection line 1755 > /home/olivierb/SOFT/petsc-3.4.2/src/vec/is/utils/vsectionis.c > [1]PETSC ERROR: [1] DMPlexDistribute line 2771 > /home/olivierb/SOFT/petsc-3.4.2/src/dm/impls/plex/plex.c > [1]PETSC ERROR: [1] CreateMesh line 371 > "unknowndirectory/"/home/olivierb/TP_PETSC/ex12.c > [1]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [1]PETSC ERROR: Signal received! > [1]PETSC ERROR: > ----------------------------------------------------------------------- > [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [1]PETSC ERROR: See docs/changes/index.html for recent updates. > [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [1]PETSC ERROR: See docs/index.html for manual pages. > [1]PETSC ERROR: > ------------------------------------------------------------------------ > [1]PETSC ERROR: ./ex12 on a arch-linux2-c-debug named pcbiom38 by olivierb > Wed Aug 21 16:02:08 2013 > [1]PETSC ERROR: Libraries linked from /home/olivierb/BUILD/DEBUG/PETSC/lib > [1]PETSC ERROR: Configure run at Tue Aug 20 11:32:52 > 2013-------------------------------------------------------------------------- > MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD with > errorcode 59. > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. > You may or may not see output from other processes, depending onexactly > when Open MPI kills > them.-------------------------------------------------------------------------- > [1]PETSC ERROR: Configure options --with-debugging=1 --download-fiat > --download-scientificpython --download-generator --download-triangle > --download-ctetgen --download-chaco > --prefix=/home/olivierb/BUILD/DEBUG/PETSC > [1]PETSC ERROR: > ------------------------------------------------------------------------ > [1]PETSC ERROR: User provided function() line 0 in unknown directory > unknown > file-------------------------------------------------------------------------- > mpirun has exited due to process rank 1 with PID 4546 onnode pcbiom38 > exiting without calling "finalize". 
This mayhave caused other processes in > the application to beterminated by signals sent by mpirun (as reported > here).-------------------------------------------------------------------------- > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From potaman at outlook.com Wed Aug 21 14:01:22 2013 From: potaman at outlook.com (subramanya sadasiva) Date: Wed, 21 Aug 2013 15:01:22 -0400 Subject: [petsc-users] [Libmesh-users] What does this error mean? In-Reply-To: References: , Message-ID: Hi John, running in dbg doesn't give me any additional information. Thanks, Subramanya From: jwpeterson at gmail.com Date: Wed, 21 Aug 2013 12:48:22 -0600 Subject: Re: [Libmesh-users] What does this error mean? To: potaman at outlook.com CC: libmesh-users at lists.sourceforge.net; petsc-users at mcs.anl.gov On Wed, Aug 21, 2013 at 12:42 PM, subramanya sadasiva wrote: I have a phase-field code implemented using libmesh 0.9.2 and petsc-3.4.2 using the petscdm based nonlinear solver. This code works on OS X 10.8.4. However, when I tried to build and run this code on a linux machine, I get the following error. [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Petsc has generated inconsistent data! [0]PETSC ERROR: No mesh blocks found.! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. Does anybody know what the likely cause of this error is? Looks like it comes from petscdmlibmesh.C line 182. I'd try running in dbg mode if you haven't yet, and see if that comes up with anything useful. This only seems like it would happen if you have no active elements on a given processor. Are you running in parallel on a very small mesh? Other than that, maybe Dmitry has an idea. -- John -------------- next part -------------- An HTML attachment was scrubbed... URL: From bisheshkh at gmail.com Wed Aug 21 17:05:59 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Thu, 22 Aug 2013 00:05:59 +0200 Subject: [petsc-users] PETSC ERROR: Logging has not been enabled Message-ID: Dear all, My program runs fine when using just one processor, valgrind shows no errors too, but when using more than one processor I get the following errors: [0]PETSC ERROR: PetscOptionsInsertFile() line 461 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c [0]PETSC ERROR: PetscOptionsInsert() line 623 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c [0]PETSC ERROR: PetscInitialize() line 769 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/pinit.c PETSC ERROR: Logging has not been enabled. You might have forgotten to call PetscInitialize(). 
application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   EXIT CODE: 56
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================

I have not forgotten to call PetscInitialize, if that helps!
Thanks,
Bishesh
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov Wed Aug 21 17:12:49 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Wed, 21 Aug 2013 17:12:49 -0500
Subject: [petsc-users] PETSC ERROR: Logging has not been enabled
In-Reply-To: References: Message-ID: <0A069F69-2A0A-45B2-A714-2960DB905AD7@mcs.anl.gov>

   Most likely the tool you are using to launch the parallel program is wrong for the MPI you have linked PETSc with. Are you starting the program with mpiexec? Is that mpiexec the one that goes with the MPI (mpicc or mpif90) that you built PETSc with?

   What happens if you compile a trivial MPI-only code with the mpicc and then try to run it in parallel with the mpiexec?

   Barry

On Aug 21, 2013, at 5:05 PM, Bishesh Khanal wrote:

> Dear all,
> My program runs fine when using just one processor, valgrind shows no errors too, but when using more than one processor I get the following errors:
>
> [0]PETSC ERROR: PetscOptionsInsertFile() line 461 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c
> [0]PETSC ERROR: PetscOptionsInsert() line 623 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c
> [0]PETSC ERROR: PetscInitialize() line 769 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/pinit.c
> PETSC ERROR: Logging has not been enabled.
> You might have forgotten to call PetscInitialize().
> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
>
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   EXIT CODE: 56
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> ===================================================================================
>
> I have not forgotten to call PetscInitialize, if that helps!
> Thanks,
> Bishesh

From zyzhang at nuaa.edu.cn Wed Aug 21 20:25:08 2013
From: zyzhang at nuaa.edu.cn (Zhang)
Date: Thu, 22 Aug 2013 09:25:08 +0800 (GMT+08:00)
Subject: [petsc-users] accessing global vector in a DMMA
Message-ID: <165291f.318a.140a3a005bc.Coremail.zyzhang@nuaa.edu.cn>

Dear All,

Now I am confused about the way to access a global vector defined from a DMDA.

Here is the code. When I switch on _DEBUG1_ the results are wrong, while if it's off the results are fine. I just wonder why I have to use the local form ul,vl,wl to access values such as ul[k][j][i+1], etc.

Thank you in advance for any suggestion.

Zhenyu

ierr = DMCreateGlobalVector(ctxu->grid, &ctxu->x );CHKERRQ(ierr);
ierr = VecDuplicate(ctxu->x,&ctxu->b);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(ctxv->grid, &ctxv->x );CHKERRQ(ierr);
ierr = VecDuplicate(ctxv->x,&ctxv->b);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(ctxw->grid, &ctxw->x );CHKERRQ(ierr);
ierr = VecDuplicate(ctxw->x,&ctxw->b);CHKERRQ(ierr);
...
VecCopy(ctxu->x,ctxu->b); VecCopy(ctxv->x,ctxv->b); VecCopy(ctxw->x,ctxw->b); DMDAVecGetArray( ctxu->grid, ctxu->b, &ustar ); DMDAVecGetArray( ctxv->grid, ctxv->b, &vstar ); DMDAVecGetArray( ctxw->grid, ctxw->b, &wstar ); #if defined(_DEBUG1_) DMDAVecGetArray( ctxu->grid, ctxu->x, &u ); DMDAVecGetArray( ctxv->grid, ctxv->x, &v ); DMDAVecGetArray( ctxw->grid, ctxw->x, &w ); #else DMGetLocalVector(ctxu->grid,&ctxu->local); DMGetLocalVector(ctxv->grid,&ctxv->local); DMGetLocalVector(ctxw->grid,&ctxw->local); DMGlobalToLocalBegin(ctxu->grid,ctxu->x,INSERT_VALUES,ctxu->local); DMGlobalToLocalEnd(ctxu->grid,ctxu->x,INSERT_VALUES,ctxu->local); DMGlobalToLocalBegin(ctxv->grid,ctxv->x,INSERT_VALUES,ctxv->local); DMGlobalToLocalEnd(ctxv->grid,ctxv->x,INSERT_VALUES,ctxv->local); DMGlobalToLocalBegin(ctxw->grid,ctxw->x,INSERT_VALUES,ctxw->local); DMGlobalToLocalEnd(ctxw->grid,ctxw->x,INSERT_VALUES,ctxw->local); DMDAVecGetArray( ctxu->grid, ctxu->local, &ul ); DMDAVecGetArray( ctxv->grid, ctxv->local, &vl ); DMDAVecGetArray( ctxw->grid, ctxw->local, &wl ); #endif //---------------------------------------------------------------- // U DMDAGetCorners( ctxu->grid, &is, &js, &ks, &in, &jn, &kn ); ie = is + in - 1; je = js + jn - 1; ke = ks + kn - 1; is=max(is,1); js=max(js,1); ks=max(ks,1); ie=min(ie,ctxu->l-2); je=min(je,ctxu->m-2); ke=min(ke,ctxu->n-2); for (k=ks; k<=ke; k++) { for (j=js; j<=je; j++) { for (i=is; i<=ie; i++) { #if defined(_DEBUG1_) ustar[k][j][i] += - dtdx*(0.25*((u[k][j][i]+u[k][j][i+1])*(u[k][j][i]+u[k][j][i+1])) - 0.25*((u[k][j][i]+u[k][j][i-1])*(u[k][j][i]+u[k][j][i-1]))) - dtdy*(0.25*(u [k][j][i]+u [k][j+1][i])*(v [k][j][i]+v [k][j][i+1]) - 0.25*(u [k][j][i]+u [k][j-1][i])*(v [k][j-1][i]+v [k][j-1][i+1])) - dtdz*(0.25*(u [k][j][i]+u [k+1][j][i])*(w [k][j][i]+w [k][j][i+1]) - 0.25*(u [k][j][i]+u [k-1][j][i])*(w [k-1][j][i]+w [k-1][j][i+1])) + dtdxx*(u [k][j][i-1]-2*u [k][j][i]+u [k][j][i+1]) + dtdyy*(u [k][j-1][i]-2*u [k][j][i]+u [k][j+1][i]) + dtdzz*(u [k-1][j][i]-2*u [k][j][i]+u [k+1][j][i]); #else ustar[k][j][i] += - dtdx*(0.25*((ul[k][j][i]+ul[k][j][i+1])*(ul[k][j][i]+ul[k][j][i+1])) - 0.25*((ul[k][j][i]+ul[k][j][i-1])*(ul[k][j][i]+ul[k][j][i-1]))) - dtdy*(0.25*(ul[k][j][i]+ul[k][j+1][i])*(vl[k][j][i]+vl[k][j][i+1]) - 0.25*(ul[k][j][i]+ul[k][j-1][i])*(vl[k][j-1][i]+vl[k][j-1][i+1])) - dtdz*(0.25*(ul[k][j][i]+ul[k+1][j][i])*(wl[k][j][i]+wl[k][j][i+1]) - 0.25*(ul[k][j][i]+ul[k-1][j][i])*(wl[k-1][j][i]+wl[k-1][j][i+1])) + dtdxx*(ul[k][j][i-1]-2*ul[k][j][i]+ul[k][j][i+1]) + dtdyy*(ul[k][j-1][i]-2*ul[k][j][i]+ul[k][j+1][i]) + dtdzz*(ul[k-1][j][i]-2*ul[k][j][i]+ul[k+1][j][i]); #endif } } } ..... #if defined(_DEBUG1_) DMDAVecRestoreArray( ctxu->grid, ctxu->x, &u ); DMDAVecRestoreArray( ctxv->grid, ctxv->x, &v ); DMDAVecRestoreArray( ctxw->grid, ctxw->x, &w ); #else DMDAVecRestoreArray( ctxu->grid, ctxu->local, &ul ); DMDAVecRestoreArray( ctxv->grid, ctxv->local, &vl ); DMDAVecRestoreArray( ctxw->grid, ctxw->local, &wl ); DMRestoreLocalVector(ctxu->grid,&ctxu->local); DMRestoreLocalVector(ctxv->grid,&ctxv->local); DMRestoreLocalVector(ctxw->grid,&ctxw->local); #endif DMDAVecRestoreArray( ctxu->grid, ctxu->b, &ustar ); DMDAVecRestoreArray( ctxv->grid, ctxv->b, &vstar ); DMDAVecRestoreArray( ctxw->grid, ctxw->b, &wstar ); -------------- next part -------------- An HTML attachment was scrubbed... 
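The pattern at issue, boiled down: a sketch only, assuming a single-dof 3D DMDA (da here) with existing global vectors x (state) and b (result), and loop bounds clipped away from the physical boundary as the is/ie clamping above does. Stencil reads such as xl[k][j][i+1] are only valid on the ghosted local vector:

  /* Sketch: reading neighbour values on a DMDA requires the ghosted
     local vector; the global vector has no room for ghost points. */
  Vec            xLocal;
  PetscScalar ***xl, ***ustar;
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm;

  ierr = DMGetLocalVector(da, &xLocal);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(da, x, INSERT_VALUES, xLocal);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, x, INSERT_VALUES, xLocal);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, xLocal, &xl);CHKERRQ(ierr);  /* read, with ghosts */
  ierr = DMDAVecGetArray(da, b, &ustar);CHKERRQ(ierr);    /* write, owned entries only */
  ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  /* interior points assumed; clamp the bounds at the domain edge in real code */
  for (k = zs; k < zs+zm; k++)
    for (j = ys; j < ys+ym; j++)
      for (i = xs; i < xs+xm; i++)
        ustar[k][j][i] = xl[k][j][i-1] - 2.0*xl[k][j][i] + xl[k][j][i+1];
  ierr = DMDAVecRestoreArray(da, b, &ustar);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(da, xLocal, &xl);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(da, &xLocal);CHKERRQ(ierr);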
URL: From bsmith at mcs.anl.gov Wed Aug 21 22:02:13 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 21 Aug 2013 22:02:13 -0500 Subject: [petsc-users] accessing global vector in a DMMA In-Reply-To: <165291f.318a.140a3a005bc.Coremail.zyzhang@nuaa.edu.cn> References: <165291f.318a.140a3a005bc.Coremail.zyzhang@nuaa.edu.cn> Message-ID: <4C88A48C-6423-4472-A527-F388C4398932@mcs.anl.gov> In PETSc the "local vector" or "local representation" refers to vectors WITH ghost points. The global vector or global representation refers to vectors WITHOUT ghost points. Hence to access locations like [i+1] which extend over to the next process you need to use the local ghosted representation. Barry One could argue that it is not the clearest names to use. Just remember local == ghosted and global == not ghosted. On Aug 21, 2013, at 8:25 PM, Zhang wrote: > Dear All, > Now I am confused with the way to access a global vector defined from DMDA. > > Here is the codes. When I switch on _DEBUG1_ the results get wrong. While if it's off, > > the results fine. I just wonder why I gave to use the local form of ul,vl,wl to access the > values such as ul[k][j][i+1], and etc. > > Thank you first for any suggestion. > > Zhenyu > > ierr = DMCreateGlobalVector(ctxu->grid, &ctxu->x );CHKERRQ(ierr); > ierr = VecDuplicate(ctxu->x,&ctxu->b);CHKERRQ(ierr); > ierr = DMCreateGlobalVector(ctxv->grid, &ctxv->x );CHKERRQ(ierr); > ierr = VecDuplicate(ctxv->x,&ctxv->b);CHKERRQ(ierr); > ierr = DMCreateGlobalVector(ctxw->grid, &ctxw->x );CHKERRQ(ierr); > ierr = VecDuplicate(ctxw->x,&ctxw->b);CHKERRQ(ierr); > ... > VecCopy(ctxu->x,ctxu->b); > VecCopy(ctxv->x,ctxv->b); > VecCopy(ctxw->x ,ctxw->b); > > DMDAVecGetArray( ctxu->grid, ctxu->b, &ustar ); > DMDAVecGetArray( ctxv->grid, ctxv->b, &vstar ); > DMDAVecGetArray( ctxw->grid, ctxw->b, &wstar ); > > #if defined(_DEBUG1_) > DMDAVecGetArray( ctxu->grid, ctxu->x, &u ); > DMDAVecGetArray( ctxv->grid, ctxv->x, &v ); > DMDAVecGetArray( ctxw->grid, ctxw->x, &w ); > #else > DMGetLocalVector(ctxu->grid,&ctxu->local); > DMGetLocalVector(ctxv->grid,&ctxv->local); > DMGetLocalVector(ctxw->grid,&ctxw->local); > DMGlobalToLocalBegin(ctxu->grid,ctxu->x,INSERT_VALUES,ctxu->local); > DMGlobalToLocalEnd(ctxu->grid,ctxu->x,INSERT_VALUES,ctxu->local); > DMGlobalToLocalBegin(ctxv->grid,ctxv->x,INSERT_VALUES,ctxv->local); > DMGlobalToLocalEnd(ctxv->grid,ctxv->x,INSERT_VALUES,ctxv->local); > & nbsp; DMGlobalToLocalBegin(ctxw->grid,ctxw->x,INSERT_VALUES,ctxw->local); > DMGlobalToLocalEnd(ctxw->grid,ctxw->x,INSERT_VALUES,ctxw->local); > DMDAVecGetArray( ctxu->grid, ctxu->local, &ul ); > DMDAVecGetArray( ctxv->grid, ctxv->local, &vl ); > DMDAVecGetArray( ctxw->grid, ctxw->local, &wl ); > #endif > > //---------------------------------------------------------------- > // U > DMDAGetCorners( ctxu->grid, &is, &js, &ks, &in, &jn, &kn ); > ie = is + in - 1; > je = js + jn - 1; > ke = ks + kn - 1; > > is=max(is,1); > js=max(js,1); > ks=max(ks,1); > ie=min(ie,ctxu->l-2); > je=min(je,ctxu->m-2); > ke=min(ke,ctxu->n-2); > > > for (k=ks; k<=ke; k++) { > for (j=js; j<=je; j++) { > for (i=is; i<=ie; i+ +) { > > #if defined(_DEBUG1_) > ustar[k][j][i] += > - dtdx*(0.25*((u[k][j][i]+u[k][j][i+1])*(u[k][j][i]+u[k][j][i+1])) > - 0.25*((u[k][j][i]+u[k][j][i-1])*(u[k][j][i]+u[k][j][i-1]))) > - dtdy*(0.25*(u [k][j][i]+u [k][j+1][i])*(v [k][j][i]+v [k][j][i+1]) > - 0.25*(u [k][j][i]+u [k][j-1][i])*(v [k][j-1][i]+v [k][j-1][i+1])) > - dtdz*(0.25*(u [k][j][i]+u [k+1][j][i])*(w [k][j][i]+w [k][j][i+1]) > - 0.25*(u [k][j][i]+u 
[k-1][j][i])*(w [k-1][j][i]+w [k-1][j][i+1])) > + dtdxx*(u [k][j][i-1]-2*u [k][j][i]+u [k][j] [i+1]) > + dtdyy*(u [k][j-1][i]-2*u [k][j][i]+u [k][j+1][i]) > + dtdzz*(u [k-1][j][i]-2*u [k][j][i]+u [k+1][j][i]); > #else > ustar[k][j][i] += > - dtdx*(0.25*((ul[k][j][i]+ul[k][j][i+1])*(ul[k][j][i]+ul[k][j][i+1])) > &nbs p; - 0.25*((ul[k][j][i]+ul[k][j][i-1])*(ul[k][j][i]+ul[k][j][i-1]))) > - dtdy*(0.25*(ul[k][j][i]+ul[k][j+1][i])*(vl[k][j][i]+vl[k][j][i+1]) > - 0.25*(ul[k][j][i]+ul[k][j-1][i])*(vl[k][j-1][i]+vl[k][j-1][i+1])) > - dtdz*(0.25*(ul[k][j][i]+ul[k+1][j][i])*(wl[k][j][i]+wl[k][j][i+1]) > &nb sp; - 0.25*(ul[k][j][i]+ul[k-1][j][i])*(wl[k-1][j][i]+wl[k-1][j][i+1])) > + dtdxx*(ul[k][j][i-1]-2*ul[k][j][i]+ul[k][j][i+1]) > + dtdyy*(ul[k][j-1][i]-2*ul[k][j][i]+ul[k][j+1][i]) > + dtdzz*(ul[k-1][j][i]-2*ul[k][j][i]+ul[k+1][j][i]); > #endif > } > } > } > > ..... > > #if defined(_DEBUG1_) > DMDAVecRestoreArray( ctxu->grid, ctxu->x, &u ); > DMDAVecRestoreArray( ctxv->grid, ctxv->x, &v ); > DMDAVecRestoreArray( ctxw->grid, ctxw->x, &w ); > #else > DMDAVecRestoreArray( ctxu->grid, ctxu->local, &ul ); > DMDAVecRestoreArray( ctxv->grid, ctxv->local, &vl ); > DMDAVecRestoreArray( ctxw->grid, ctxw->local, &wl ); > DMRestoreLocalVector(ctxu->grid,&ctxu->local); > DMRestoreLocalVector(ctxv->grid,&ctxv->local); > DMRestoreLocalVector(ctxw->grid,&ctxw->local); > #endif > > DMDAVecRestoreArray( ctxu->grid, ctxu->b, &ustar ); > DMDAVecRestoreArray( ctxv->grid, ctxv->b, &vstar ); > DMDAVecRestoreArray( ctxw->grid, ctxw->b, &wstar ); > > From zyzhang at nuaa.edu.cn Wed Aug 21 22:04:12 2013 From: zyzhang at nuaa.edu.cn (Zhang) Date: Thu, 22 Aug 2013 11:04:12 +0800 (GMT+08:00) Subject: [petsc-users] accessing global vector in a DMMA In-Reply-To: <4C88A48C-6423-4472-A527-F388C4398932@mcs.anl.gov> References: <165291f.318a.140a3a005bc.Coremail.zyzhang@nuaa.edu.cn> <4C88A48C-6423-4472-A527-F388C4398932@mcs.anl.gov> Message-ID: <1d579dc.3257.140a3faba04.Coremail.zyzhang@nuaa.edu.cn> Thank your help, Barry, If I assign the sum of local values (, such as ul[i+1]-2*ul[i]+ul[i-1],) to a global value ustar[i], 1. should I use the local value for ustar as well ? 2. Once I finished the work of assigning values for ustar and restored them, should I use VecAssemblyBegin/End to them later? Because I find such a usage in the examples. I am not sure it necessary or not. 3. AS a test if I choose the incorrect way, say, ustar[i] comes straightly from u[i+/-1]..., why I did not get any error, such as memory access out of range? Or even this is dangerous, such errors will not be shown in a serial run? Cheers, Zhenyu > -----????----- > ???: "Barry Smith" > ????: 2013-08-22 11:02:13 (???) > ???: Zhang > ??: petsc-users at mcs.anl.gov > ??: Re: [petsc-users] accessing global vector in a DMMA > > > In PETSc the "local vector" or "local representation" refers to vectors WITH ghost points. The global vector or global representation refers to vectors WITHOUT ghost points. Hence to access locations like [i+1] which extend over to the next process you need to use the local ghosted representation. > > Barry > > One could argue that it is not the clearest names to use. Just remember local == ghosted and global == not ghosted. > > > On Aug 21, 2013, at 8:25 PM, Zhang wrote: > > > Dear All, > > Now I am confused with the way to access a global vector defined from DMDA. > > > > Here is the codes. When I switch on _DEBUG1_ the results get wrong. While if it's off, > > > > the results fine. 
I just wonder why I gave to use the local form of ul,vl,wl to access the > > values such as ul[k][j][i+1], and etc. > > > > Thank you first for any suggestion. > > > > Zhenyu > > > > ierr = DMCreateGlobalVector(ctxu->grid, &ctxu->x );CHKERRQ(ierr); > > ierr = VecDuplicate(ctxu->x,&ctxu->b);CHKERRQ(ierr); > > ierr = DMCreateGlobalVector(ctxv->grid, &ctxv->x );CHKERRQ(ierr); > > ierr = VecDuplicate(ctxv->x,&ctxv->b);CHKERRQ(ierr); > > ierr = DMCreateGlobalVector(ctxw->grid, &ctxw->x );CHKERRQ(ierr); > > ierr = VecDuplicate(ctxw->x,&ctxw->b);CHKERRQ(ierr); > > ... > > VecCopy(ctxu->x,ctxu->b); > > VecCopy(ctxv->x,ctxv->b); > > VecCopy(ctxw->x ,ctxw->b); > > > > DMDAVecGetArray( ctxu->grid, ctxu->b, &ustar ); > > DMDAVecGetArray( ctxv->grid, ctxv->b, &vstar ); > > DMDAVecGetArray( ctxw->grid, ctxw->b, &wstar ); > > > > #if defined(_DEBUG1_) > > DMDAVecGetArray( ctxu->grid, ctxu->x, &u ); > > DMDAVecGetArray( ctxv->grid, ctxv->x, &v ); > > DMDAVecGetArray( ctxw->grid, ctxw->x, &w ); > > #else > > DMGetLocalVector(ctxu->grid,&ctxu->local); > > DMGetLocalVector(ctxv->grid,&ctxv->local); > > DMGetLocalVector(ctxw->grid,&ctxw->local); > > DMGlobalToLocalBegin(ctxu->grid,ctxu->x,INSERT_VALUES,ctxu->local); > > DMGlobalToLocalEnd(ctxu->grid,ctxu->x,INSERT_VALUES,ctxu->local); > > DMGlobalToLocalBegin(ctxv->grid,ctxv->x,INSERT_VALUES,ctxv->local); > > DMGlobalToLocalEnd(ctxv->grid,ctxv->x,INSERT_VALUES,ctxv->local); > > & nbsp; DMGlobalToLocalBegin(ctxw->grid,ctxw->x,INSERT_VALUES,ctxw->local); > > DMGlobalToLocalEnd(ctxw->grid,ctxw->x,INSERT_VALUES,ctxw->local); > > DMDAVecGetArray( ctxu->grid, ctxu->local, &ul ); > > DMDAVecGetArray( ctxv->grid, ctxv->local, &vl ); > > DMDAVecGetArray( ctxw->grid, ctxw->local, &wl ); > > #endif > > > > //---------------------------------------------------------------- > > // U > > DMDAGetCorners( ctxu->grid, &is, &js, &ks, &in, &jn, &kn ); > > ie = is + in - 1; > > je = js + jn - 1; > > ke = ks + kn - 1; > > > > is=max(is,1); > > js=max(js,1); > > ks=max(ks,1); > > ie=min(ie,ctxu->l-2); > > je=min(je,ctxu->m-2); > > ke=min(ke,ctxu->n-2); > > > > > > for (k=ks; k<=ke; k++) { > > for (j=js; j<=je; j++) { > > for (i=is; i<=ie; i+ +) { > > > > #if defined(_DEBUG1_) > > ustar[k][j][i] += > > - dtdx*(0.25*((u[k][j][i]+u[k][j][i+1])*(u[k][j][i]+u[k][j][i+1])) > > - 0.25*((u[k][j][i]+u[k][j][i-1])*(u[k][j][i]+u[k][j][i-1]))) > > - dtdy*(0.25*(u [k][j][i]+u [k][j+1][i])*(v [k][j][i]+v [k][j][i+1]) > > - 0.25*(u [k][j][i]+u [k][j-1][i])*(v [k][j-1][i]+v [k][j-1][i+1])) > > - dtdz*(0.25*(u [k][j][i]+u [k+1][j][i])*(w [k][j][i]+w [k][j][i+1]) > > - 0.25*(u [k][j][i]+u [k-1][j][i])*(w [k-1][j][i]+w [k-1][j][i+1])) > > + dtdxx*(u [k][j][i-1]-2*u [k][j][i]+u [k][j] [i+1]) > > + dtdyy*(u [k][j-1][i]-2*u [k][j][i]+u [k][j+1][i]) > > + dtdzz*(u [k-1][j][i]-2*u [k][j][i]+u [k+1][j][i]); > > #else > > ustar[k][j][i] += > > - dtdx*(0.25*((ul[k][j][i]+ul[k][j][i+1])*(ul[k][j][i]+ul[k][j][i+1])) > > &nbs p; - 0.25*((ul[k][j][i]+ul[k][j][i-1])*(ul[k][j][i]+ul[k][j][i-1]))) > > - dtdy*(0.25*(ul[k][j][i]+ul[k][j+1][i])*(vl[k][j][i]+vl[k][j][i+1]) > > - 0.25*(ul[k][j][i]+ul[k][j-1][i])*(vl[k][j-1][i]+vl[k][j-1][i+1])) > > - dtdz*(0.25*(ul[k][j][i]+ul[k+1][j][i])*(wl[k][j][i]+wl[k][j][i+1]) > > &nb sp; - 0.25*(ul[k][j][i]+ul[k-1][j][i])*(wl[k-1][j][i]+wl[k-1][j][i+1])) > > + dtdxx*(ul[k][j][i-1]-2*ul[k][j][i]+ul[k][j][i+1]) > > + dtdyy*(ul[k][j-1][i]-2*ul[k][j][i]+ul[k][j+1][i]) > > + dtdzz*(ul[k-1][j][i]-2*ul[k][j][i]+ul[k+1][j][i]); > > #endif > > } > > } > > } > > > > ..... 
> > > > #if defined(_DEBUG1_) > > DMDAVecRestoreArray( ctxu->grid, ctxu->x, &u ); > > DMDAVecRestoreArray( ctxv->grid, ctxv->x, &v ); > > DMDAVecRestoreArray( ctxw->grid, ctxw->x, &w ); > > #else > > DMDAVecRestoreArray( ctxu->grid, ctxu->local, &ul ); > > DMDAVecRestoreArray( ctxv->grid, ctxv->local, &vl ); > > DMDAVecRestoreArray( ctxw->grid, ctxw->local, &wl ); > > DMRestoreLocalVector(ctxu->grid,&ctxu->local); > > DMRestoreLocalVector(ctxv->grid,&ctxv->local); > > DMRestoreLocalVector(ctxw->grid,&ctxw->local); > > #endif > > > > DMDAVecRestoreArray( ctxu->grid, ctxu->b, &ustar ); > > DMDAVecRestoreArray( ctxv->grid, ctxv->b, &vstar ); > > DMDAVecRestoreArray( ctxw->grid, ctxw->b, &wstar ); > > > > > From bsmith at mcs.anl.gov Wed Aug 21 22:43:47 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 21 Aug 2013 22:43:47 -0500 Subject: [petsc-users] accessing global vector in a DMMA In-Reply-To: <1d579dc.3257.140a3faba04.Coremail.zyzhang@nuaa.edu.cn> References: <165291f.318a.140a3a005bc.Coremail.zyzhang@nuaa.edu.cn> <4C88A48C-6423-4472-A527-F388C4398932@mcs.anl.gov> <1d579dc.3257.140a3faba04.Coremail.zyzhang@nuaa.edu.cn> Message-ID: On Aug 21, 2013, at 10:04 PM, Zhang wrote: > Thank your help, Barry, > > If I assign the sum of local values (, such as ul[i+1]-2*ul[i]+ul[i-1],) to a global value ustar[i], > 1. should I use the local value for ustar as well ? Generally with finite differences there is no reason to use the local ghosted vector for the result since each process calculates its own values and doesn't calculate any values that to into the ghosted points. > 2. Once I finished the work of assigning values for ustar and restored them, should I use VecAssemblyBegin/End to them later? No, if you use any of the vecgetarray functions you do not use VecAssemblyBegin/End() (though it is harmless to call the VecAssemblyBegin/end because they end up doing nothing. > Because I find such a usage in the examples. I am not sure it necessary or not. > 3. AS a test if I choose the incorrect way, say, ustar[i] comes straightly from u[i+/-1]..., why I did not get any error, such as > memory access out of range? Or even this is dangerous, such errors will not be shown in a serial run? No errors in serial runs because there are no ghost points between processes. In parallel you will get wrong answers but may or may not get crashes due to memory access out of range. We highly recommend making some parallel runs with valgrind because it finds almost all memory access bugs even ones that don't crash the code. http://www.mcs.anl.gov/petsc/petsc-as/documentation/faq.html#valgrind Barry > > Cheers, > > Zhenyu > > >> -----????----- >> ???: "Barry Smith" >> ????: 2013-08-22 11:02:13 (???) >> ???: Zhang >> ??: petsc-users at mcs.anl.gov >> ??: Re: [petsc-users] accessing global vector in a DMMA >> >> >> In PETSc the "local vector" or "local representation" refers to vectors WITH ghost points. The global vector or global representation refers to vectors WITHOUT ghost points. Hence to access locations like [i+1] which extend over to the next process you need to use the local ghosted representation. >> >> Barry >> >> One could argue that it is not the clearest names to use. Just remember local == ghosted and global == not ghosted. >> >> >> On Aug 21, 2013, at 8:25 PM, Zhang wrote: >> >>> Dear All, >>> Now I am confused with the way to access a global vector defined from DMDA. >>> >>> Here is the codes. When I switch on _DEBUG1_ the results get wrong. While if it's off, >>> >>> the results fine. 
I just wonder why I gave to use the local form of ul,vl,wl to access the >>> values such as ul[k][j][i+1], and etc. >>> >>> Thank you first for any suggestion. >>> >>> Zhenyu >>> >>> ierr = DMCreateGlobalVector(ctxu->grid, &ctxu->x );CHKERRQ(ierr); >>> ierr = VecDuplicate(ctxu->x,&ctxu->b);CHKERRQ(ierr); >>> ierr = DMCreateGlobalVector(ctxv->grid, &ctxv->x );CHKERRQ(ierr); >>> ierr = VecDuplicate(ctxv->x,&ctxv->b);CHKERRQ(ierr); >>> ierr = DMCreateGlobalVector(ctxw->grid, &ctxw->x );CHKERRQ(ierr); >>> ierr = VecDuplicate(ctxw->x,&ctxw->b);CHKERRQ(ierr); >>> ... >>> VecCopy(ctxu->x,ctxu->b); >>> VecCopy(ctxv->x,ctxv->b); >>> VecCopy(ctxw->x ,ctxw->b); >>> >>> DMDAVecGetArray( ctxu->grid, ctxu->b, &ustar ); >>> DMDAVecGetArray( ctxv->grid, ctxv->b, &vstar ); >>> DMDAVecGetArray( ctxw->grid, ctxw->b, &wstar ); >>> >>> #if defined(_DEBUG1_) >>> DMDAVecGetArray( ctxu->grid, ctxu->x, &u ); >>> DMDAVecGetArray( ctxv->grid, ctxv->x, &v ); >>> DMDAVecGetArray( ctxw->grid, ctxw->x, &w ); >>> #else >>> DMGetLocalVector(ctxu->grid,&ctxu->local); >>> DMGetLocalVector(ctxv->grid,&ctxv->local); >>> DMGetLocalVector(ctxw->grid,&ctxw->local); >>> DMGlobalToLocalBegin(ctxu->grid,ctxu->x,INSERT_VALUES,ctxu->local); >>> DMGlobalToLocalEnd(ctxu->grid,ctxu->x,INSERT_VALUES,ctxu->local); >>> DMGlobalToLocalBegin(ctxv->grid,ctxv->x,INSERT_VALUES,ctxv->local); >>> DMGlobalToLocalEnd(ctxv->grid,ctxv->x,INSERT_VALUES,ctxv->local); >>> & nbsp; DMGlobalToLocalBegin(ctxw->grid,ctxw->x,INSERT_VALUES,ctxw->local); >>> DMGlobalToLocalEnd(ctxw->grid,ctxw->x,INSERT_VALUES,ctxw->local); >>> DMDAVecGetArray( ctxu->grid, ctxu->local, &ul ); >>> DMDAVecGetArray( ctxv->grid, ctxv->local, &vl ); >>> DMDAVecGetArray( ctxw->grid, ctxw->local, &wl ); >>> #endif >>> >>> //---------------------------------------------------------------- >>> // U >>> DMDAGetCorners( ctxu->grid, &is, &js, &ks, &in, &jn, &kn ); >>> ie = is + in - 1; >>> je = js + jn - 1; >>> ke = ks + kn - 1; >>> >>> is=max(is,1); >>> js=max(js,1); >>> ks=max(ks,1); >>> ie=min(ie,ctxu->l-2); >>> je=min(je,ctxu->m-2); >>> ke=min(ke,ctxu->n-2); >>> >>> >>> for (k=ks; k<=ke; k++) { >>> for (j=js; j<=je; j++) { >>> for (i=is; i<=ie; i+ +) { >>> >>> #if defined(_DEBUG1_) >>> ustar[k][j][i] += >>> - dtdx*(0.25*((u[k][j][i]+u[k][j][i+1])*(u[k][j][i]+u[k][j][i+1])) >>> - 0.25*((u[k][j][i]+u[k][j][i-1])*(u[k][j][i]+u[k][j][i-1]))) >>> - dtdy*(0.25*(u [k][j][i]+u [k][j+1][i])*(v [k][j][i]+v [k][j][i+1]) >>> - 0.25*(u [k][j][i]+u [k][j-1][i])*(v [k][j-1][i]+v [k][j-1][i+1])) >>> - dtdz*(0.25*(u [k][j][i]+u [k+1][j][i])*(w [k][j][i]+w [k][j][i+1]) >>> - 0.25*(u [k][j][i]+u [k-1][j][i])*(w [k-1][j][i]+w [k-1][j][i+1])) >>> + dtdxx*(u [k][j][i-1]-2*u [k][j][i]+u [k][j] [i+1]) >>> + dtdyy*(u [k][j-1][i]-2*u [k][j][i]+u [k][j+1][i]) >>> + dtdzz*(u [k-1][j][i]-2*u [k][j][i]+u [k+1][j][i]); >>> #else >>> ustar[k][j][i] += >>> - dtdx*(0.25*((ul[k][j][i]+ul[k][j][i+1])*(ul[k][j][i]+ul[k][j][i+1])) >>> &nbs p; - 0.25*((ul[k][j][i]+ul[k][j][i-1])*(ul[k][j][i]+ul[k][j][i-1]))) >>> - dtdy*(0.25*(ul[k][j][i]+ul[k][j+1][i])*(vl[k][j][i]+vl[k][j][i+1]) >>> - 0.25*(ul[k][j][i]+ul[k][j-1][i])*(vl[k][j-1][i]+vl[k][j-1][i+1])) >>> - dtdz*(0.25*(ul[k][j][i]+ul[k+1][j][i])*(wl[k][j][i]+wl[k][j][i+1]) >>> &nb sp; - 0.25*(ul[k][j][i]+ul[k-1][j][i])*(wl[k-1][j][i]+wl[k-1][j][i+1])) >>> + dtdxx*(ul[k][j][i-1]-2*ul[k][j][i]+ul[k][j][i+1]) >>> + dtdyy*(ul[k][j-1][i]-2*ul[k][j][i]+ul[k][j+1][i]) >>> + dtdzz*(ul[k-1][j][i]-2*ul[k][j][i]+ul[k+1][j][i]); >>> #endif >>> } >>> } >>> } >>> >>> ..... 
>>> >>> #if defined(_DEBUG1_) >>> DMDAVecRestoreArray( ctxu->grid, ctxu->x, &u ); >>> DMDAVecRestoreArray( ctxv->grid, ctxv->x, &v ); >>> DMDAVecRestoreArray( ctxw->grid, ctxw->x, &w ); >>> #else >>> DMDAVecRestoreArray( ctxu->grid, ctxu->local, &ul ); >>> DMDAVecRestoreArray( ctxv->grid, ctxv->local, &vl ); >>> DMDAVecRestoreArray( ctxw->grid, ctxw->local, &wl ); >>> DMRestoreLocalVector(ctxu->grid,&ctxu->local); >>> DMRestoreLocalVector(ctxv->grid,&ctxv->local); >>> DMRestoreLocalVector(ctxw->grid,&ctxw->local); >>> #endif >>> >>> DMDAVecRestoreArray( ctxu->grid, ctxu->b, &ustar ); >>> DMDAVecRestoreArray( ctxv->grid, ctxv->b, &vstar ); >>> DMDAVecRestoreArray( ctxw->grid, ctxw->b, &wstar ); >>> >>> >> > From jedbrown at mcs.anl.gov Wed Aug 21 23:07:41 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 22 Aug 2013 00:07:41 -0400 Subject: [petsc-users] accessing global vector in a DMMA In-Reply-To: <165291f.318a.140a3a005bc.Coremail.zyzhang@nuaa.edu.cn> References: <165291f.318a.140a3a005bc.Coremail.zyzhang@nuaa.edu.cn> Message-ID: <87a9kal65u.fsf@mcs.anl.gov> Zhang writes: > Dear All, > Now I am confused with the way to access a global vector defined from DMDA. > > Here is the codes. When I switch on _DEBUG1_ the results get wrong. While if it's off, > > the results fine. I just wonder why I gave to use the local form of ul,vl,wl to access the > values such as ul[k][j][i+1], and etc. The global vector doesn't contain space for those ghost points, so you have to scatter to the local vector if you want to access ghost points. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From bisheshkh at gmail.com Thu Aug 22 01:39:08 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Thu, 22 Aug 2013 08:39:08 +0200 Subject: [petsc-users] PETSC ERROR: Logging has not been enabled In-Reply-To: <0A069F69-2A0A-45B2-A714-2960DB905AD7@mcs.anl.gov> References: <0A069F69-2A0A-45B2-A714-2960DB905AD7@mcs.anl.gov> Message-ID: On Thu, Aug 22, 2013 at 12:12 AM, Barry Smith wrote: > > Most likely the tool you are using to launch the parallel program is > wrong for the MPI you have linked PETSc with. Are you starting the program > with mpiexec ? Is that mpiexec the one that goes with the MPI (mpicc or > mpif90) that you built PETSc with? > Thanks Barry, I'm using the bin/petscmpiexec of the petsc install. But this is super strange, after having the errors yesterday I gave up and slept. Now, I just turned on the computer and rebuilt the project and run the same thing again and it works without any errors. This is very very strange! > > What happens if you compile a trivial MPI only code with the mpicc and > then try to run it in parallel with the mpiexec? > > > Barry > > On Aug 21, 2013, at 5:05 PM, Bishesh Khanal wrote: > > > Dear all, > > My program runs fine when using just one processor, valgrind shows no > errors too, but when using more than one processor I get the following > errors: > > > > [0]PETSC ERROR: PetscOptionsInsertFile() line 461 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c > > [0]PETSC ERROR: PetscOptionsInsert() line 623 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/options.c > > [0]PETSC ERROR: PetscInitialize() line 769 in > /home/bkhanal/Documents/softwares/petsc-3.4.2/src/sys/objects/pinit.c > > PETSC ERROR: Logging has not been enabled. > > You might have forgotten to call PetscInitialize(). 
> > application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> > [cli_0]: aborting job:
> > application called MPI_Abort(MPI_COMM_WORLD, 56) - process 0
> >
> > ===================================================================================
> > =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> > =   EXIT CODE: 56
> > =   CLEANING UP REMAINING PROCESSES
> > =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> > ===================================================================================
> >
> > I have not forgotten to call PetscInitialize, if that helps!
> > Thanks,
> > Bishesh
>

From potaman at outlook.com Thu Aug 22 08:42:04 2013
From: potaman at outlook.com (subramanya sadasiva)
Date: Thu, 22 Aug 2013 09:42:04 -0400
Subject: [petsc-users] Hypre running out of memory.
Message-ID:

Hi,
I have been trying to use hypre to precondition my linear elasticity solver, and I've been running out of memory. The problem is that the code is not failing gracefully. The server where I am running the code just stops responding, and one needs to wait until the code crashes. Can I force the code to crash gracefully when it runs out of memory?
Thanks,
Subramanya
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mfadams at lbl.gov Thu Aug 22 10:19:12 2013
From: mfadams at lbl.gov (Mark F. Adams)
Date: Thu, 22 Aug 2013 11:19:12 -0400
Subject: [petsc-users] Hypre running out of memory.
In-Reply-To: References: Message-ID: <6274499A-DFF1-4E5B-BAF3-B6DB5BEC9E43@lbl.gov>

On Aug 22, 2013, at 9:42 AM, subramanya sadasiva wrote:

> Hi,
> I have been trying to use hypre to precondition my linear elasticity solver, and I've been running out of memory. The problem is that the code is not failing gracefully. The server where I am running the code just stops responding, and one needs to wait until the code crashes. Can I force the code to crash gracefully when it runs out of memory?
>
wrote: > > > From bisheshkh at gmail.com Fri Aug 23 04:31:22 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Fri, 23 Aug 2013 11:31:22 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: Thanks Matt and Mark for comments in using near null space [question I asked in the thread with subject: *problem (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster*]. So I understood that I have to set a nearNullSpace to A00 block where the null space correspond to the rigid body motion. I tried it but still the gamg just keeps on iterating and convergence is very very slow. I am not sure what the problem is, right now gamg does not even work for the constant viscosity case. I have set up the following in my code: 1. null space for the whole system A 2. null space for the Schur complement S 3. Near null space for A00 4. a user preconditioner matrix of inverse viscosity in the diagonal for S. I am testing a small problem with CONSTANT viscosity for grid size of 14^3 with the run time option: -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason -fieldsplit_1_ksp_monitor_true_residual Here is my relevant code of the solve function: PetscErrorCode ierr; PetscFunctionBeginUser; ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); ierr = DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa with dof = 4, vx,vy,vz and p. ierr = KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for the main system ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //register the fieldsplits obtained from options. //Setting up user PC for Schur Complement ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); ierr = PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); KSP *subKsp; PetscInt subKspPos = 0; //Set up nearNullspace for A00 block. ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); MatNullSpace rigidBodyModes; Vec coords; ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); ierr = MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); Mat matA00; ierr = KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); //Position 1 => Ksp corresponding to Schur complement S on pressure space subKspPos = 1; ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); //Set up the null space of constant pressure. 
ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); PetscBool isNull; Mat matSc; ierr = KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); if(!isNull) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid pressure null space \n"); ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); if(!isNull) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system null space \n"); ierr = PetscFree(subKsp);CHKERRQ(ierr); ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); PetscFunctionReturn(0); On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: > On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: > >> >> >> >> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote: >> >>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote: >>> >>>> >>>> >>>> >>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: >>>> >>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: >>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal wrote: >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley wrote: >>>>>>>> >>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown wrote: >>>>>>>>>> >>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>> >>>>>>>>>>> > Now, I implemented two different approaches, each for both 2D >>>>>>>>>>> and 3D, in >>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>>>>> solving it for >>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>> > I use staggered grid with p on cell centers, and components of >>>>>>>>>>> v on cell >>>>>>>>>>> > faces. Similar split up of K to cell center and faces to >>>>>>>>>>> account for the >>>>>>>>>>> > variable viscosity case) >>>>>>>>>>> >>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>> discretization of >>>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>>> recommend >>>>>>>>>>> starting with PCFieldSplit with Schur complement reduction (make >>>>>>>>>>> that >>>>>>>>>>> work first, then switch to block preconditioner). You can use >>>>>>>>>>> PCLSC or >>>>>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>>>>> containing >>>>>>>>>>> the inverse viscosity in the pressure-pressure block. This >>>>>>>>>>> diagonal >>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>>>>> discretization) approximation of the Schur complement. The >>>>>>>>>>> velocity >>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>> PCFieldSplit >>>>>>>>>>> docs (follow papers as appropriate) and let us know if you get >>>>>>>>>>> stuck. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> I was trying to assemble the inverse viscosity diagonal matrix to >>>>>>>>>> use as the preconditioner for the Schur complement solve step as you >>>>>>>>>> suggested. 
I've few questions about the ways to implement this in Petsc: >>>>>>>>>> A naive approach that I can think of would be to create a vector >>>>>>>>>> with its components as reciprocal viscosities of the cell centers >>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>>>> dof as explained before. >>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the >>>>>>>>>> S_p matrix would correspond to only pressure space. Should the distribution >>>>>>>>>> of the rows of S_p among different processor not correspond to the >>>>>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>>>>> where S = A11 inv(A00) A01 ? >>>>>>>>>> >>>>>>>>> >>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. >>>>>>>>> The P distribution is the same as the entire problem divided by 4. >>>>>>>>> >>>>>>>> >>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size but with >>>>>>>> dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>>>> a particular function to make these two DMDA have identical distribution of >>>>>>>> the vertices ? >>>>>>>> >>>>>>> >>>>>>> Yes. >>>>>>> >>>>>>> >>>>>>>> Even then I think there might be a problem due to the presence of >>>>>>>> "fictitious pressure vertices". The system matrix (A) contains an identity >>>>>>>> corresponding to these fictitious pressure nodes, thus when using a >>>>>>>> -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>>>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>>>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>> >>>>>>> >>>>>>> Don't use detect_saddle, but split it by fields >>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>> >>>>>> >>>>>> How can I set this split in the code itself without giving it as a >>>>>> command line option when the system matrix is assembled from the DMDA for >>>>>> the whole system with 4 dofs. (i.e. *without* using the DMComposite >>>>>> or *without* using the nested block matrices to assemble different >>>>>> blocks separately and then combine them together). >>>>>> I need the split to get access to the fieldsplit_1_ksp in my code, >>>>>> because not using detect_saddle_point means I cannot use >>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>> >>>>> >>>>> This is currently a real problem with the DMDA. 
In the unstructured >>>>> case, where we always need specialized spaces, you can >>>>> use something like >>>>> >>>>> PetscObject pressure; >>>>> MatNullSpace nullSpacePres; >>>>> >>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, >>>>> 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>>>> nullSpacePres);CHKERRQ(ierr); >>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>> >>>>> and then DMGetSubDM() uses this information to attach the null space >>>>> to the IS that is created using the information in the PetscSection. >>>>> If you use a PetscSection to set the data layout over the DMDA, I >>>>> think this works correctly, but this has not been tested at all and is very >>>>> new code. Eventually, I think we want all DMs to use this mechanism, >>>>> but we are still working it out. >>>>> >>>> >>>> Currently I do not use PetscSection. If this makes a cleaner approach, >>>> I'd try it too but may a bit later (right now I'd like test my model with a >>>> quickfix even if it means a little dirty code!) >>>> >>>> >>>>> >>>>> Bottom line: For custom null spaces using the default layout in DMDA, >>>>> you need to take apart the PCFIELDSPLIT after it has been setup, >>>>> which is somewhat subtle. You need to call KSPSetUp() and then reach >>>>> in and get the PC, and the subKSPs. I don't like this at all, but we >>>>> have not reorganized that code (which could be very simple and >>>>> inflexible since its very structured). >>>>> >>>> >>>> So I tried to get this approach working but I could not succeed and >>>> encountered some errors. Here is a code snippet: >>>> >>>> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >>>> velocity components and 1 pressure comp.) >>>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>> ierr = >>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>> //I've the mNullSpaceSystem based on mDa, that contains a null space basis >>>> for the complete system. >>>> ierr = >>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>> //This I expect would register these options I give:-pc_type fieldsplit >>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>> //-pc_fieldsplit_1_fields 3 >>>> >>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>> >>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >>>> obtained from the options (fieldsplit) >>>> >>>> ierr = >>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>> >>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>> >>>> KSP *kspSchur; >>>> PetscInt kspSchurPos = 1; >>>> ierr = >>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>> ierr = >>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>> //The null space is the one that correspond to only pressure nodes, created >>>> using the mDaPressure. 
>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>> >>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>> >>> >>> Sorry, you need to return to the old DMDA behavior, so you want >>> >>> -pc_fieldsplit_dm_splits 0 >>> >> >> Thanks, with this it seems I can attach the null space properly, but I >> have a question regarding whether the Schur complement ksp solver is >> actually using the preconditioner matrix I provide. >> When using -ksp_view, the outer level pc object of type fieldsplit does >> report that: "Preconditioner for the Schur complement formed from user >> provided matrix", but in the KSP solver for Schur complement S, the pc >> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >> matrix I provide. Am I missing something here ? >> Below are the relevant commented code snippet and the output of the >> -ksp_view >> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >> > > If ILU does not error, it means it is using your matrix, because the Schur > complement matrix cannot be factored, and FS says it is using your matrix. > > Matt > > >> Code snippet: >> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The >> nullspace for the whole system >> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up mKsp >> with the options provided with fieldsplit and the fields associated with >> the two splits. >> >> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); //Get >> the fieldsplit pc set up from the options >> >> ierr = >> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >> //Use mPcForSc as the preconditioner for Schur Complement >> >> KSP *kspSchur; >> PetscInt kspSchurPos = 1; >> ierr = >> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >> ierr = >> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >> //Attach the null-space for the Schur complement ksp solver. 
>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >> >> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >> >> >> >> the output of the -ksp_view >> KSP Object: 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> has attached null space >> using PRECONDITIONED norm type for convergence test >> PC Object: 1 MPI processes >> type: fieldsplit >> FieldSplit with Schur preconditioner, blocksize = 4, factorization >> FULL >> Preconditioner for the Schur complement formed from user provided >> matrix >> Split info: >> Split number 0 Fields 0, 1, 2 >> Split number 1 Fields 3 >> KSP solver for A00 block >> KSP Object: (fieldsplit_0_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_0_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=2187 >> package used to perform factorization: petsc >> total: nonzeros=140625, allocated nonzeros=140625 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 729 nodes, limit used is 5 >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=2187 >> total: nonzeros=140625, allocated nonzeros=140625 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 729 nodes, limit used is 5 >> KSP solver for S = A11 - A10 inv(A00) A01 >> KSP Object: (fieldsplit_1_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> has attached null space >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_1_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=729, cols=729 >> package used to perform factorization: petsc >> total: nonzeros=15625, allocated nonzeros=15625 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix followed by preconditioner matrix: >> Matrix Object: 1 MPI processes >> type: schurcomplement >> rows=729, cols=729 >> Schur complement A11 - A10 inv(A00) A01 >> A11 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=729, cols=729 >> total: nonzeros=15625, allocated nonzeros=15625 >> total number of 
mallocs used during MatSetValues calls =0 >> not using I-node routines >> A10 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=729, cols=2187 >> total: nonzeros=46875, allocated nonzeros=46875 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP of A00 >> KSP Object: (fieldsplit_0_) 1 >> MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) >> Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_0_) 1 MPI >> processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=2187 >> package used to perform factorization: petsc >> total: nonzeros=140625, allocated nonzeros=140625 >> total number of mallocs used during MatSetValues >> calls =0 >> using I-node routines: found 729 nodes, limit >> used is 5 >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=2187 >> total: nonzeros=140625, allocated nonzeros=140625 >> total number of mallocs used during MatSetValues calls >> =0 >> using I-node routines: found 729 nodes, limit used is >> 5 >> A01 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2187, cols=729 >> total: nonzeros=46875, allocated nonzeros=46875 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 729 nodes, limit used is 5 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=729, cols=729 >> total: nonzeros=15625, allocated nonzeros=15625 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2916, cols=2916, bs=4 >> total: nonzeros=250000, allocated nonzeros=250000 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 729 nodes, limit used is 5 >> >> >> >> >> >>> >>> or >>> >>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> The errors I get when running with options: -pc_type fieldsplit >>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>> -pc_fieldsplit_1_fields 3 >>>> [0]PETSC ERROR: --------------------- Error Message >>>> ------------------------------------ >>>> [0]PETSC ERROR: No support for this operation for this object type! >>>> [0]PETSC ERROR: Support only implemented for 2d! >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named edwards >>>> by bkhanal Tue Aug 6 17:35:30 2013 >>>> [0]PETSC ERROR: Libraries linked from >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>> -with-clanguage=cxx --download-hypre=1 >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>> [0]PETSC ERROR: solveModel() line 181 in >>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>> WARNING! There are options you set that were not used! >>>> WARNING! could be spelling mistake, etc! >>>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>>> >>>>> Matt >>>>> >>>>> >>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Fri Aug 23 07:09:58 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 23 Aug 2013 07:09:58 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal wrote: > > Thanks Matt and Mark for comments in using near null space [question I > asked in the thread with subject: *problem (Segmentation voilation) using > -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster*]. > So I understood that I have to set a nearNullSpace to A00 block where the > null space correspond to the rigid body motion. I tried it but still the > gamg just keeps on iterating and convergence is very very slow. I am not > sure what the problem is, right now gamg does not even work for the > constant viscosity case. > I have set up the following in my code: > 1. null space for the whole system A 2. null space for the Schur > complement S 3. Near null space for A00 4. a user preconditioner matrix of > inverse viscosity in the diagonal for S. > If you want to debug solvers, you HAVE to send -ksp_view. Matt > I am testing a small problem with CONSTANT viscosity for grid size of 14^3 > with the run time option: > -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur > -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 > -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view > -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg > -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason > -fieldsplit_1_ksp_monitor_true_residual > > Here is my relevant code of the solve function: > PetscErrorCode ierr; > PetscFunctionBeginUser; > ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); > ierr = > DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); > ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa with dof = > 4, vx,vy,vz and p. > ierr = KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for > the main system > ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); > ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //register the > fieldsplits obtained from options. > > //Setting up user PC for Schur Complement > ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); > ierr = > PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); > > KSP *subKsp; > PetscInt subKspPos = 0; > //Set up nearNullspace for A00 block. > ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); > MatNullSpace rigidBodyModes; > Vec coords; > ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); > ierr = > MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); > Mat matA00; > ierr = KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); > ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); > ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); > > //Position 1 => Ksp corresponding to Schur complement S on pressure > space > subKspPos = 1; > ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); > //Set up the null space of constant pressure. 
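> /* [sketch, not in the original mail] mNullSpaceP is assumed to be the
>    constant-pressure null space; it could have been created earlier with
>    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&mNullSpaceP);CHKERRQ(ierr);
>    where PETSC_TRUE includes the constant vector in the null-space basis. */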
> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); > PetscBool isNull; > Mat matSc; > ierr = KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); > ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); > if(!isNull) > SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid pressure null > space \n"); > ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); > ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); > if(!isNull) > SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system null > space \n"); > > ierr = PetscFree(subKsp);CHKERRQ(ierr); > ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); > ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); > ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); > > > PetscFunctionReturn(0); > > > On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: > >> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote: >>> >>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: >>>>> >>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal >>>>>>> > wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley >>>>>>>> > wrote: >>>>>>>>> >>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown >>>>>>>>>> > wrote: >>>>>>>>>>> >>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>> >>>>>>>>>>>> > Now, I implemented two different approaches, each for both 2D >>>>>>>>>>>> and 3D, in >>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>>>>>> solving it for >>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>> > I use staggered grid with p on cell centers, and components >>>>>>>>>>>> of v on cell >>>>>>>>>>>> > faces. Similar split up of K to cell center and faces to >>>>>>>>>>>> account for the >>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>> >>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>> discretization of >>>>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>>>> recommend >>>>>>>>>>>> starting with PCFieldSplit with Schur complement reduction >>>>>>>>>>>> (make that >>>>>>>>>>>> work first, then switch to block preconditioner). You can use >>>>>>>>>>>> PCLSC or >>>>>>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>>>>>> containing >>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. This >>>>>>>>>>>> diagonal >>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>>>>>> discretization) approximation of the Schur complement. The >>>>>>>>>>>> velocity >>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>> docs (follow papers as appropriate) and let us know if you get >>>>>>>>>>>> stuck. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal matrix >>>>>>>>>>> to use as the preconditioner for the Schur complement solve step as you >>>>>>>>>>> suggested. 
I've few questions about the ways to implement this in Petsc: >>>>>>>>>>> A naive approach that I can think of would be to create a vector >>>>>>>>>>> with its components as reciprocal viscosities of the cell centers >>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>>>>> dof as explained before. >>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the >>>>>>>>>>> S_p matrix would correspond to only pressure space. Should the distribution >>>>>>>>>>> of the rows of S_p among different processor not correspond to the >>>>>>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>>>>>> where S = A11 inv(A00) A01 ? >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. >>>>>>>>>> The P distribution is the same as the entire problem divided by 4. >>>>>>>>>> >>>>>>>>> >>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size but >>>>>>>>> with dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>>>>> a particular function to make these two DMDA have identical distribution of >>>>>>>>> the vertices ? >>>>>>>>> >>>>>>>> >>>>>>>> Yes. >>>>>>>> >>>>>>>> >>>>>>>>> Even then I think there might be a problem due to the presence >>>>>>>>> of "fictitious pressure vertices". The system matrix (A) contains an >>>>>>>>> identity corresponding to these fictitious pressure nodes, thus when using >>>>>>>>> a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>>>>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>>>>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>> >>>>>>>> >>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>> >>>>>>> >>>>>>> How can I set this split in the code itself without giving it as a >>>>>>> command line option when the system matrix is assembled from the DMDA for >>>>>>> the whole system with 4 dofs. (i.e. *without* using the DMComposite >>>>>>> or *without* using the nested block matrices to assemble different >>>>>>> blocks separately and then combine them together). >>>>>>> I need the split to get access to the fieldsplit_1_ksp in my code, >>>>>>> because not using detect_saddle_point means I cannot use >>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>> >>>>>> >>>>>> This is currently a real problem with the DMDA. 
In the unstructured >>>>>> case, where we always need specialized spaces, you can >>>>>> use something like >>>>>> >>>>>> PetscObject pressure; >>>>>> MatNullSpace nullSpacePres; >>>>>> >>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, >>>>>> 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>>>>> nullSpacePres);CHKERRQ(ierr); >>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>> >>>>>> and then DMGetSubDM() uses this information to attach the null space >>>>>> to the IS that is created using the information in the PetscSection. >>>>>> If you use a PetscSection to set the data layout over the DMDA, I >>>>>> think this works correctly, but this has not been tested at all and is very >>>>>> new code. Eventually, I think we want all DMs to use this mechanism, >>>>>> but we are still working it out. >>>>>> >>>>> >>>>> Currently I do not use PetscSection. If this makes a cleaner approach, >>>>> I'd try it too but may a bit later (right now I'd like test my model with a >>>>> quickfix even if it means a little dirty code!) >>>>> >>>>> >>>>>> >>>>>> Bottom line: For custom null spaces using the default layout in DMDA, >>>>>> you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>> which is somewhat subtle. You need to call KSPSetUp() and then reach >>>>>> in and get the PC, and the subKSPs. I don't like this at all, but we >>>>>> have not reorganized that code (which could be very simple and >>>>>> inflexible since its very structured). >>>>>> >>>>> >>>>> So I tried to get this approach working but I could not succeed and >>>>> encountered some errors. Here is a code snippet: >>>>> >>>>> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >>>>> velocity components and 1 pressure comp.) >>>>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>> ierr = >>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>> //I've the mNullSpaceSystem based on mDa, that contains a null space basis >>>>> for the complete system. >>>>> ierr = >>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>> //-pc_fieldsplit_1_fields 3 >>>>> >>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>> >>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >>>>> obtained from the options (fieldsplit) >>>>> >>>>> ierr = >>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>> >>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>> >>>>> KSP *kspSchur; >>>>> PetscInt kspSchurPos = 1; >>>>> ierr = >>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>> ierr = >>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>> //The null space is the one that correspond to only pressure nodes, created >>>>> using the mDaPressure. 
>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>> >>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>> >>>> >>>> Sorry, you need to return to the old DMDA behavior, so you want >>>> >>>> -pc_fieldsplit_dm_splits 0 >>>> >>> >>> Thanks, with this it seems I can attach the null space properly, but I >>> have a question regarding whether the Schur complement ksp solver is >>> actually using the preconditioner matrix I provide. >>> When using -ksp_view, the outer level pc object of type fieldsplit does >>> report that: "Preconditioner for the Schur complement formed from user >>> provided matrix", but in the KSP solver for Schur complement S, the pc >>> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >>> matrix I provide. Am I missing something here ? >>> Below are the relevant commented code snippet and the output of the >>> -ksp_view >>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>> >> >> If ILU does not error, it means it is using your matrix, because the >> Schur complement matrix cannot be factored, and FS says it is using your >> matrix. >> >> Matt >> >> >>> Code snippet: >>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The >>> nullspace for the whole system >>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up mKsp >>> with the options provided with fieldsplit and the fields associated with >>> the two splits. >>> >>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); //Get >>> the fieldsplit pc set up from the options >>> >>> ierr = >>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>> //Use mPcForSc as the preconditioner for Schur Complement >>> >>> KSP *kspSchur; >>> PetscInt kspSchurPos = 1; >>> ierr = >>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>> ierr = >>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>> //Attach the null-space for the Schur complement ksp solver. 
>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr);
>>>
>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr);
>>>
>>> the output of the -ksp_view
>>> [snip: re-quoted -ksp_view output, the PCFieldSplitSetDMSplits exchange,
>>> the PETSc error log, and repeated signatures, all identical to the
>>> copies quoted above]

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From bisheshkh at gmail.com Fri Aug 23 07:25:19 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Fri, 23 Aug 2013 14:25:19 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley wrote: > On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal wrote: > >> >> Thanks Matt and Mark for comments in using near null space [question I >> asked in the thread with subject: *problem (Segmentation voilation) >> using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster >> *]. >> So I understood that I have to set a nearNullSpace to A00 block where the >> null space correspond to the rigid body motion. I tried it but still the >> gamg just keeps on iterating and convergence is very very slow. I am not >> sure what the problem is, right now gamg does not even work for the >> constant viscosity case. >> I have set up the following in my code: >> 1. null space for the whole system A 2. null space for the Schur >> complement S 3. Near null space for A00 4. a user preconditioner matrix of >> inverse viscosity in the diagonal for S. >> > > If you want to debug solvers, you HAVE to send -ksp_view. > When I use gamg, the -fieldsplit_0_ksp was iterating on and on so didn't get to the end to get -ksp_view results. Instead here I have put the -ksp_view output when running the program with following options: (In this case I get the results) -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view Linear solve converged due to CONVERGED_RTOL iterations 2 KSP Object: 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning has attached null space using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, blocksize = 4, factorization FULL Preconditioner for the Schur complement formed from user provided matrix Split info: Split number 0 Fields 0, 1, 2 Split number 1 Fields 3 KSP solver for A00 block KSP Object: (fieldsplit_0_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=8232, cols=8232 package used to perform factorization: petsc total: nonzeros=576000, allocated nonzeros=576000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2744 nodes, limit used is 5 linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=8232, cols=8232 total: nonzeros=576000, allocated nonzeros=576000 total number of mallocs used during MatSetValues calls =0 using 
I-node routines: found 2744 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning has attached null space using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_1_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=2744, cols=2744 package used to perform factorization: petsc total: nonzeros=64000, allocated nonzeros=64000 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix followed by preconditioner matrix: Matrix Object: 1 MPI processes type: schurcomplement rows=2744, cols=2744 Schur complement A11 - A10 inv(A00) A01 A11 Matrix Object: 1 MPI processes type: seqaij rows=2744, cols=2744 total: nonzeros=64000, allocated nonzeros=64000 total number of mallocs used during MatSetValues calls =0 not using I-node routines A10 Matrix Object: 1 MPI processes type: seqaij rows=2744, cols=8232 total: nonzeros=192000, allocated nonzeros=192000 total number of mallocs used during MatSetValues calls =0 not using I-node routines KSP of A00 KSP Object: (fieldsplit_0_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: natural factor fill ratio given 1, needed 1 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=8232, cols=8232 package used to perform factorization: petsc total: nonzeros=576000, allocated nonzeros=576000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2744 nodes, limit used is 5 linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=8232, cols=8232 total: nonzeros=576000, allocated nonzeros=576000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2744 nodes, limit used is 5 A01 Matrix Object: 1 MPI processes type: seqaij rows=8232, cols=2744 total: nonzeros=192000, allocated nonzeros=192000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2744 nodes, limit used is 5 Matrix Object: 1 MPI processes type: seqaij rows=2744, cols=2744 total: nonzeros=64000, allocated nonzeros=64000 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=10976, cols=10976, bs=4 total: nonzeros=1024000, allocated nonzeros=1024000 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2744 nodes, limit used is 5 > Matt > > >> I am testing a 
small problem with CONSTANT viscosity for grid size of >> 14^3 with the run time option: >> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >> -fieldsplit_1_ksp_monitor_true_residual >> >> Here is my relevant code of the solve function: >> PetscErrorCode ierr; >> PetscFunctionBeginUser; >> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >> ierr = >> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa with dof >> = 4, vx,vy,vz and p. >> ierr = KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for >> the main system >> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //register the >> fieldsplits obtained from options. >> >> //Setting up user PC for Schur Complement >> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >> ierr = >> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >> >> KSP *subKsp; >> PetscInt subKspPos = 0; >> //Set up nearNullspace for A00 block. >> ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >> MatNullSpace rigidBodyModes; >> Vec coords; >> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >> ierr = >> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >> Mat matA00; >> ierr = KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >> ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >> >> //Position 1 => Ksp corresponding to Schur complement S on pressure >> space >> subKspPos = 1; >> ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >> //Set up the null space of constant pressure. 
>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >> PetscBool isNull; >> Mat matSc; >> ierr = KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >> if(!isNull) >> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid pressure >> null space \n"); >> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >> if(!isNull) >> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system null >> space \n"); >> >> ierr = PetscFree(subKsp);CHKERRQ(ierr); >> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >> >> >> PetscFunctionReturn(0); >> >> >> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: >> >>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: >>> >>>> >>>> >>>> >>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote: >>>> >>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote: >>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal wrote: >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley wrote: >>>>>>>> >>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>> >>>>>>>>>>>>> > Now, I implemented two different approaches, each for both >>>>>>>>>>>>> 2D and 3D, in >>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>>>>>>> solving it for >>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>> > I use staggered grid with p on cell centers, and components >>>>>>>>>>>>> of v on cell >>>>>>>>>>>>> > faces. Similar split up of K to cell center and faces to >>>>>>>>>>>>> account for the >>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>> >>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>> discretization of >>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>>>>> recommend >>>>>>>>>>>>> starting with PCFieldSplit with Schur complement reduction >>>>>>>>>>>>> (make that >>>>>>>>>>>>> work first, then switch to block preconditioner). You can use >>>>>>>>>>>>> PCLSC or >>>>>>>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>>>>>>> containing >>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. This >>>>>>>>>>>>> diagonal >>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>>>>>>> discretization) approximation of the Schur complement. The >>>>>>>>>>>>> velocity >>>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>>> docs (follow papers as appropriate) and let us know if you get >>>>>>>>>>>>> stuck. 
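[A minimal sketch of the inverse-viscosity preconditioning matrix suggested above -- assumptions: PETSc 3.4 calling conventions, a pressure-only DMDA daP (dof=1 on the cell-centred pressure grid), and a viscosity lookup etaAtCell(); these names are placeholders, not from the thread:]

  PetscErrorCode ierr;
  DM             daP;     /* dof=1 DMDA over the pressure grid, created elsewhere */
  Mat            Sp;      /* diagonal approximation of the Schur complement */
  Vec            invEta;
  PetscScalar    ***a;
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm;

  ierr = DMCreateMatrix(daP, MATAIJ, &Sp);CHKERRQ(ierr);   /* PETSc 3.4 signature */
  ierr = DMCreateGlobalVector(daP, &invEta);CHKERRQ(ierr);
  ierr = DMDAGetCorners(daP, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(daP, invEta, &a);CHKERRQ(ierr);
  for (k = zs; k < zs+zm; ++k)
    for (j = ys; j < ys+ym; ++j)
      for (i = xs; i < xs+xm; ++i)
        a[k][j][i] = 1.0/etaAtCell(i, j, k);   /* reciprocal cell-centre viscosity */
  ierr = DMDAVecRestoreArray(daP, invEta, &a);CHKERRQ(ierr);
  ierr = MatDiagonalSet(Sp, invEta, INSERT_VALUES);CHKERRQ(ierr);
  ierr = VecDestroy(&invEta);CHKERRQ(ierr);

[The resulting Sp is what gets passed to PCFieldSplitSchurPrecondition() with PC_FIELDSPLIT_SCHUR_PRE_USER in the snippets elsewhere in this thread.]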
>>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal matrix >>>>>>>>>>>> to use as the preconditioner for the Schur complement solve step as you >>>>>>>>>>>> suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>>>> A naive approach that I can think of would be to create a >>>>>>>>>>>> vector with its components as reciprocal viscosities of the cell centers >>>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>>>>>> dof as explained before. >>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the >>>>>>>>>>>> S_p matrix would correspond to only pressure space. Should the distribution >>>>>>>>>>>> of the rows of S_p among different processor not correspond to the >>>>>>>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>>>>>>> where S = A11 inv(A00) A01 ? >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks blocks. >>>>>>>>>>> The P distribution is the same as the entire problem divided by 4. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size but >>>>>>>>>> with dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>>>>>> a particular function to make these two DMDA have identical distribution of >>>>>>>>>> the vertices ? >>>>>>>>>> >>>>>>>>> >>>>>>>>> Yes. >>>>>>>>> >>>>>>>>> >>>>>>>>>> Even then I think there might be a problem due to the presence >>>>>>>>>> of "fictitious pressure vertices". The system matrix (A) contains an >>>>>>>>>> identity corresponding to these fictitious pressure nodes, thus when using >>>>>>>>>> a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>>>>>>> that correspond to only non-fictitious P-nodes. So the preconditioner S_p >>>>>>>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>>> >>>>>>>>> >>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>> >>>>>>>> >>>>>>>> How can I set this split in the code itself without giving it as a >>>>>>>> command line option when the system matrix is assembled from the DMDA for >>>>>>>> the whole system with 4 dofs. (i.e. *without* using the >>>>>>>> DMComposite or *without* using the nested block matrices to >>>>>>>> assemble different blocks separately and then combine them together). >>>>>>>> I need the split to get access to the fieldsplit_1_ksp in my code, >>>>>>>> because not using detect_saddle_point means I cannot use >>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. 
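[Likewise, a sketch of defining these splits in code rather than on the command line -- assuming pc is the PCFIELDSPLIT obtained from KSPGetPC(); in PETSc 3.4, PCFieldSplitSetFields also takes a column-field array, which is the same as the row one for square splits:]

  const PetscInt velFields[]  = {0, 1, 2};   /* vx, vy, vz */
  const PetscInt presFields[] = {3};         /* p */

  ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
  ierr = PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);CHKERRQ(ierr);
  ierr = PCFieldSplitSetBlockSize(pc, 4);CHKERRQ(ierr);
  ierr = PCFieldSplitSetFields(pc, "0", 3, velFields, velFields);CHKERRQ(ierr);
  ierr = PCFieldSplitSetFields(pc, "1", 1, presFields, presFields);CHKERRQ(ierr);

[As discussed just below, when a DMDA is attached to the KSP one may additionally need PCFieldSplitSetDMSplits(pc, PETSC_FALSE) so that these index-based splits are used instead of DM-derived ones.]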
>>>>>>>> >>>>>>> >>>>>>> This is currently a real problem with the DMDA. In the unstructured >>>>>>> case, where we always need specialized spaces, you can >>>>>>> use something like >>>>>>> >>>>>>> PetscObject pressure; >>>>>>> MatNullSpace nullSpacePres; >>>>>>> >>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, >>>>>>> 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>>>>>> nullSpacePres);CHKERRQ(ierr); >>>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>> >>>>>>> and then DMGetSubDM() uses this information to attach the null space >>>>>>> to the IS that is created using the information in the PetscSection. >>>>>>> If you use a PetscSection to set the data layout over the DMDA, I >>>>>>> think this works correctly, but this has not been tested at all and is very >>>>>>> new code. Eventually, I think we want all DMs to use this mechanism, >>>>>>> but we are still working it out. >>>>>>> >>>>>> >>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>> model with a quickfix even if it means a little dirty code!) >>>>>> >>>>>> >>>>>>> >>>>>>> Bottom line: For custom null spaces using the default layout in >>>>>>> DMDA, you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>>> which is somewhat subtle. You need to call KSPSetUp() and then reach >>>>>>> in and get the PC, and the subKSPs. I don't like this at all, but we >>>>>>> have not reorganized that code (which could be very simple and >>>>>>> inflexible since its very structured). >>>>>>> >>>>>> >>>>>> So I tried to get this approach working but I could not succeed and >>>>>> encountered some errors. Here is a code snippet: >>>>>> >>>>>> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >>>>>> velocity components and 1 pressure comp.) >>>>>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>> ierr = >>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>> //I've the mNullSpaceSystem based on mDa, that contains a null space basis >>>>>> for the complete system. >>>>>> ierr = >>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>> //-pc_fieldsplit_1_fields 3 >>>>>> >>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>> >>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >>>>>> obtained from the options (fieldsplit) >>>>>> >>>>>> ierr = >>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>> >>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>> >>>>>> KSP *kspSchur; >>>>>> PetscInt kspSchurPos = 1; >>>>>> ierr = >>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>> ierr = >>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>> using the mDaPressure. 
>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>> >>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>> >>>>> >>>>> Sorry, you need to return to the old DMDA behavior, so you want >>>>> >>>>> -pc_fieldsplit_dm_splits 0 >>>>> >>>> >>>> Thanks, with this it seems I can attach the null space properly, but I >>>> have a question regarding whether the Schur complement ksp solver is >>>> actually using the preconditioner matrix I provide. >>>> When using -ksp_view, the outer level pc object of type fieldsplit does >>>> report that: "Preconditioner for the Schur complement formed from user >>>> provided matrix", but in the KSP solver for Schur complement S, the pc >>>> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >>>> matrix I provide. Am I missing something here ? >>>> Below are the relevant commented code snippet and the output of the >>>> -ksp_view >>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>> >>> >>> If ILU does not error, it means it is using your matrix, because the >>> Schur complement matrix cannot be factored, and FS says it is using your >>> matrix. >>> >>> Matt >>> >>> >>>> Code snippet: >>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The >>>> nullspace for the whole system >>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up >>>> mKsp with the options provided with fieldsplit and the fields associated >>>> with the two splits. >>>> >>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); //Get >>>> the fieldsplit pc set up from the options >>>> >>>> ierr = >>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>> //Use mPcForSc as the preconditioner for Schur Complement >>>> >>>> KSP *kspSchur; >>>> PetscInt kspSchurPos = 1; >>>> ierr = >>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>> ierr = >>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>> //Attach the null-space for the Schur complement ksp solver. 
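>>>> /* [sketch note, not in the original mail] In
>>>>    PCFieldSplitGetSubKSP(pc,&n,&subksp), n is an output (the number of
>>>>    splits) and subksp is a freshly allocated array -- hence the
>>>>    PetscFree() below; kspSchur[1] is the Schur-complement KSP
>>>>    regardless of the value kspSchurPos held on entry. */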
>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr);
>>>>
>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr);
>>>>
>>>> the output of the -ksp_view
>>>> [snip: re-quoted -ksp_view output, the PCFieldSplitSetDMSplits exchange,
>>>> the PETSc error log, and repeated signatures, all identical to the
>>>> copies quoted above]
>>> -- Norbert Wiener

> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener

From knepley at gmail.com Fri Aug 23 07:33:49 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 23 Aug 2013 07:33:49 -0500
Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid
In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov>
Message-ID:

On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal wrote:

> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley wrote:
>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal wrote:
>>> Thanks Matt and Mark for the comments on using a near null space [the question I asked in the thread with subject: *problem (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster*].
>>> So I understood that I have to set a nearNullSpace on the A00 block, where the null space corresponds to the rigid body motions. I tried it, but gamg still just keeps iterating and convergence is very slow; I am not sure what the problem is, and right now gamg does not even work for the constant viscosity case.
>>> I have set up the following in my code: 1. a null space for the whole system A, 2. a null space for the Schur complement S, 3. a near null space for A00, and 4. a user preconditioner matrix with the inverse viscosity on the diagonal for S.
>>
>> If you want to debug solvers, you HAVE to send -ksp_view.
>
> When I use gamg, the -fieldsplit_0_ksp was iterating on and on, so I never got to the end to produce the -ksp_view results.
> Instead, here is the -ksp_view output when running the program with the following options (in this case I do get results):
> -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view

Okay, that looks fine. Does

-fieldsplit_0_pc_type lu
-fieldsplit_1_ksp_rtol 1.0e-10

converge in one iterate?

What matrix did you attach as the preconditioner matrix for fieldsplit_1_?
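(For concreteness, an untested sketch: this is just your option list from above with the inner solves made essentially exact,

    -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0
    -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3
    -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_rtol 1.0e-10
    -ksp_converged_reason -ksp_view

With an exact A00 solve and a tight Schur tolerance, the full factorization should drive the outer Krylov solve to convergence in one or two iterations, which is what this test checks.)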
Thanks, Matt > Linear solve converged due to CONVERGED_RTOL iterations 2 > KSP Object: 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > has attached null space > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, blocksize = 4, factorization FULL > Preconditioner for the Schur complement formed from user provided > matrix > Split info: > Split number 0 Fields 0, 1, 2 > Split number 1 Fields 3 > KSP solver for A00 block > KSP Object: (fieldsplit_0_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_0_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > using diagonal shift on blocks to prevent zero pivot > matrix ordering: natural > factor fill ratio given 1, needed 1 > Factored matrix follows: > Matrix Object: 1 MPI processes > type: seqaij > rows=8232, cols=8232 > package used to perform factorization: petsc > total: nonzeros=576000, allocated nonzeros=576000 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 2744 nodes, limit used is 5 > linear system matrix = precond matrix: > Matrix Object: 1 MPI processes > type: seqaij > rows=8232, cols=8232 > total: nonzeros=576000, allocated nonzeros=576000 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 2744 nodes, limit used is 5 > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (fieldsplit_1_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > has attached null space > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_1_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > using diagonal shift on blocks to prevent zero pivot > matrix ordering: natural > factor fill ratio given 1, needed 1 > Factored matrix follows: > Matrix Object: 1 MPI processes > type: seqaij > rows=2744, cols=2744 > package used to perform factorization: petsc > total: nonzeros=64000, allocated nonzeros=64000 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix followed by preconditioner matrix: > Matrix Object: 1 MPI processes > type: schurcomplement > rows=2744, cols=2744 > Schur complement A11 - A10 inv(A00) A01 > A11 > Matrix Object: 1 MPI processes > type: seqaij > rows=2744, cols=2744 > total: nonzeros=64000, allocated nonzeros=64000 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > A10 > Matrix Object: 1 MPI processes > type: seqaij > rows=2744, cols=8232 > 
total: nonzeros=192000, allocated nonzeros=192000 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP of A00 > KSP Object: (fieldsplit_0_) 1 MPI > processes > type: gmres > GMRES: restart=30, using Classical (unmodified) > Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_0_) 1 MPI > processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > using diagonal shift on blocks to prevent zero pivot > matrix ordering: natural > factor fill ratio given 1, needed 1 > Factored matrix follows: > Matrix Object: 1 MPI processes > type: seqaij > rows=8232, cols=8232 > package used to perform factorization: petsc > total: nonzeros=576000, allocated nonzeros=576000 > total number of mallocs used during MatSetValues > calls =0 > using I-node routines: found 2744 nodes, limit > used is 5 > linear system matrix = precond matrix: > Matrix Object: 1 MPI processes > type: seqaij > rows=8232, cols=8232 > total: nonzeros=576000, allocated nonzeros=576000 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 2744 nodes, limit used is > 5 > A01 > Matrix Object: 1 MPI processes > type: seqaij > rows=8232, cols=2744 > total: nonzeros=192000, allocated nonzeros=192000 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 2744 nodes, limit used is 5 > Matrix Object: 1 MPI processes > type: seqaij > rows=2744, cols=2744 > total: nonzeros=64000, allocated nonzeros=64000 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Matrix Object: 1 MPI processes > type: seqaij > rows=10976, cols=10976, bs=4 > total: nonzeros=1024000, allocated nonzeros=1024000 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 2744 nodes, limit used is 5 > > > > >> Matt >> >> >>> I am testing a small problem with CONSTANT viscosity for grid size of >>> 14^3 with the run time option: >>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>> -fieldsplit_1_ksp_monitor_true_residual >>> >>> Here is my relevant code of the solve function: >>> PetscErrorCode ierr; >>> PetscFunctionBeginUser; >>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>> ierr = >>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa with dof >>> = 4, vx,vy,vz and p. >>> ierr = KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace >>> for the main system >>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //register the >>> fieldsplits obtained from options. 
>>> >>> //Setting up user PC for Schur Complement >>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>> ierr = >>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>> >>> KSP *subKsp; >>> PetscInt subKspPos = 0; >>> //Set up nearNullspace for A00 block. >>> ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>> MatNullSpace rigidBodyModes; >>> Vec coords; >>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>> ierr = >>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>> Mat matA00; >>> ierr = KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>> ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>> >>> //Position 1 => Ksp corresponding to Schur complement S on pressure >>> space >>> subKspPos = 1; >>> ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>> //Set up the null space of constant pressure. >>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>> PetscBool isNull; >>> Mat matSc; >>> ierr = KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>> if(!isNull) >>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid pressure >>> null space \n"); >>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>> if(!isNull) >>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system null >>> space \n"); >>> >>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>> >>> >>> PetscFunctionReturn(0); >>> >>> >>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: >>> >>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote: >>>>> >>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal >>>>>>> > wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley >>>>>>>> > wrote: >>>>>>>>> >>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>> >>>>>>>>>>>>>> > Now, I implemented two different approaches, each for both >>>>>>>>>>>>>> 2D and 3D, in >>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>>>>>>>> solving it for >>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and components >>>>>>>>>>>>>> of v on cell >>>>>>>>>>>>>> > faces. 
Similar split up of K to cell center and faces to >>>>>>>>>>>>>> account for the >>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>> >>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>>>>>> recommend >>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement reduction >>>>>>>>>>>>>> (make that >>>>>>>>>>>>>> work first, then switch to block preconditioner). You can >>>>>>>>>>>>>> use PCLSC or >>>>>>>>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>>>>>>>> containing >>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. This >>>>>>>>>>>>>> diagonal >>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>>>>>>>> discretization) approximation of the Schur complement. The >>>>>>>>>>>>>> velocity >>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know if you >>>>>>>>>>>>>> get stuck. >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal matrix >>>>>>>>>>>>> to use as the preconditioner for the Schur complement solve step as you >>>>>>>>>>>>> suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>>>>> A naive approach that I can think of would be to create a >>>>>>>>>>>>> vector with its components as reciprocal viscosities of the cell centers >>>>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>>>>>>> dof as explained before. >>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for the >>>>>>>>>>>>> S_p matrix would correspond to only pressure space. Should the distribution >>>>>>>>>>>>> of the rows of S_p among different processor not correspond to the >>>>>>>>>>>>> distribution of the rhs vector, say h' if it is solving for p with Sp = h' >>>>>>>>>>>>> where S = A11 inv(A00) A01 ? >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks >>>>>>>>>>>> blocks. The P distribution is the same as the entire problem divided by 4. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size but >>>>>>>>>>> with dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>>>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>>>>>>> a particular function to make these two DMDA have identical distribution of >>>>>>>>>>> the vertices ? >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Yes. >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Even then I think there might be a problem due to the presence >>>>>>>>>>> of "fictitious pressure vertices". The system matrix (A) contains an >>>>>>>>>>> identity corresponding to these fictitious pressure nodes, thus when using >>>>>>>>>>> a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of size >>>>>>>>>>> that correspond to only non-fictitious P-nodes. 
So the preconditioner S_p >>>>>>>>>>> for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>> >>>>>>>>> >>>>>>>>> How can I set this split in the code itself without giving it as a >>>>>>>>> command line option when the system matrix is assembled from the DMDA for >>>>>>>>> the whole system with 4 dofs. (i.e. *without* using the >>>>>>>>> DMComposite or *without* using the nested block matrices to >>>>>>>>> assemble different blocks separately and then combine them together). >>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in my code, >>>>>>>>> because not using detect_saddle_point means I cannot use >>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>> >>>>>>>> >>>>>>>> This is currently a real problem with the DMDA. In the unstructured >>>>>>>> case, where we always need specialized spaces, you can >>>>>>>> use something like >>>>>>>> >>>>>>>> PetscObject pressure; >>>>>>>> MatNullSpace nullSpacePres; >>>>>>>> >>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>>>>>>> nullSpacePres);CHKERRQ(ierr); >>>>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>> >>>>>>>> and then DMGetSubDM() uses this information to attach the null >>>>>>>> space to the IS that is created using the information in the PetscSection. >>>>>>>> If you use a PetscSection to set the data layout over the DMDA, I >>>>>>>> think this works correctly, but this has not been tested at all and is very >>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>> mechanism, but we are still working it out. >>>>>>>> >>>>>>> >>>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>>> model with a quickfix even if it means a little dirty code!) >>>>>>> >>>>>>> >>>>>>>> >>>>>>>> Bottom line: For custom null spaces using the default layout in >>>>>>>> DMDA, you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and then >>>>>>>> reach in and get the PC, and the subKSPs. I don't like this at all, but we >>>>>>>> have not reorganized that code (which could be very simple and >>>>>>>> inflexible since its very structured). >>>>>>>> >>>>>>> >>>>>>> So I tried to get this approach working but I could not succeed and >>>>>>> encountered some errors. Here is a code snippet: >>>>>>> >>>>>>> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >>>>>>> velocity components and 1 pressure comp.) 
>>>>>>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>> ierr = >>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>> //I've the mNullSpaceSystem based on mDa, that contains a null space basis >>>>>>> for the complete system. >>>>>>> ierr = >>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>> >>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>> >>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >>>>>>> obtained from the options (fieldsplit) >>>>>>> >>>>>>> ierr = >>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>> >>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>> >>>>>>> KSP *kspSchur; >>>>>>> PetscInt kspSchurPos = 1; >>>>>>> ierr = >>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>> ierr = >>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>> using the mDaPressure. >>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>> >>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>> >>>>>> >>>>>> Sorry, you need to return to the old DMDA behavior, so you want >>>>>> >>>>>> -pc_fieldsplit_dm_splits 0 >>>>>> >>>>> >>>>> Thanks, with this it seems I can attach the null space properly, but I >>>>> have a question regarding whether the Schur complement ksp solver is >>>>> actually using the preconditioner matrix I provide. >>>>> When using -ksp_view, the outer level pc object of type fieldsplit >>>>> does report that: "Preconditioner for the Schur complement formed from user >>>>> provided matrix", but in the KSP solver for Schur complement S, the pc >>>>> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >>>>> matrix I provide. Am I missing something here ? >>>>> Below are the relevant commented code snippet and the output of the >>>>> -ksp_view >>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>> >>>> >>>> If ILU does not error, it means it is using your matrix, because the >>>> Schur complement matrix cannot be factored, and FS says it is using your >>>> matrix. >>>> >>>> Matt >>>> >>>> >>>>> Code snippet: >>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The >>>>> nullspace for the whole system >>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up >>>>> mKsp with the options provided with fieldsplit and the fields associated >>>>> with the two splits. 
>>>>> >>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>> //Get the fieldsplit pc set up from the options >>>>> >>>>> ierr = >>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>> >>>>> KSP *kspSchur; >>>>> PetscInt kspSchurPos = 1; >>>>> ierr = >>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>> ierr = >>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>> //Attach the null-space for the Schur complement ksp solver. >>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>> >>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>> >>>>> >>>>> >>>>> the output of the -ksp_view >>>>> KSP Object: 1 MPI processes >>>>> type: gmres >>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>> Orthogonalization with no iterative refinement >>>>> GMRES: happy breakdown tolerance 1e-30 >>>>> maximum iterations=10000, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> has attached null space >>>>> using PRECONDITIONED norm type for convergence test >>>>> PC Object: 1 MPI processes >>>>> type: fieldsplit >>>>> FieldSplit with Schur preconditioner, blocksize = 4, factorization >>>>> FULL >>>>> Preconditioner for the Schur complement formed from user provided >>>>> matrix >>>>> Split info: >>>>> Split number 0 Fields 0, 1, 2 >>>>> Split number 1 Fields 3 >>>>> KSP solver for A00 block >>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>> type: gmres >>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>> Orthogonalization with no iterative refinement >>>>> GMRES: happy breakdown tolerance 1e-30 >>>>> maximum iterations=10000, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> using PRECONDITIONED norm type for convergence test >>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>> type: ilu >>>>> ILU: out-of-place factorization >>>>> 0 levels of fill >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: natural >>>>> factor fill ratio given 1, needed 1 >>>>> Factored matrix follows: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2187, cols=2187 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> using I-node routines: found 729 nodes, limit used >>>>> is 5 >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2187, cols=2187 >>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>> type: gmres >>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>> Orthogonalization with no iterative refinement >>>>> GMRES: happy breakdown tolerance 1e-30 >>>>> maximum iterations=10000, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> has attached null space >>>>> using PRECONDITIONED norm type for convergence test >>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>> type: ilu >>>>> ILU: 
out-of-place factorization >>>>> 0 levels of fill >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: natural >>>>> factor fill ratio given 1, needed 1 >>>>> Factored matrix follows: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=729, cols=729 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> not using I-node routines >>>>> linear system matrix followed by preconditioner matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: schurcomplement >>>>> rows=729, cols=729 >>>>> Schur complement A11 - A10 inv(A00) A01 >>>>> A11 >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=729, cols=729 >>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> not using I-node routines >>>>> A10 >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=729, cols=2187 >>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> not using I-node routines >>>>> KSP of A00 >>>>> KSP Object: (fieldsplit_0_) 1 >>>>> MPI processes >>>>> type: gmres >>>>> GMRES: restart=30, using Classical (unmodified) >>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>> GMRES: happy breakdown tolerance 1e-30 >>>>> maximum iterations=10000, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>> divergence=10000 >>>>> left preconditioning >>>>> using PRECONDITIONED norm type for convergence test >>>>> PC Object: (fieldsplit_0_) 1 >>>>> MPI processes >>>>> type: ilu >>>>> ILU: out-of-place factorization >>>>> 0 levels of fill >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: natural >>>>> factor fill ratio given 1, needed 1 >>>>> Factored matrix follows: >>>>> Matrix Object: 1 MPI >>>>> processes >>>>> type: seqaij >>>>> rows=2187, cols=2187 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=140625, allocated >>>>> nonzeros=140625 >>>>> total number of mallocs used during >>>>> MatSetValues calls =0 >>>>> using I-node routines: found 729 nodes, >>>>> limit used is 5 >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2187, cols=2187 >>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>> total number of mallocs used during MatSetValues >>>>> calls =0 >>>>> using I-node routines: found 729 nodes, limit used >>>>> is 5 >>>>> A01 >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2187, cols=729 >>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> using I-node routines: found 729 nodes, limit used >>>>> is 5 >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=729, cols=729 >>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2916, cols=2916, bs=4 >>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>> >>>>> >>>>> 
>>>>> >>>>> >>>>>> >>>>>> or >>>>>> >>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> The errors I get when running with options: -pc_type fieldsplit >>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>> ------------------------------------ >>>>>>> [0]PETSC ERROR: No support for this operation for this object type! >>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named >>>>>>> edwards by bkhanal Tue Aug 6 17:35:30 2013 >>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>>>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>>>> -with-clanguage=cxx --download-hypre=1 >>>>>>> [0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------ >>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>>>> WARNING! There are options you set that were not used! >>>>>>> WARNING! could be spelling mistake, etc! >>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. 
>>>>>>>>>> -- Norbert Wiener

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

From bisheshkh at gmail.com Fri Aug 23 07:46:24 2013
From: bisheshkh at gmail.com (Bishesh Khanal)
Date: Fri, 23 Aug 2013 14:46:24 +0200
Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid
In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov>
Message-ID:

On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote:

> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal wrote:
>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley wrote:
>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal wrote:
>>>> Thanks Matt and Mark for the comments on using a near null space [the question I asked in the thread with subject: *problem (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster*].
>>>> So I understood that I have to set a nearNullSpace on the A00 block, where the null space corresponds to the rigid body motions. I tried it, but gamg still just keeps iterating and convergence is very slow; I am not sure what the problem is, and right now gamg does not even work for the constant viscosity case.
>>>> I have set up the following in my code: 1. a null space for the whole system A, 2. a null space for the Schur complement S, 3. a near null space for A00, and 4. a user preconditioner matrix with the inverse viscosity on the diagonal for S.
>>>
>>> If you want to debug solvers, you HAVE to send -ksp_view.
>>
>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on, so I never got to the end to produce the -ksp_view results.
>> Instead, here is the -ksp_view output when running the program with the following options (in this case I do get results):
>> -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view
>
> Okay, that looks fine. Does
>
> -fieldsplit_0_pc_type lu
> -fieldsplit_1_ksp_rtol 1.0e-10
>
> converge in one iterate?
>
> What matrix did you attach as the preconditioner matrix for fieldsplit_1_?

I used a diagonal matrix with the reciprocal of the viscosity at the corresponding cell centers as the preconditioner.
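Roughly, the assembly of that matrix looks like the following minimal sketch (getViscosityAtCell() is a hypothetical stand-in for however the cell-centered viscosity is looked up; for the fictitious pressure nodes one would instead put 1.0 to match the identity rows of A; error handling is trimmed):

    Mat            mPcForSc;
    MatStencil     row;
    PetscScalar    v;
    PetscInt       i, j, k, xs, ys, zs, xm, ym, zm;
    PetscErrorCode ierr;

    /* mDaPressure: a DMDA of the same grid size as mDa but with dof=1 (pressure only) */
    ierr = DMCreateMatrix(mDaPressure, MATAIJ, &mPcForSc);CHKERRQ(ierr);
    ierr = DMDAGetCorners(mDaPressure, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
    for (k = zs; k < zs+zm; ++k) {
      for (j = ys; j < ys+ym; ++j) {
        for (i = xs; i < xs+xm; ++i) {
          row.i = i; row.j = j; row.k = k; row.c = 0;
          v = 1.0/getViscosityAtCell(i, j, k);  /* reciprocal viscosity at the cell center */
          ierr = MatSetValuesStencil(mPcForSc, 1, &row, 1, &row, &v, INSERT_VALUES);CHKERRQ(ierr);
        }
      }
    }
    ierr = MatAssemblyBegin(mPcForSc, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(mPcForSc, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

Since mDaPressure has the same grid size and default layout as mDa, the rows of this diagonal matrix line up with the pressure rows of the split, and the matrix is then passed to PCFieldSplitSchurPrecondition() as in the code I posted earlier.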
With the options -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_rtol 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason I get the following output, which means the outer ksp did converge in one iterate:
Linear solve converged due to CONVERGED_RTOL iterations 18
Linear solve converged due to CONVERGED_RTOL iterations 18
Linear solve converged due to CONVERGED_RTOL iterations 1

> > Thanks, > > Matt > > >> Linear solve converged due to CONVERGED_RTOL iterations 2 >> KSP Object: 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> has attached null space >> using PRECONDITIONED norm type for convergence test >> PC Object: 1 MPI processes >> type: fieldsplit >> FieldSplit with Schur preconditioner, blocksize = 4, factorization >> FULL >> Preconditioner for the Schur complement formed from user provided >> matrix >> Split info: >> Split number 0 Fields 0, 1, 2 >> Split number 1 Fields 3 >> KSP solver for A00 block >> KSP Object: (fieldsplit_0_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_0_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=8232, cols=8232 >> package used to perform factorization: petsc >> total: nonzeros=576000, allocated nonzeros=576000 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 2744 nodes, limit used is 5 >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=8232, cols=8232 >> total: nonzeros=576000, allocated nonzeros=576000 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 2744 nodes, limit used is 5 >> KSP solver for S = A11 - A10 inv(A00) A01 >> KSP Object: (fieldsplit_1_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >> left preconditioning >> has attached null space >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_1_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2744, cols=2744 >> package used to perform factorization: petsc >> total: nonzeros=64000, allocated nonzeros=64000 >> total number of mallocs used during MatSetValues calls =0 >> not using
I-node routines >> linear system matrix followed by preconditioner matrix: >> Matrix Object: 1 MPI processes >> type: schurcomplement >> rows=2744, cols=2744 >> Schur complement A11 - A10 inv(A00) A01 >> A11 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2744, cols=2744 >> total: nonzeros=64000, allocated nonzeros=64000 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> A10 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2744, cols=8232 >> total: nonzeros=192000, allocated nonzeros=192000 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP of A00 >> KSP Object: (fieldsplit_0_) 1 >> MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) >> Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_0_) 1 MPI >> processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> using diagonal shift on blocks to prevent zero pivot >> matrix ordering: natural >> factor fill ratio given 1, needed 1 >> Factored matrix follows: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=8232, cols=8232 >> package used to perform factorization: petsc >> total: nonzeros=576000, allocated nonzeros=576000 >> total number of mallocs used during MatSetValues >> calls =0 >> using I-node routines: found 2744 nodes, limit >> used is 5 >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=8232, cols=8232 >> total: nonzeros=576000, allocated nonzeros=576000 >> total number of mallocs used during MatSetValues calls >> =0 >> using I-node routines: found 2744 nodes, limit used >> is 5 >> A01 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=8232, cols=2744 >> total: nonzeros=192000, allocated nonzeros=192000 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 2744 nodes, limit used is 5 >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=2744, cols=2744 >> total: nonzeros=64000, allocated nonzeros=64000 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix = precond matrix: >> Matrix Object: 1 MPI processes >> type: seqaij >> rows=10976, cols=10976, bs=4 >> total: nonzeros=1024000, allocated nonzeros=1024000 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 2744 nodes, limit used is 5 >> >> >> >> >>> Matt >>> >>> >>>> I am testing a small problem with CONSTANT viscosity for grid size of >>>> 14^3 with the run time option: >>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>> -fieldsplit_1_ksp_monitor_true_residual >>>> >>>> Here is my relevant code of the solve function: >>>> PetscErrorCode ierr; >>>> PetscFunctionBeginUser; >>>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>> ierr = >>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>> ierr 
= KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa with >>>> dof = 4, vx,vy,vz and p. >>>> ierr = KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace >>>> for the main system >>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //register >>>> the fieldsplits obtained from options. >>>> >>>> //Setting up user PC for Schur Complement >>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>> ierr = >>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>> >>>> KSP *subKsp; >>>> PetscInt subKspPos = 0; >>>> //Set up nearNullspace for A00 block. >>>> ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>> MatNullSpace rigidBodyModes; >>>> Vec coords; >>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>> ierr = >>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>> Mat matA00; >>>> ierr = KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>> ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>> >>>> //Position 1 => Ksp corresponding to Schur complement S on pressure >>>> space >>>> subKspPos = 1; >>>> ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>> //Set up the null space of constant pressure. >>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>> PetscBool isNull; >>>> Mat matSc; >>>> ierr = KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>> if(!isNull) >>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid pressure >>>> null space \n"); >>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>> if(!isNull) >>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system >>>> null space \n"); >>>> >>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>> >>>> >>>> PetscFunctionReturn(0); >>>> >>>> >>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: >>>> >>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: >>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal >>>>>> > wrote: >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley wrote: >>>>>>>> >>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> > Now, I implemented two different approaches, each for both >>>>>>>>>>>>>>> 2D and 3D, in 
>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have problems >>>>>>>>>>>>>>> solving it for >>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>> > faces. Similar split up of K to cell center and faces to >>>>>>>>>>>>>>> account for the >>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>>>>>>> recommend >>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement reduction >>>>>>>>>>>>>>> (make that >>>>>>>>>>>>>>> work first, then switch to block preconditioner). You can >>>>>>>>>>>>>>> use PCLSC or >>>>>>>>>>>>>>> (probably better for you), assemble a preconditioning matrix >>>>>>>>>>>>>>> containing >>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. This >>>>>>>>>>>>>>> diagonal >>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending on >>>>>>>>>>>>>>> discretization) approximation of the Schur complement. The >>>>>>>>>>>>>>> velocity >>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know if you >>>>>>>>>>>>>>> get stuck. >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal >>>>>>>>>>>>>> matrix to use as the preconditioner for the Schur complement solve step as >>>>>>>>>>>>>> you suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>>>>>> A naive approach that I can think of would be to create a >>>>>>>>>>>>>> vector with its components as reciprocal viscosities of the cell centers >>>>>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the Petsc >>>>>>>>>>>>>> distribution of the different rows of the main system matrix over different >>>>>>>>>>>>>> processors ? The main matrix was created using the DMDA structure with 4 >>>>>>>>>>>>>> dof as explained before. >>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for >>>>>>>>>>>>>> the S_p matrix would correspond to only pressure space. Should the >>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks >>>>>>>>>>>>> blocks. The P distribution is the same as the entire problem divided by 4. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size but >>>>>>>>>>>> with dof=1 instead of 4, the vertices for this new DMDA will be identically >>>>>>>>>>>> distributed as for the original DMDA ? Or should I inform PETSc by calling >>>>>>>>>>>> a particular function to make these two DMDA have identical distribution of >>>>>>>>>>>> the vertices ? >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Yes. >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>> presence of "fictitious pressure vertices". 
The system matrix (A) contains >>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. So the preconditioner >>>>>>>>>>>> S_p for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> How can I set this split in the code itself without giving it as >>>>>>>>>> a command line option when the system matrix is assembled from the DMDA for >>>>>>>>>> the whole system with 4 dofs. (i.e. *without* using the >>>>>>>>>> DMComposite or *without* using the nested block matrices to >>>>>>>>>> assemble different blocks separately and then combine them together). >>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in my >>>>>>>>>> code, because not using detect_saddle_point means I cannot use >>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>>> >>>>>>>>> >>>>>>>>> This is currently a real problem with the DMDA. In the >>>>>>>>> unstructured case, where we always need specialized spaces, you can >>>>>>>>> use something like >>>>>>>>> >>>>>>>>> PetscObject pressure; >>>>>>>>> MatNullSpace nullSpacePres; >>>>>>>>> >>>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) >>>>>>>>> nullSpacePres);CHKERRQ(ierr); >>>>>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>>> >>>>>>>>> and then DMGetSubDM() uses this information to attach the null >>>>>>>>> space to the IS that is created using the information in the PetscSection. >>>>>>>>> If you use a PetscSection to set the data layout over the DMDA, I >>>>>>>>> think this works correctly, but this has not been tested at all and is very >>>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>>> mechanism, but we are still working it out. >>>>>>>>> >>>>>>>> >>>>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>>>> model with a quickfix even if it means a little dirty code!) >>>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>> Bottom line: For custom null spaces using the default layout in >>>>>>>>> DMDA, you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and then >>>>>>>>> reach in and get the PC, and the subKSPs. I don't like this at all, but we >>>>>>>>> have not reorganized that code (which could be very simple and >>>>>>>>> inflexible since its very structured). >>>>>>>>> >>>>>>>> >>>>>>>> So I tried to get this approach working but I could not succeed and >>>>>>>> encountered some errors. 
Here is a code snippet: >>>>>>>> >>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 dofs (3 >>>>>>>> velocity components and 1 pressure comp.) >>>>>>>> ierr = DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>> ierr = >>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>>> //I've the mNullSpaceSystem based on mDa, that contains a null space basis >>>>>>>> for the complete system. >>>>>>>> ierr = >>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>> >>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>> >>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that was >>>>>>>> obtained from the options (fieldsplit) >>>>>>>> >>>>>>>> ierr = >>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>>> >>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>> >>>>>>>> KSP *kspSchur; >>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>> ierr = >>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>> ierr = >>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>> using the mDaPressure. >>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>> >>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>> >>>>>>> >>>>>>> Sorry, you need to return to the old DMDA behavior, so you want >>>>>>> >>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>> >>>>>> >>>>>> Thanks, with this it seems I can attach the null space properly, but >>>>>> I have a question regarding whether the Schur complement ksp solver is >>>>>> actually using the preconditioner matrix I provide. >>>>>> When using -ksp_view, the outer level pc object of type fieldsplit >>>>>> does report that: "Preconditioner for the Schur complement formed from user >>>>>> provided matrix", but in the KSP solver for Schur complement S, the pc >>>>>> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >>>>>> matrix I provide. Am I missing something here ? >>>>>> Below are the relevant commented code snippet and the output of the >>>>>> -ksp_view >>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>> >>>>> >>>>> If ILU does not error, it means it is using your matrix, because the >>>>> Schur complement matrix cannot be factored, and FS says it is using your >>>>> matrix. >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Code snippet: >>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The >>>>>> nullspace for the whole system >>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up >>>>>> mKsp with the options provided with fieldsplit and the fields associated >>>>>> with the two splits. 
>>>>>> >>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>> //Get the fieldsplit pc set up from the options >>>>>> >>>>>> ierr = >>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>> >>>>>> KSP *kspSchur; >>>>>> PetscInt kspSchurPos = 1; >>>>>> ierr = >>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>> ierr = >>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>> //Attach the null-space for the Schur complement ksp solver. >>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>> >>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>> >>>>>> >>>>>> >>>>>> the output of the -ksp_view >>>>>> KSP Object: 1 MPI processes >>>>>> type: gmres >>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>> Orthogonalization with no iterative refinement >>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>> maximum iterations=10000, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> has attached null space >>>>>> using PRECONDITIONED norm type for convergence test >>>>>> PC Object: 1 MPI processes >>>>>> type: fieldsplit >>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>> factorization FULL >>>>>> Preconditioner for the Schur complement formed from user provided >>>>>> matrix >>>>>> Split info: >>>>>> Split number 0 Fields 0, 1, 2 >>>>>> Split number 1 Fields 3 >>>>>> KSP solver for A00 block >>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>> type: gmres >>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>> maximum iterations=10000, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> using PRECONDITIONED norm type for convergence test >>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>> type: ilu >>>>>> ILU: out-of-place factorization >>>>>> 0 levels of fill >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: natural >>>>>> factor fill ratio given 1, needed 1 >>>>>> Factored matrix follows: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2187, cols=2187 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> using I-node routines: found 729 nodes, limit used >>>>>> is 5 >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2187, cols=2187 >>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>> type: gmres >>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>> maximum iterations=10000, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> has attached null space >>>>>> using PRECONDITIONED norm type for convergence 
test >>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>> type: ilu >>>>>> ILU: out-of-place factorization >>>>>> 0 levels of fill >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: natural >>>>>> factor fill ratio given 1, needed 1 >>>>>> Factored matrix follows: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=729, cols=729 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> not using I-node routines >>>>>> linear system matrix followed by preconditioner matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: schurcomplement >>>>>> rows=729, cols=729 >>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>> A11 >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=729, cols=729 >>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> not using I-node routines >>>>>> A10 >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=729, cols=2187 >>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> not using I-node routines >>>>>> KSP of A00 >>>>>> KSP Object: (fieldsplit_0_) >>>>>> 1 MPI processes >>>>>> type: gmres >>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>> maximum iterations=10000, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>> divergence=10000 >>>>>> left preconditioning >>>>>> using PRECONDITIONED norm type for convergence test >>>>>> PC Object: (fieldsplit_0_) 1 >>>>>> MPI processes >>>>>> type: ilu >>>>>> ILU: out-of-place factorization >>>>>> 0 levels of fill >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: natural >>>>>> factor fill ratio given 1, needed 1 >>>>>> Factored matrix follows: >>>>>> Matrix Object: 1 MPI >>>>>> processes >>>>>> type: seqaij >>>>>> rows=2187, cols=2187 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=140625, allocated >>>>>> nonzeros=140625 >>>>>> total number of mallocs used during >>>>>> MatSetValues calls =0 >>>>>> using I-node routines: found 729 nodes, >>>>>> limit used is 5 >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2187, cols=2187 >>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> using I-node routines: found 729 nodes, limit >>>>>> used is 5 >>>>>> A01 >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2187, cols=729 >>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> using I-node routines: found 729 nodes, limit used >>>>>> is 5 >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=729, cols=729 >>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2916, cols=2916, bs=4 >>>>>> total: 
nonzeros=250000, allocated nonzeros=250000 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>> >>>>>>> >>>>>>> or >>>>>>> >>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>>> The errors I get when running with options: -pc_type fieldsplit >>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>>>>>>> [0]PETSC ERROR: No support for this operation for this object type! >>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named >>>>>>>> edwards by bkhanal Tue Aug 6 17:35:30 2013 >>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>>>>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>>>>> -with-clanguage=cxx --download-hypre=1 >>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>>>>> WARNING! There are options you set that were not used! >>>>>>>> WARNING! could be spelling mistake, etc! >>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3
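For reference, a minimal sketch of the inverse-viscosity Schur preconditioner assembly discussed in this thread. This is an illustration, not code from the original messages: it assumes PETSc 3.4, a dof-1 DMDA daPres laid out like the dof-4 one (as agreed above), and a hypothetical viscosityAt(i,j,k) lookup for the cell-center viscosity.

#include <petscdmda.h>

extern PetscScalar viscosityAt(PetscInt i, PetscInt j, PetscInt k); /* hypothetical cell-center viscosity lookup */

PetscErrorCode buildInvViscosityPc(DM daPres, Mat *Sp)
{
  PetscErrorCode ierr;
  Vec            invVisc;
  PetscScalar    ***v;
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm;

  PetscFunctionBeginUser;
  /* The matrix and vector inherit the parallel layout of the dof-1 pressure DMDA,
     so the rows automatically line up with the pressure rows of the fieldsplit. */
  ierr = DMCreateMatrix(daPres, MATAIJ, Sp);CHKERRQ(ierr); /* PETSc 3.4 signature */
  ierr = DMCreateGlobalVector(daPres, &invVisc);CHKERRQ(ierr);
  ierr = DMDAGetCorners(daPres, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(daPres, invVisc, &v);CHKERRQ(ierr);
  for (k = zs; k < zs + zm; ++k)
    for (j = ys; j < ys + ym; ++j)
      for (i = xs; i < xs + xm; ++i)
        v[k][j][i] = 1.0/viscosityAt(i, j, k); /* reciprocal viscosity at the cell center */
  ierr = DMDAVecRestoreArray(daPres, invVisc, &v);CHKERRQ(ierr);
  ierr = MatDiagonalSet(*Sp, invVisc, INSERT_VALUES);CHKERRQ(ierr);
  ierr = VecDestroy(&invVisc);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The resulting matrix is what the snippets in this thread pass to PCFieldSplitSchurPrecondition() with PC_FIELDSPLIT_SCHUR_PRE_USER.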
From knepley at gmail.com Fri Aug 23 07:53:20 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 23 Aug 2013 07:53:20 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote: > > On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote: > >> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal wrote: >> >>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley wrote: >>> >>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal wrote: >>>> >>>>> Thanks Matt and Mark for the comments on using the near null space [a question I >>>>> asked in the thread with subject: *problem (Segmentation violation) >>>>> using -pc_type hypre -pc_hypre_type -pilut with multiple nodes in a cluster*]. >>>>> So I understood that I have to set a nearNullSpace to the A00 block, where >>>>> the null space corresponds to the rigid body motions. I tried it but still >>>>> gamg just keeps on iterating and convergence is very, very slow. I am >>>>> not sure what the problem is; right now gamg does not even work for the >>>>> constant viscosity case. >>>>> I have set up the following in my code: >>>>> 1. null space for the whole system A 2. null space for the Schur >>>>> complement S 3. near null space for A00 4. a user preconditioner matrix of >>>>> inverse viscosity in the diagonal for S. >>>>> >>>> >>>> If you want to debug solvers, you HAVE to send -ksp_view. >>>> >>> >>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on, so I didn't >>> get to the end to get the -ksp_view results.
>>> Instead here I have put the -ksp_view output when running the program >>> with the following options: (In this case I get the results) >>> -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 >>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 >>> -ksp_converged_reason -ksp_view >>> >> >> Okay, that looks fine. Does >> >> -fieldsplit_0_pc_type lu >> -fieldsplit_1_ksp_rtol 1.0e-10 >> >> converge in one iterate? >> >> What matrix did you attach as the preconditioner matrix for fieldsplit_1_? >> > > > I used a diagonal matrix with the reciprocal viscosity values of the > corresponding cell centers as the preconditioner. > > With the options -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_rtol > 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason > I get the following output, which means the outer ksp did converge in one > iterate, I guess. > Linear solve converged due to CONVERGED_RTOL iterations 18 > Linear solve converged due to CONVERGED_RTOL iterations 18 > Linear solve converged due to CONVERGED_RTOL iterations 1 > Okay, so A_00 is nonsingular, and the system seems to solve alright. What do you get for -fieldsplit_0_ksp_max_it 30 -fieldsplit_0_pc_type gamg -fieldsplit_0_ksp_converged_reason -fieldsplit_1_ksp_converged_reason This is the kind of investigation you must be comfortable with if you want to experiment with these solvers. Matt > >> >> Thanks, >> >> Matt >> >> >>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>> KSP Object: 1 MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>> left preconditioning >>> has attached null space >>> using PRECONDITIONED norm type for convergence test >>> PC Object: 1 MPI processes >>> type: fieldsplit >>> FieldSplit with Schur preconditioner, blocksize = 4, factorization >>> FULL >>> Preconditioner for the Schur complement formed from user provided >>> matrix >>> Split info: >>> Split number 0 Fields 0, 1, 2 >>> Split number 1 Fields 3 >>> KSP solver for A00 block >>> KSP Object: (fieldsplit_0_) 1 MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>> left preconditioning >>> using PRECONDITIONED norm type for convergence test >>> PC Object: (fieldsplit_0_) 1 MPI processes >>> type: ilu >>> ILU: out-of-place factorization >>> 0 levels of fill >>> tolerance for zero pivot 2.22045e-14 >>> using diagonal shift on blocks to prevent zero pivot >>> matrix ordering: natural >>> factor fill ratio given 1, needed 1 >>> Factored matrix follows: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=8232, cols=8232 >>> package used to perform factorization: petsc >>> total: nonzeros=576000, allocated nonzeros=576000 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 2744 nodes, limit used is 5 >>> linear system matrix = precond matrix: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=8232, cols=8232 >>> total: nonzeros=576000, allocated nonzeros=576000 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found
2744 nodes, limit used is 5 >>> KSP solver for S = A11 - A10 inv(A00) A01 >>> KSP Object: (fieldsplit_1_) 1 MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>> left preconditioning >>> has attached null space >>> using PRECONDITIONED norm type for convergence test >>> PC Object: (fieldsplit_1_) 1 MPI processes >>> type: ilu >>> ILU: out-of-place factorization >>> 0 levels of fill >>> tolerance for zero pivot 2.22045e-14 >>> using diagonal shift on blocks to prevent zero pivot >>> matrix ordering: natural >>> factor fill ratio given 1, needed 1 >>> Factored matrix follows: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2744, cols=2744 >>> package used to perform factorization: petsc >>> total: nonzeros=64000, allocated nonzeros=64000 >>> total number of mallocs used during MatSetValues calls =0 >>> not using I-node routines >>> linear system matrix followed by preconditioner matrix: >>> Matrix Object: 1 MPI processes >>> type: schurcomplement >>> rows=2744, cols=2744 >>> Schur complement A11 - A10 inv(A00) A01 >>> A11 >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2744, cols=2744 >>> total: nonzeros=64000, allocated nonzeros=64000 >>> total number of mallocs used during MatSetValues calls =0 >>> not using I-node routines >>> A10 >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2744, cols=8232 >>> total: nonzeros=192000, allocated nonzeros=192000 >>> total number of mallocs used during MatSetValues calls =0 >>> not using I-node routines >>> KSP of A00 >>> KSP Object: (fieldsplit_0_) 1 >>> MPI processes >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) >>> Gram-Schmidt Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, >>> divergence=10000 >>> left preconditioning >>> using PRECONDITIONED norm type for convergence test >>> PC Object: (fieldsplit_0_) 1 >>> MPI processes >>> type: ilu >>> ILU: out-of-place factorization >>> 0 levels of fill >>> tolerance for zero pivot 2.22045e-14 >>> using diagonal shift on blocks to prevent zero pivot >>> matrix ordering: natural >>> factor fill ratio given 1, needed 1 >>> Factored matrix follows: >>> Matrix Object: 1 MPI >>> processes >>> type: seqaij >>> rows=8232, cols=8232 >>> package used to perform factorization: petsc >>> total: nonzeros=576000, allocated nonzeros=576000 >>> total number of mallocs used during MatSetValues >>> calls =0 >>> using I-node routines: found 2744 nodes, limit >>> used is 5 >>> linear system matrix = precond matrix: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=8232, cols=8232 >>> total: nonzeros=576000, allocated nonzeros=576000 >>> total number of mallocs used during MatSetValues calls >>> =0 >>> using I-node routines: found 2744 nodes, limit used >>> is 5 >>> A01 >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=8232, cols=2744 >>> total: nonzeros=192000, allocated nonzeros=192000 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 2744 nodes, limit used is >>> 5 >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=2744, cols=2744 >>> total: nonzeros=64000, allocated nonzeros=64000 >>> total number of 
mallocs used during MatSetValues calls =0 >>> not using I-node routines >>> linear system matrix = precond matrix: >>> Matrix Object: 1 MPI processes >>> type: seqaij >>> rows=10976, cols=10976, bs=4 >>> total: nonzeros=1024000, allocated nonzeros=1024000 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 2744 nodes, limit used is 5 >>> >>> >>> >>> >>>> Matt >>>> >>>> >>>>> I am testing a small problem with CONSTANT viscosity for grid size of >>>>> 14^3 with the run time option: >>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>> >>>>> Here is my relevant code of the solve function: >>>>> PetscErrorCode ierr; >>>>> PetscFunctionBeginUser; >>>>> ierr = >>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>> ierr = >>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa with >>>>> dof = 4, vx,vy,vz and p. >>>>> ierr = KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace >>>>> for the main system >>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //register >>>>> the fieldsplits obtained from options. >>>>> >>>>> //Setting up user PC for Schur Complement >>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>> ierr = >>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>> >>>>> KSP *subKsp; >>>>> PetscInt subKspPos = 0; >>>>> //Set up nearNullspace for A00 block. >>>>> ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>> MatNullSpace rigidBodyModes; >>>>> Vec coords; >>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>> ierr = >>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>> Mat matA00; >>>>> ierr = KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>> ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>> >>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>> pressure space >>>>> subKspPos = 1; >>>>> ierr = PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>> //Set up the null space of constant pressure. 
>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>> PetscBool isNull; >>>>> Mat matSc; >>>>> ierr = KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>> if(!isNull) >>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid pressure >>>>> null space \n"); >>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>> if(!isNull) >>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system >>>>> null space \n"); >>>>> >>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>> >>>>> PetscFunctionReturn(0); >>>>> >>>>> [...]
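As a side note on the gamg experiments in this exchange, a sketch of an alternative that is not from the thread itself: instead of composing the rigid-body near-null space on the matrix with MatNullSpaceCreateRigidBody()/MatSetNearNullSpace() as in the snippet above, gamg can be handed the node coordinates and build those modes itself via PCSetCoordinates(). Here subKsp is the array returned by PCFieldSplitGetSubKSP() as used earlier, while nVelNodes and velCoords are hypothetical names for the local velocity-node count and their interleaved x,y,z coordinates.

PC pcA00;
ierr = KSPGetPC(subKsp[0], &pcA00);CHKERRQ(ierr);
/* dim = 3; nVelNodes and velCoords are assumed names, not variables from the thread;
   gamg derives the six 3D rigid-body modes from these coordinates */
ierr = PCSetCoordinates(pcA00, 3, nVelNodes, velCoords);CHKERRQ(ierr);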
-- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener From bisheshkh at gmail.com Fri Aug 23 08:01:34 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Fri, 23 Aug 2013 15:01:34 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Fri, Aug 23, 2013 at 2:53 PM, Matthew Knepley wrote: > On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote: > >> On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote: >> >>> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal wrote: >>> >>>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley wrote: >>>> >>>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal wrote: >>>>> >>>>>> Thanks Matt and Mark for the comments on using the near null space [a question >>>>>> I asked in the thread with subject: *problem (Segmentation >>>>>> violation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes >>>>>> in a cluster*]. >>>>>> So I understood that I have to set a nearNullSpace to the A00 block, where >>>>>> the null space corresponds to the rigid body motions. I tried it but still >>>>>> gamg just keeps on iterating and convergence is very, very slow. I am >>>>>> not sure what the problem is; right now gamg does not even work for the >>>>>> constant viscosity case. >>>>>> I have set up the following in my code: >>>>>> 1. null space for the whole system A 2. null space for the Schur >>>>>> complement S 3. near null space for A00 4. a user preconditioner matrix of >>>>>> inverse viscosity in the diagonal for S. >>>>>> >>>>> >>>>> If you want to debug solvers, you HAVE to send -ksp_view. >>>>> >>>> >>>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on, so I >>>> didn't get to the end to get the -ksp_view results.
>>>> Instead here I have put the -ksp_view output when running the program >>>> with the following options: (In this case I get the results) >>>> -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits 0 >>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 >>>> -ksp_converged_reason -ksp_view >>>> >>> >>> Okay, that looks fine. Does >>> >>> -fieldsplit_0_pc_type lu >>> -fieldsplit_1_ksp_rtol 1.0e-10 >>> >>> converge in one iterate? >>> >>> What matrix did you attach as the preconditioner matrix for >>> fieldsplit_1_? >>> >> >> I used a diagonal matrix with the reciprocal viscosity values of the >> corresponding cell centers as the preconditioner. >> >> With the options -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_rtol >> 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason >> I get the following output, which means the outer ksp did converge in one >> iterate, I guess. >> Linear solve converged due to CONVERGED_RTOL iterations 18 >> Linear solve converged due to CONVERGED_RTOL iterations 18 >> Linear solve converged due to CONVERGED_RTOL iterations 1 >> > > Okay, so A_00 is nonsingular, and the system seems to solve alright. What > do you get for > > -fieldsplit_0_ksp_max_it 30 > -fieldsplit_0_pc_type gamg > -fieldsplit_0_ksp_converged_reason > -fieldsplit_1_ksp_converged_reason > > The fieldsplit_0_ solve does not converge in 30 iterations. It gives: Linear solve converged due to CONVERGED_ATOL iterations 0 Linear solve did not converge due to DIVERGED_ITS iterations 30 and continues with the same message. > This is the kind of investigation you must be comfortable with if you want > to experiment with these solvers. > > Matt > > >> >>> >>> Thanks, >>> >>> Matt >>> >>>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>>> KSP Object: 1 MPI processes >>>> type: gmres >>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>> Orthogonalization with no iterative refinement >>>> GMRES: happy breakdown tolerance 1e-30 >>>> maximum iterations=10000, initial guess is zero >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> has attached null space >>>> using PRECONDITIONED norm type for convergence test >>>> PC Object: 1 MPI processes >>>> type: fieldsplit >>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>> factorization FULL >>>> Preconditioner for the Schur complement formed from user provided >>>> matrix >>>> Split info: >>>> Split number 0 Fields 0, 1, 2 >>>> Split number 1 Fields 3 >>>> KSP solver for A00 block >>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>> type: gmres >>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>> Orthogonalization with no iterative refinement >>>> GMRES: happy breakdown tolerance 1e-30 >>>> maximum iterations=10000, initial guess is zero >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> using PRECONDITIONED norm type for convergence test >>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>> type: ilu >>>> ILU: out-of-place factorization >>>> 0 levels of fill >>>> tolerance for zero pivot 2.22045e-14 >>>> using diagonal shift on blocks to prevent zero pivot >>>> matrix ordering: natural >>>> factor fill ratio given 1, needed 1 >>>> Factored matrix follows: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=8232, cols=8232 >>>> package used to perform factorization: petsc >>>> total: nonzeros=576000, allocated nonzeros=576000 >>>> total number of mallocs used during MatSetValues
calls >>>> =0 >>>> using I-node routines: found 2744 nodes, limit used >>>> is 5 >>>> linear system matrix = precond matrix: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=8232, cols=8232 >>>> total: nonzeros=576000, allocated nonzeros=576000 >>>> total number of mallocs used during MatSetValues calls =0 >>>> using I-node routines: found 2744 nodes, limit used is 5 >>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>> type: gmres >>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>> Orthogonalization with no iterative refinement >>>> GMRES: happy breakdown tolerance 1e-30 >>>> maximum iterations=10000, initial guess is zero >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>> left preconditioning >>>> has attached null space >>>> using PRECONDITIONED norm type for convergence test >>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>> type: ilu >>>> ILU: out-of-place factorization >>>> 0 levels of fill >>>> tolerance for zero pivot 2.22045e-14 >>>> using diagonal shift on blocks to prevent zero pivot >>>> matrix ordering: natural >>>> factor fill ratio given 1, needed 1 >>>> Factored matrix follows: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=2744, cols=2744 >>>> package used to perform factorization: petsc >>>> total: nonzeros=64000, allocated nonzeros=64000 >>>> total number of mallocs used during MatSetValues calls >>>> =0 >>>> not using I-node routines >>>> linear system matrix followed by preconditioner matrix: >>>> Matrix Object: 1 MPI processes >>>> type: schurcomplement >>>> rows=2744, cols=2744 >>>> Schur complement A11 - A10 inv(A00) A01 >>>> A11 >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=2744, cols=2744 >>>> total: nonzeros=64000, allocated nonzeros=64000 >>>> total number of mallocs used during MatSetValues calls >>>> =0 >>>> not using I-node routines >>>> A10 >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=2744, cols=8232 >>>> total: nonzeros=192000, allocated nonzeros=192000 >>>> total number of mallocs used during MatSetValues calls >>>> =0 >>>> not using I-node routines >>>> KSP of A00 >>>> KSP Object: (fieldsplit_0_) 1 >>>> MPI processes >>>> type: gmres >>>> GMRES: restart=30, using Classical (unmodified) >>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>> GMRES: happy breakdown tolerance 1e-30 >>>> maximum iterations=10000, initial guess is zero >>>> tolerances: relative=1e-05, absolute=1e-50, >>>> divergence=10000 >>>> left preconditioning >>>> using PRECONDITIONED norm type for convergence test >>>> PC Object: (fieldsplit_0_) 1 >>>> MPI processes >>>> type: ilu >>>> ILU: out-of-place factorization >>>> 0 levels of fill >>>> tolerance for zero pivot 2.22045e-14 >>>> using diagonal shift on blocks to prevent zero pivot >>>> matrix ordering: natural >>>> factor fill ratio given 1, needed 1 >>>> Factored matrix follows: >>>> Matrix Object: 1 MPI >>>> processes >>>> type: seqaij >>>> rows=8232, cols=8232 >>>> package used to perform factorization: petsc >>>> total: nonzeros=576000, allocated >>>> nonzeros=576000 >>>> total number of mallocs used during >>>> MatSetValues calls =0 >>>> using I-node routines: found 2744 nodes, >>>> limit used is 5 >>>> linear system matrix = precond matrix: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=8232, cols=8232 >>>> total: nonzeros=576000, allocated nonzeros=576000 >>>> total number of mallocs used during MatSetValues >>>> calls =0 >>>> using I-node 
routines: found 2744 nodes, limit used >>>> is 5 >>>> A01 >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=8232, cols=2744 >>>> total: nonzeros=192000, allocated nonzeros=192000 >>>> total number of mallocs used during MatSetValues calls >>>> =0 >>>> using I-node routines: found 2744 nodes, limit used >>>> is 5 >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=2744, cols=2744 >>>> total: nonzeros=64000, allocated nonzeros=64000 >>>> total number of mallocs used during MatSetValues calls =0 >>>> not using I-node routines >>>> linear system matrix = precond matrix: >>>> Matrix Object: 1 MPI processes >>>> type: seqaij >>>> rows=10976, cols=10976, bs=4 >>>> total: nonzeros=1024000, allocated nonzeros=1024000 >>>> total number of mallocs used during MatSetValues calls =0 >>>> using I-node routines: found 2744 nodes, limit used is 5 >>>> >>>> >>>> >>>> >>>>> Matt >>>>> >>>>> >>>>>> I am testing a small problem with CONSTANT viscosity for grid size of >>>>>> 14^3 with the run time option: >>>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>>> >>>>>> Here is my relevant code of the solve function: >>>>>> PetscErrorCode ierr; >>>>>> PetscFunctionBeginUser; >>>>>> ierr = >>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>> ierr = >>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa with >>>>>> dof = 4, vx,vy,vz and p. >>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace >>>>>> for the main system >>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //register >>>>>> the fieldsplits obtained from options. >>>>>> >>>>>> //Setting up user PC for Schur Complement >>>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>>> ierr = >>>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>> >>>>>> KSP *subKsp; >>>>>> PetscInt subKspPos = 0; >>>>>> //Set up nearNullspace for A00 block. >>>>>> ierr = >>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>> MatNullSpace rigidBodyModes; >>>>>> Vec coords; >>>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>>> ierr = >>>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>>> Mat matA00; >>>>>> ierr = KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>>> ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>>> >>>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>>> pressure space >>>>>> subKspPos = 1; >>>>>> ierr = >>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>> //Set up the null space of constant pressure. 
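>>>>>> /* (Illustrative comment, not in the original mail: a constant-pressure
>>>>>>    null space such as mNullSpaceP would typically have been created
>>>>>>    beforehand with
>>>>>>      MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&mNullSpaceP);
>>>>>>    where PETSC_TRUE puts the constant vector into the null space basis.) */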
>>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>>> PetscBool isNull; >>>>>> Mat matSc; >>>>>> ierr = KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>>> if(!isNull) >>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid pressure >>>>>> null space \n"); >>>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>>> if(!isNull) >>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system >>>>>> null space \n"); >>>>>> >>>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>>> >>>>>> >>>>>> PetscFunctionReturn(0); >>>>>> >>>>>> >>>>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal wrote: >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley >>>>>>> > wrote: >>>>>>>> >>>>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal < >>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> > Now, I implemented two different approaches, each for >>>>>>>>>>>>>>>>> both 2D and 3D, in >>>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have >>>>>>>>>>>>>>>>> problems solving it for >>>>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>>>> > faces. Similar split up of K to cell center and faces to >>>>>>>>>>>>>>>>> account for the >>>>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem and I >>>>>>>>>>>>>>>>> recommend >>>>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement reduction >>>>>>>>>>>>>>>>> (make that >>>>>>>>>>>>>>>>> work first, then switch to block preconditioner). You can >>>>>>>>>>>>>>>>> use PCLSC or >>>>>>>>>>>>>>>>> (probably better for you), assemble a preconditioning >>>>>>>>>>>>>>>>> matrix containing >>>>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. 
>>>>>>>>>>>>>>>>> This diagonal >>>>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, depending >>>>>>>>>>>>>>>>> on >>>>>>>>>>>>>>>>> discretization) approximation of the Schur complement. >>>>>>>>>>>>>>>>> The velocity >>>>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know if you >>>>>>>>>>>>>>>>> get stuck. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal >>>>>>>>>>>>>>>> matrix to use as the preconditioner for the Schur complement solve step as >>>>>>>>>>>>>>>> you suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>>>>>>>> A naive approach that I can think of would be to create a >>>>>>>>>>>>>>>> vector with its components as reciprocal viscosities of the cell centers >>>>>>>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the >>>>>>>>>>>>>>>> Petsc distribution of the different rows of the main system matrix over >>>>>>>>>>>>>>>> different processors ? The main matrix was created using the DMDA structure >>>>>>>>>>>>>>>> with 4 dof as explained before. >>>>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for >>>>>>>>>>>>>>>> the S_p matrix would correspond to only pressure space. Should the >>>>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks >>>>>>>>>>>>>>> blocks. The P distribution is the same as the entire problem divided by 4. >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size >>>>>>>>>>>>>> but with dof=1 instead of 4, the vertices for this new DMDA will be >>>>>>>>>>>>>> identically distributed as for the original DMDA ? Or should I inform PETSc >>>>>>>>>>>>>> by calling a particular function to make these two DMDA have identical >>>>>>>>>>>>>> distribution of the vertices ? >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Yes. >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>>>> presence of "fictitious pressure vertices". The system matrix (A) contains >>>>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. So the preconditioner >>>>>>>>>>>>>> S_p for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? 
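>>>>>>>>>> (A minimal sketch of that naive diagonal construction, added here
>>>>>>>>>> for illustration only -- it is not from the original mail. Here
>>>>>>>>>> mDaPressure is a hypothetical dof-1 DMDA matching the pressure
>>>>>>>>>> layout, viscosityAt() is a placeholder for the cell-center
>>>>>>>>>> viscosity, and the DMCreateMatrix() signature is the PETSc 3.4 one
>>>>>>>>>> used elsewhere in this thread:)
>>>>>>>>>>
>>>>>>>>>> PetscErrorCode ierr;
>>>>>>>>>> Vec            invEta;
>>>>>>>>>> Mat            Sp;
>>>>>>>>>> PetscScalar ***v;
>>>>>>>>>> PetscInt       i,j,k,xs,ys,zs,xm,ym,zm;
>>>>>>>>>> ierr = DMCreateMatrix(mDaPressure,MATAIJ,&Sp);CHKERRQ(ierr);
>>>>>>>>>> ierr = DMCreateGlobalVector(mDaPressure,&invEta);CHKERRQ(ierr);
>>>>>>>>>> ierr = DMDAGetCorners(mDaPressure,&xs,&ys,&zs,&xm,&ym,&zm);CHKERRQ(ierr);
>>>>>>>>>> ierr = DMDAVecGetArray(mDaPressure,invEta,&v);CHKERRQ(ierr);
>>>>>>>>>> for (k=zs; k<zs+zm; ++k)
>>>>>>>>>>   for (j=ys; j<ys+ym; ++j)
>>>>>>>>>>     for (i=xs; i<xs+xm; ++i)
>>>>>>>>>>       v[k][j][i] = 1.0/viscosityAt(i,j,k); /* reciprocal viscosity at the cell center */
>>>>>>>>>> ierr = DMDAVecRestoreArray(mDaPressure,invEta,&v);CHKERRQ(ierr);
>>>>>>>>>> ierr = MatDiagonalSet(Sp,invEta,INSERT_VALUES);CHKERRQ(ierr);
>>>>>>>>>> /* Sp could then be handed to the fieldsplit PC via
>>>>>>>>>>    PCFieldSplitSchurPrecondition(pc,PC_FIELDSPLIT_SCHUR_PRE_USER,Sp); */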
>>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> How can I set this split in the code itself without giving it >>>>>>>>>>>> as a command line option when the system matrix is assembled from the DMDA >>>>>>>>>>>> for the whole system with 4 dofs. (i.e. *without* using the >>>>>>>>>>>> DMComposite or *without* using the nested block matrices to >>>>>>>>>>>> assemble different blocks separately and then combine them together). >>>>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in my >>>>>>>>>>>> code, because not using detect_saddle_point means I cannot use >>>>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> This is currently a real problem with the DMDA. In the >>>>>>>>>>> unstructured case, where we always need specialized spaces, you can >>>>>>>>>>> use something like >>>>>>>>>>> >>>>>>>>>>> PetscObject pressure; >>>>>>>>>>> MatNullSpace nullSpacePres; >>>>>>>>>>> >>>>>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", >>>>>>>>>>> (PetscObject) nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> and then DMGetSubDM() uses this information to attach the null >>>>>>>>>>> space to the IS that is created using the information in the PetscSection. >>>>>>>>>>> If you use a PetscSection to set the data layout over the DMDA, >>>>>>>>>>> I think this works correctly, but this has not been tested at all and is >>>>>>>>>>> very >>>>>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>>>>> mechanism, but we are still working it out. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>>>>>> model with a quickfix even if it means a little dirty code!) >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Bottom line: For custom null spaces using the default layout in >>>>>>>>>>> DMDA, you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and then >>>>>>>>>>> reach in and get the PC, and the subKSPs. I don't like this at all, but we >>>>>>>>>>> have not reorganized that code (which could be very simple and >>>>>>>>>>> inflexible since its very structured). >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> So I tried to get this approach working but I could not succeed >>>>>>>>>> and encountered some errors. Here is a code snippet: >>>>>>>>>> >>>>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 dofs >>>>>>>>>> (3 velocity components and 1 pressure comp.) 
>>>>>>>>>> ierr = >>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>> ierr = >>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>>>> ierr = >>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've the >>>>>>>>>> mNullSpaceSystem based on mDa, that contains a null space basis for the >>>>>>>>>> complete system. >>>>>>>>>> ierr = >>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>>>> >>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that >>>>>>>>>> was obtained from the options (fieldsplit) >>>>>>>>>> >>>>>>>>>> ierr = >>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>>>>> >>>>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>>> KSP *kspSchur; >>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>> ierr = >>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>> ierr = >>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>>>> using the mDaPressure. >>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>> >>>>>>>>> Sorry, you need to return to the old DMDA behavior, so you want >>>>>>>>> >>>>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>>>> >>>>>>>> >>>>>>>> Thanks, with this it seems I can attach the null space properly, >>>>>>>> but I have a question regarding whether the Schur complement ksp solver is >>>>>>>> actually using the preconditioner matrix I provide. >>>>>>>> When using -ksp_view, the outer level pc object of type fieldsplit >>>>>>>> does report that: "Preconditioner for the Schur complement formed from user >>>>>>>> provided matrix", but in the KSP solver for Schur complement S, the pc >>>>>>>> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >>>>>>>> matrix I provide. Am I missing something here ? >>>>>>>> Below are the relevant commented code snippet and the output of the >>>>>>>> -ksp_view >>>>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>>>> >>>>>>> >>>>>>> If ILU does not error, it means it is using your matrix, because the >>>>>>> Schur complement matrix cannot be factored, and FS says it is using your >>>>>>> matrix. >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> Code snippet: >>>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>>> //The nullspace for the whole system >>>>>>>> ierr = >>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set up >>>>>>>> mKsp with the options provided with fieldsplit and the fields associated >>>>>>>> with the two splits. 
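>>>>>>>> /* (Clarifying comment, not in the original mail: the KSPSetUp() call
>>>>>>>>    above has to precede the PCFieldSplitGetSubKSP() call below -- the
>>>>>>>>    per-split sub-KSPs only exist once the fieldsplit PC has been set
>>>>>>>>    up, which is the subtlety described earlier in this thread.) */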
>>>>>>>> >>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>>>> //Get the fieldsplit pc set up from the options >>>>>>>> >>>>>>>> ierr = >>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>>>> >>>>>>>> KSP *kspSchur; >>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>> ierr = >>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>> ierr = >>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>> //Attach the null-space for the Schur complement ksp solver. >>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>> >>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> the output of the -ksp_view >>>>>>>> KSP Object: 1 MPI processes >>>>>>>> type: gmres >>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>>> Orthogonalization with no iterative refinement >>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>> left preconditioning >>>>>>>> has attached null space >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: 1 MPI processes >>>>>>>> type: fieldsplit >>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>> factorization FULL >>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>> provided matrix >>>>>>>> Split info: >>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>> Split number 1 Fields 3 >>>>>>>> KSP solver for A00 block >>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>> type: gmres >>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>> divergence=10000 >>>>>>>> left preconditioning >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>> type: ilu >>>>>>>> ILU: out-of-place factorization >>>>>>>> 0 levels of fill >>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>> matrix ordering: natural >>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>> Factored matrix follows: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2187, cols=2187 >>>>>>>> package used to perform factorization: petsc >>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>> used is 5 >>>>>>>> linear system matrix = precond matrix: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2187, cols=2187 >>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>> type: gmres >>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> 
tolerances: relative=1e-05, absolute=1e-50, >>>>>>>> divergence=10000 >>>>>>>> left preconditioning >>>>>>>> has attached null space >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>> type: ilu >>>>>>>> ILU: out-of-place factorization >>>>>>>> 0 levels of fill >>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>> matrix ordering: natural >>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>> Factored matrix follows: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=729, cols=729 >>>>>>>> package used to perform factorization: petsc >>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> not using I-node routines >>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: schurcomplement >>>>>>>> rows=729, cols=729 >>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>> A11 >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=729, cols=729 >>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> not using I-node routines >>>>>>>> A10 >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=729, cols=2187 >>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> not using I-node routines >>>>>>>> KSP of A00 >>>>>>>> KSP Object: >>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>> type: gmres >>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>> divergence=10000 >>>>>>>> left preconditioning >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: (fieldsplit_0_) >>>>>>>> 1 MPI processes >>>>>>>> type: ilu >>>>>>>> ILU: out-of-place factorization >>>>>>>> 0 levels of fill >>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>> using diagonal shift on blocks to prevent zero >>>>>>>> pivot >>>>>>>> matrix ordering: natural >>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>> Factored matrix follows: >>>>>>>> Matrix Object: 1 MPI >>>>>>>> processes >>>>>>>> type: seqaij >>>>>>>> rows=2187, cols=2187 >>>>>>>> package used to perform factorization: petsc >>>>>>>> total: nonzeros=140625, allocated >>>>>>>> nonzeros=140625 >>>>>>>> total number of mallocs used during >>>>>>>> MatSetValues calls =0 >>>>>>>> using I-node routines: found 729 nodes, >>>>>>>> limit used is 5 >>>>>>>> linear system matrix = precond matrix: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2187, cols=2187 >>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>> used is 5 >>>>>>>> A01 >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2187, cols=729 >>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>> used is 5 >>>>>>>> Matrix 
Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=729, cols=729 >>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> not using I-node routines >>>>>>>> linear system matrix = precond matrix: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2916, cols=2916, bs=4 >>>>>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>> or >>>>>>>>> >>>>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> The errors I get when running with options: -pc_type fieldsplit >>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>> ------------------------------------ >>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>> type! >>>>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>> shooting. >>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named >>>>>>>>>> edwards by bkhanal Tue Aug 6 17:35:30 2013 >>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>>>>>>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>>>>>>> -with-clanguage=cxx --download-hypre=1 >>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>>>>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>>>>>>> WARNING! There are options you set that were not used! >>>>>>>>>> WARNING! could be spelling mistake, etc! 
>>>>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3
>>>>>>>>>>
>>>>>>>>>>> Matt
>>>>>>>>>>>
>>>>>>>>>>>>> Matt
>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Matt
>>>>>>>>>>>>>>>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener

From knepley at gmail.com  Fri Aug 23 08:16:56 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 23 Aug 2013 08:16:56 -0500
Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid
In-Reply-To: 
References: <87li5555oo.fsf@mcs.anl.gov>
Message-ID: 

On Fri, Aug 23, 2013 at 8:01 AM, Bishesh Khanal wrote:

> On Fri, Aug 23, 2013 at 2:53 PM, Matthew Knepley wrote:
>
>> On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote:
>>
>>> On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote:
>>>
>>>> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal wrote:
>>>>
>>>>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley wrote:
>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal wrote:
>>>>>>
>>>>>>> Thanks Matt and Mark for comments in using near null space [question
>>>>>>> I asked in the thread with subject: *problem (Segmentation
>>>>>>> voilation) using -pc_type hypre -pc_hypre_type -pilut with multiple nodes
>>>>>>> in a cluster*].
>>>>>>> So I understood that I have to set a nearNullSpace to A00 block >>>>>>> where the null space correspond to the rigid body motion. I tried it but >>>>>>> still the gamg just keeps on iterating and convergence is very very slow. I >>>>>>> am not sure what the problem is, right now gamg does not even work for the >>>>>>> constant viscosity case. >>>>>>> I have set up the following in my code: >>>>>>> 1. null space for the whole system A 2. null space for the Schur >>>>>>> complement S 3. Near null space for A00 4. a user preconditioner matrix of >>>>>>> inverse viscosity in the diagonal for S. >>>>>>> >>>>>> >>>>>> If you want to debug solvers, you HAVE to send -ksp_view. >>>>>> >>>>> >>>>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on so >>>>> didn't get to the end to get -ksp_view results. >>>>> Instead here I have put the -ksp_view output when running the program >>>>> with following options: (In this case I get the results) >>>>> -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_dm_splits >>>>> 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 >>>>> -ksp_converged_reason -ksp_view >>>>> >>>> >>>> Okay, that looks fine. Does >>>> >>>> -fieldsplit_0_pc_type lu >>>> - fieldsplit_1_ksp_rtol 1.0e-10 >>>> >>>> converge in one Iterate? >>>> >>>> What matrix did you attach as the preconditioner matrix for >>>> fieldsplit_1_? >>>> >>> >>> >>> I used a diagonal matrix with reciprocal of viscosity values of the >>> corresponding cell centers as the preconditioner. >>> >>> with the options -fieldsplit_0_pc_type lu - fieldsplit_1_ksp_rtol >>> 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason >>> I get the following output which means the outer ksp did converge in one >>> iterate I guess. >>> Linear solve converged due to CONVERGED_RTOL iterations 18 >>> Linear solve converged due to CONVERGED_RTOL iterations 18 >>> Linear solve converged due to CONVERGED_RTOL iterations 1 >>> >> >> Okay, so A_00 is nonsingular, and the system seems to solve alright. What >> do you get for >> >> -fieldsplit_0_ksp_max_it 30 >> -fieldsplit_0_pc_type gamg >> -fieldsplit_0_ksp_converged_reason >> -fieldsplit_1_ksp_converged_reason >> >> > > It fieldsplit_0_ does not converge in 30 iterations. It gives: > Linear solve converged due to CONVERGED_ATOL iterations 0 > Linear solve did not converge due to DIVERGED_ITS iterations 30 > > and continues with the same message. > So what would you do? Give up? -fieldsplit_0_ksp_gmres_restart 200 The idea is to figure out what is going on: -fieldsplit_0_ksp_monitor_true_residual Matt > > > >> This is the kind of investigation you msut be comfortable with if you >> want to experiment with these solvers. 
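(Collecting the options suggested so far into one full command line -- the
executable name and any grid options are placeholders, not from the thread:)

  ./solver -pc_type fieldsplit -pc_fieldsplit_type schur \
      -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 \
      -pc_fieldsplit_1_fields 3 -fieldsplit_0_pc_type gamg \
      -fieldsplit_0_ksp_gmres_restart 200 \
      -fieldsplit_0_ksp_monitor_true_residual \
      -fieldsplit_0_ksp_converged_reason \
      -fieldsplit_1_ksp_converged_reason -ksp_converged_reason

The true-residual monitor should show whether the fieldsplit_0_ (velocity
block) residual under gamg is actually stagnating or just converging slowly.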
>> >> Matt >> >> >>> >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>>>> KSP Object: 1 MPI processes >>>>> type: gmres >>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>> Orthogonalization with no iterative refinement >>>>> GMRES: happy breakdown tolerance 1e-30 >>>>> maximum iterations=10000, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> has attached null space >>>>> using PRECONDITIONED norm type for convergence test >>>>> PC Object: 1 MPI processes >>>>> type: fieldsplit >>>>> FieldSplit with Schur preconditioner, blocksize = 4, factorization >>>>> FULL >>>>> Preconditioner for the Schur complement formed from user provided >>>>> matrix >>>>> Split info: >>>>> Split number 0 Fields 0, 1, 2 >>>>> Split number 1 Fields 3 >>>>> KSP solver for A00 block >>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>> type: gmres >>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>> Orthogonalization with no iterative refinement >>>>> GMRES: happy breakdown tolerance 1e-30 >>>>> maximum iterations=10000, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> using PRECONDITIONED norm type for convergence test >>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>> type: ilu >>>>> ILU: out-of-place factorization >>>>> 0 levels of fill >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: natural >>>>> factor fill ratio given 1, needed 1 >>>>> Factored matrix follows: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=8232, cols=8232 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> using I-node routines: found 2744 nodes, limit used >>>>> is 5 >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=8232, cols=8232 >>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>> type: gmres >>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>> Orthogonalization with no iterative refinement >>>>> GMRES: happy breakdown tolerance 1e-30 >>>>> maximum iterations=10000, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>> left preconditioning >>>>> has attached null space >>>>> using PRECONDITIONED norm type for convergence test >>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>> type: ilu >>>>> ILU: out-of-place factorization >>>>> 0 levels of fill >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: natural >>>>> factor fill ratio given 1, needed 1 >>>>> Factored matrix follows: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2744, cols=2744 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> not using I-node routines >>>>> linear system matrix followed by preconditioner matrix: >>>>> Matrix Object: 
1 MPI processes >>>>> type: schurcomplement >>>>> rows=2744, cols=2744 >>>>> Schur complement A11 - A10 inv(A00) A01 >>>>> A11 >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2744, cols=2744 >>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> not using I-node routines >>>>> A10 >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2744, cols=8232 >>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> not using I-node routines >>>>> KSP of A00 >>>>> KSP Object: (fieldsplit_0_) 1 >>>>> MPI processes >>>>> type: gmres >>>>> GMRES: restart=30, using Classical (unmodified) >>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>> GMRES: happy breakdown tolerance 1e-30 >>>>> maximum iterations=10000, initial guess is zero >>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>> divergence=10000 >>>>> left preconditioning >>>>> using PRECONDITIONED norm type for convergence test >>>>> PC Object: (fieldsplit_0_) 1 >>>>> MPI processes >>>>> type: ilu >>>>> ILU: out-of-place factorization >>>>> 0 levels of fill >>>>> tolerance for zero pivot 2.22045e-14 >>>>> using diagonal shift on blocks to prevent zero pivot >>>>> matrix ordering: natural >>>>> factor fill ratio given 1, needed 1 >>>>> Factored matrix follows: >>>>> Matrix Object: 1 MPI >>>>> processes >>>>> type: seqaij >>>>> rows=8232, cols=8232 >>>>> package used to perform factorization: petsc >>>>> total: nonzeros=576000, allocated >>>>> nonzeros=576000 >>>>> total number of mallocs used during >>>>> MatSetValues calls =0 >>>>> using I-node routines: found 2744 nodes, >>>>> limit used is 5 >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=8232, cols=8232 >>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>> total number of mallocs used during MatSetValues >>>>> calls =0 >>>>> using I-node routines: found 2744 nodes, limit >>>>> used is 5 >>>>> A01 >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=8232, cols=2744 >>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>> total number of mallocs used during MatSetValues calls >>>>> =0 >>>>> using I-node routines: found 2744 nodes, limit used >>>>> is 5 >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=2744, cols=2744 >>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> not using I-node routines >>>>> linear system matrix = precond matrix: >>>>> Matrix Object: 1 MPI processes >>>>> type: seqaij >>>>> rows=10976, cols=10976, bs=4 >>>>> total: nonzeros=1024000, allocated nonzeros=1024000 >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>> >>>>> >>>>> >>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> I am testing a small problem with CONSTANT viscosity for grid size >>>>>>> of 14^3 with the run time option: >>>>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>>>> >>>>>>> Here is my relevant code of the solve function: 
>>>>>>> PetscErrorCode ierr; >>>>>>> PetscFunctionBeginUser; >>>>>>> ierr = >>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>> ierr = >>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa with >>>>>>> dof = 4, vx,vy,vz and p. >>>>>>> ierr = >>>>>>> KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for the main >>>>>>> system >>>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //register >>>>>>> the fieldsplits obtained from options. >>>>>>> >>>>>>> //Setting up user PC for Schur Complement >>>>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>>>> ierr = >>>>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>> >>>>>>> KSP *subKsp; >>>>>>> PetscInt subKspPos = 0; >>>>>>> //Set up nearNullspace for A00 block. >>>>>>> ierr = >>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>> MatNullSpace rigidBodyModes; >>>>>>> Vec coords; >>>>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>>>> ierr = >>>>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>>>> Mat matA00; >>>>>>> ierr = >>>>>>> KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>>>> ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>>>> >>>>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>>>> pressure space >>>>>>> subKspPos = 1; >>>>>>> ierr = >>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>> //Set up the null space of constant pressure. >>>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>>>> PetscBool isNull; >>>>>>> Mat matSc; >>>>>>> ierr = KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>>>> if(!isNull) >>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>> pressure null space \n"); >>>>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>>>> if(!isNull) >>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system >>>>>>> null space \n"); >>>>>>> >>>>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>>>> >>>>>>> >>>>>>> PetscFunctionReturn(0); >>>>>>> >>>>>>> >>>>>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal >>>>>>> > wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal < >>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>>>>> bisheshkh at gmail.com> 
wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> > Now, I implemented two different approaches, each for >>>>>>>>>>>>>>>>>> both 2D and 3D, in >>>>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have >>>>>>>>>>>>>>>>>> problems solving it for >>>>>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>>>>> > faces. Similar split up of K to cell center and faces >>>>>>>>>>>>>>>>>> to account for the >>>>>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem and >>>>>>>>>>>>>>>>>> I recommend >>>>>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement >>>>>>>>>>>>>>>>>> reduction (make that >>>>>>>>>>>>>>>>>> work first, then switch to block preconditioner). You >>>>>>>>>>>>>>>>>> can use PCLSC or >>>>>>>>>>>>>>>>>> (probably better for you), assemble a preconditioning >>>>>>>>>>>>>>>>>> matrix containing >>>>>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. >>>>>>>>>>>>>>>>>> This diagonal >>>>>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, >>>>>>>>>>>>>>>>>> depending on >>>>>>>>>>>>>>>>>> discretization) approximation of the Schur complement. >>>>>>>>>>>>>>>>>> The velocity >>>>>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know if >>>>>>>>>>>>>>>>>> you get stuck. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal >>>>>>>>>>>>>>>>> matrix to use as the preconditioner for the Schur complement solve step as >>>>>>>>>>>>>>>>> you suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>>>>>>>>> A naive approach that I can think of would be to create a >>>>>>>>>>>>>>>>> vector with its components as reciprocal viscosities of the cell centers >>>>>>>>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the >>>>>>>>>>>>>>>>> Petsc distribution of the different rows of the main system matrix over >>>>>>>>>>>>>>>>> different processors ? The main matrix was created using the DMDA structure >>>>>>>>>>>>>>>>> with 4 dof as explained before. >>>>>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but for >>>>>>>>>>>>>>>>> the S_p matrix would correspond to only pressure space. 
Should the >>>>>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks >>>>>>>>>>>>>>>> blocks. The P distribution is the same as the entire problem divided by 4. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size >>>>>>>>>>>>>>> but with dof=1 instead of 4, the vertices for this new DMDA will be >>>>>>>>>>>>>>> identically distributed as for the original DMDA ? Or should I inform PETSc >>>>>>>>>>>>>>> by calling a particular function to make these two DMDA have identical >>>>>>>>>>>>>>> distribution of the vertices ? >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Yes. >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>>>>> presence of "fictitious pressure vertices". The system matrix (A) contains >>>>>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. So the preconditioner >>>>>>>>>>>>>>> S_p for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> How can I set this split in the code itself without giving it >>>>>>>>>>>>> as a command line option when the system matrix is assembled from the DMDA >>>>>>>>>>>>> for the whole system with 4 dofs. (i.e. *without* using the >>>>>>>>>>>>> DMComposite or *without* using the nested block matrices to >>>>>>>>>>>>> assemble different blocks separately and then combine them together). >>>>>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in my >>>>>>>>>>>>> code, because not using detect_saddle_point means I cannot use >>>>>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> This is currently a real problem with the DMDA. 
In the >>>>>>>>>>>> unstructured case, where we always need specialized spaces, you can >>>>>>>>>>>> use something like >>>>>>>>>>>> >>>>>>>>>>>> PetscObject pressure; >>>>>>>>>>>> MatNullSpace nullSpacePres; >>>>>>>>>>>> >>>>>>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", >>>>>>>>>>>> (PetscObject) nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> and then DMGetSubDM() uses this information to attach the null >>>>>>>>>>>> space to the IS that is created using the information in the PetscSection. >>>>>>>>>>>> If you use a PetscSection to set the data layout over the DMDA, >>>>>>>>>>>> I think this works correctly, but this has not been tested at all and is >>>>>>>>>>>> very >>>>>>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>>>>>> mechanism, but we are still working it out. >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>>>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>>>>>>> model with a quickfix even if it means a little dirty code!) >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Bottom line: For custom null spaces using the default layout in >>>>>>>>>>>> DMDA, you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and then >>>>>>>>>>>> reach in and get the PC, and the subKSPs. I don't like this at all, but we >>>>>>>>>>>> have not reorganized that code (which could be very simple and >>>>>>>>>>>> inflexible since its very structured). >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> So I tried to get this approach working but I could not succeed >>>>>>>>>>> and encountered some errors. Here is a code snippet: >>>>>>>>>>> >>>>>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 dofs >>>>>>>>>>> (3 velocity components and 1 pressure comp.) >>>>>>>>>>> ierr = >>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've the >>>>>>>>>>> mNullSpaceSystem based on mDa, that contains a null space basis for the >>>>>>>>>>> complete system. >>>>>>>>>>> ierr = >>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>>>>> >>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that >>>>>>>>>>> was obtained from the options (fieldsplit) >>>>>>>>>>> >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). 
>>>>>>>>>>> >>>>>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>>>>> using the mDaPressure. >>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Sorry, you need to return to the old DMDA behavior, so you want >>>>>>>>>> >>>>>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>>>>> >>>>>>>>> >>>>>>>>> Thanks, with this it seems I can attach the null space properly, >>>>>>>>> but I have a question regarding whether the Schur complement ksp solver is >>>>>>>>> actually using the preconditioner matrix I provide. >>>>>>>>> When using -ksp_view, the outer level pc object of type fieldsplit >>>>>>>>> does report that: "Preconditioner for the Schur complement formed from user >>>>>>>>> provided matrix", but in the KSP solver for Schur complement S, the pc >>>>>>>>> object (fieldsplit_1_) is of type ilu and doesn't say that it is using the >>>>>>>>> matrix I provide. Am I missing something here ? >>>>>>>>> Below are the relevant commented code snippet and the output of >>>>>>>>> the -ksp_view >>>>>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>>>>> >>>>>>>> >>>>>>>> If ILU does not error, it means it is using your matrix, because >>>>>>>> the Schur complement matrix cannot be factored, and FS says it is using >>>>>>>> your matrix. >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> Code snippet: >>>>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>>>> //The nullspace for the whole system >>>>>>>>> ierr = >>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set >>>>>>>>> up mKsp with the options provided with fieldsplit and the fields associated >>>>>>>>> with the two splits. >>>>>>>>> >>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>>>>> //Get the fieldsplit pc set up from the options >>>>>>>>> >>>>>>>>> ierr = >>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>>>>> >>>>>>>>> KSP *kspSchur; >>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>> ierr = >>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>> ierr = >>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>> //Attach the null-space for the Schur complement ksp solver. 
>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>> >>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> the output of the -ksp_view >>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>>>> Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> has attached null space >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: 1 MPI processes >>>>>>>>> type: fieldsplit >>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>> factorization FULL >>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>> provided matrix >>>>>>>>> Split info: >>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>> Split number 1 Fields 3 >>>>>>>>> KSP solver for A00 block >>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2187, cols=2187 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2187, cols=2187 >>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> has attached null space >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: 
>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=729, cols=729 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: schurcomplement >>>>>>>>> rows=729, cols=729 >>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>> A11 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=729, cols=729 >>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> A10 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=729, cols=2187 >>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> KSP of A00 >>>>>>>>> KSP Object: >>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: >>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero >>>>>>>>> pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI >>>>>>>>> processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2187, cols=2187 >>>>>>>>> package used to perform factorization: >>>>>>>>> petsc >>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>> nonzeros=140625 >>>>>>>>> total number of mallocs used during >>>>>>>>> MatSetValues calls =0 >>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>> limit used is 5 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2187, cols=2187 >>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> A01 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2187, cols=729 >>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=729, cols=729 >>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2916, cols=2916, bs=4 >>>>>>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>>>>>> total 
number of mallocs used during MatSetValues calls =0 >>>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> >>>>>>>>>> or >>>>>>>>>> >>>>>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> The errors I get when running with options: -pc_type fieldsplit >>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>> ------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>>> type! >>>>>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>> shooting. >>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named >>>>>>>>>>> edwards by bkhanal Tue Aug 6 17:35:30 2013 >>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>>>>>>>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>>>>>>>> -with-clanguage=cxx --download-hypre=1 >>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>>>>>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>>>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>>>>>>>> WARNING! There are options you set that were not used! >>>>>>>>>>> WARNING! could be spelling mistake, etc! 
>>>>>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3
>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Matt
>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Matt
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Matt
>>>>>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>
>>>>>
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From bisheshkh at gmail.com Fri Aug 23 08:30:59 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Fri, 23 Aug 2013 15:30:59 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Fri, Aug 23, 2013 at 3:16 PM, Matthew Knepley wrote: > On Fri, Aug 23, 2013 at 8:01 AM, Bishesh Khanal wrote: > >> >> >> >> On Fri, Aug 23, 2013 at 2:53 PM, Matthew Knepley wrote: >> >>> On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote: >>> >>>> >>>> >>>> >>>> On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote: >>>> >>>>> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal wrote: >>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal >>>>>> > wrote: >>>>>>> >>>>>>>> >>>>>>>> Thanks Matt and Mark for comments in using near null space >>>>>>>> [question I asked in the thread with subject: *problem >>>>>>>> (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with >>>>>>>> multiple nodes in a cluster*]. >>>>>>>> So I understood that I have to set a nearNullSpace to A00 block >>>>>>>> where the null space correspond to the rigid body motion. I tried it but >>>>>>>> still the gamg just keeps on iterating and convergence is very very slow. I >>>>>>>> am not sure what the problem is, right now gamg does not even work for the >>>>>>>> constant viscosity case. >>>>>>>> I have set up the following in my code: >>>>>>>> 1. null space for the whole system A 2. null space for the Schur >>>>>>>> complement S 3. Near null space for A00 4. a user preconditioner matrix of >>>>>>>> inverse viscosity in the diagonal for S. >>>>>>>> >>>>>>> >>>>>>> If you want to debug solvers, you HAVE to send -ksp_view. >>>>>>> >>>>>> >>>>>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on so >>>>>> didn't get to the end to get -ksp_view results. >>>>>> Instead here I have put the -ksp_view output when running the program >>>>>> with following options: (In this case I get the results) >>>>>> -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>> >>>>> >>>>> Okay, that looks fine. Does >>>>> >>>>> -fieldsplit_0_pc_type lu >>>>> - fieldsplit_1_ksp_rtol 1.0e-10 >>>>> >>>>> converge in one Iterate? >>>>> >>>>> What matrix did you attach as the preconditioner matrix for >>>>> fieldsplit_1_? >>>>> >>>> >>>> >>>> I used a diagonal matrix with reciprocal of viscosity values of the >>>> corresponding cell centers as the preconditioner. >>>> >>>> with the options -fieldsplit_0_pc_type lu - fieldsplit_1_ksp_rtol >>>> 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason >>>> I get the following output which means the outer ksp did converge in >>>> one iterate I guess. >>>> Linear solve converged due to CONVERGED_RTOL iterations 18 >>>> Linear solve converged due to CONVERGED_RTOL iterations 18 >>>> Linear solve converged due to CONVERGED_RTOL iterations 1 >>>> >>> >>> Okay, so A_00 is nonsingular, and the system seems to solve alright. >>> What do you get for >>> >>> -fieldsplit_0_ksp_max_it 30 >>> -fieldsplit_0_pc_type gamg >>> -fieldsplit_0_ksp_converged_reason >>> -fieldsplit_1_ksp_converged_reason >>> >>> >> >> It fieldsplit_0_ does not converge in 30 iterations. 
It gives: >> Linear solve converged due to CONVERGED_ATOL iterations 0 >> Linear solve did not converge due to DIVERGED_ITS iterations 30 >> >> and continues with the same message. >> > > So what would you do? Give up? > > No, I don't want to give up :) > -fieldsplit_0_ksp_gmres_restart 200 > > The idea is to figure out what is going on: > > -fieldsplit_0_ksp_monitor_true_residual > > I have tried these options before too, the residual is decreasing very very slowly, but I've not been able to figure out why. (using hypre does converge although slowly again, but I had problems using hypre with multiple nodes in a cluster with segmentation fault (we discussed that in another thread!) ) e.g a snapshot of the output: Residual norms for fieldsplit_0_ solve. 0 KSP preconditioned resid norm 0.000000000000e+00 true resid norm 0.000000000000e+00 ||r(i)||/||b|| -nan Linear solve converged due to CONVERGED_ATOL iterations 0 Residual norms for fieldsplit_0_ solve. 0 KSP preconditioned resid norm 2.619231455875e-01 true resid norm 3.637306695895e+02 ||r(i)||/||b|| 1.000000000000e+00 1 KSP preconditioned resid norm 9.351491725479e-02 true resid norm 6.013334574957e+01 ||r(i)||/||b|| 1.653238255038e-01 2 KSP preconditioned resid norm 6.010357491087e-02 true resid norm 3.664473273769e+01 ||r(i)||/||b|| 1.007468871928e-01 3 KSP preconditioned resid norm 6.006968012944e-02 true resid norm 3.696451770148e+01 ||r(i)||/||b|| 1.016260678353e-01 4 KSP preconditioned resid norm 4.418407037098e-02 true resid norm 3.184810838034e+01 ||r(i)||/||b|| 8.755959022176e-02 ... ... 93 KSP preconditioned resid norm 4.549506047737e-04 true resid norm 2.877594552685e+00 ||r(i)||/||b|| 7.911333283864e-03 94 KSP preconditioned resid norm 4.515424416235e-04 true resid norm 2.875249044668e+00 ||r(i)||/||b|| 7.904884809172e-03 95 KSP preconditioned resid norm 4.277647876573e-04 true resid norm 2.830418831358e+00 ||r(i)||/||b|| 7.781633686685e-03 96 KSP preconditioned resid norm 4.244529173876e-04 true resid norm 2.807041401408e+00 ||r(i)||/||b|| 7.717362422521e-03 97 KSP preconditioned resid norm 4.138326570674e-04 true resid norm 2.793663020386e+00 ||r(i)||/||b|| 7.680581413547e-03 98 KSP preconditioned resid norm 3.869979433609e-04 true resid norm 2.715150386650e+00 ||r(i)||/||b|| 7.464727650583e-03 99 KSP preconditioned resid norm 3.847873979265e-04 true resid norm 2.706008990336e+00 ||r(i)||/||b|| 7.439595328571e-03 .... .... 294 KSP preconditioned resid norm 1.416482289961e-04 true resid norm 2.735750748819e+00 ||r(i)||/||b|| 7.521363958412e-03 295 KSP preconditioned resid norm 1.415389087364e-04 true resid norm 2.742638608355e+00 ||r(i)||/||b|| 7.540300661064e-03 296 KSP preconditioned resid norm 1.414967651105e-04 true resid norm 2.747224243968e+00 ||r(i)||/||b|| 7.552907889424e-03 297 KSP preconditioned resid norm 1.413843018303e-04 true resid norm 2.752574248710e+00 ||r(i)||/||b|| 7.567616587891e-03 298 KSP preconditioned resid norm 1.411747949695e-04 true resid norm 2.765459647367e+00 ||r(i)||/||b|| 7.603042246859e-03 299 KSP preconditioned resid norm 1.411609742082e-04 true resid norm 2.765900464868e+00 ||r(i)||/||b|| 7.604254180683e-03 300 KSP preconditioned resid norm 1.409844332838e-04 true resid norm 2.771790506811e+00 ||r(i)||/||b|| 7.620447596402e-03 Linear solve did not converge due to DIVERGED_ITS iterations 300 Residual norms for fieldsplit_0_ solve. 0 KSP preconditioned resid norm 1.294272083271e-03 true resid norm 1.776945075651e+00 ||r(i)||/||b|| 1.000000000000e+00 ... ... 
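For reference, the same options can also be set from the code itself, so a run is
reproducible without a long command line. This is only a sketch (mKsp is the outer
KSP from the earlier snippets, and the values are just the ones suggested in this
thread):

/* Programmatic equivalent of the command-line flags above; these must be
   registered before KSPSetFromOptions() so that the fieldsplit_0_
   sub-solver picks them up when the PC is set up. */
ierr = PetscOptionsSetValue("-fieldsplit_0_ksp_gmres_restart", "200");CHKERRQ(ierr);
ierr = PetscOptionsSetValue("-fieldsplit_0_ksp_monitor_true_residual", NULL);CHKERRQ(ierr);
ierr = PetscOptionsSetValue("-fieldsplit_0_ksp_converged_reason", NULL);CHKERRQ(ierr);
ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr);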
> Matt > > >> >> >> >>> This is the kind of investigation you msut be comfortable with if you >>> want to experiment with these solvers. >>> >>> Matt >>> >>> >>>> >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>>>>> KSP Object: 1 MPI processes >>>>>> type: gmres >>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>> Orthogonalization with no iterative refinement >>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>> maximum iterations=10000, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> has attached null space >>>>>> using PRECONDITIONED norm type for convergence test >>>>>> PC Object: 1 MPI processes >>>>>> type: fieldsplit >>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>> factorization FULL >>>>>> Preconditioner for the Schur complement formed from user provided >>>>>> matrix >>>>>> Split info: >>>>>> Split number 0 Fields 0, 1, 2 >>>>>> Split number 1 Fields 3 >>>>>> KSP solver for A00 block >>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>> type: gmres >>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>> maximum iterations=10000, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> using PRECONDITIONED norm type for convergence test >>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>> type: ilu >>>>>> ILU: out-of-place factorization >>>>>> 0 levels of fill >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: natural >>>>>> factor fill ratio given 1, needed 1 >>>>>> Factored matrix follows: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=8232, cols=8232 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> using I-node routines: found 2744 nodes, limit used >>>>>> is 5 >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=8232, cols=8232 >>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>> type: gmres >>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>> maximum iterations=10000, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>> left preconditioning >>>>>> has attached null space >>>>>> using PRECONDITIONED norm type for convergence test >>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>> type: ilu >>>>>> ILU: out-of-place factorization >>>>>> 0 levels of fill >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: natural >>>>>> factor fill ratio given 1, needed 1 >>>>>> Factored matrix follows: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2744, cols=2744 >>>>>> package used to perform factorization: petsc >>>>>> 
total: nonzeros=64000, allocated nonzeros=64000 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> not using I-node routines >>>>>> linear system matrix followed by preconditioner matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: schurcomplement >>>>>> rows=2744, cols=2744 >>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>> A11 >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2744, cols=2744 >>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> not using I-node routines >>>>>> A10 >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2744, cols=8232 >>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> not using I-node routines >>>>>> KSP of A00 >>>>>> KSP Object: (fieldsplit_0_) >>>>>> 1 MPI processes >>>>>> type: gmres >>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>> maximum iterations=10000, initial guess is zero >>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>> divergence=10000 >>>>>> left preconditioning >>>>>> using PRECONDITIONED norm type for convergence test >>>>>> PC Object: (fieldsplit_0_) 1 >>>>>> MPI processes >>>>>> type: ilu >>>>>> ILU: out-of-place factorization >>>>>> 0 levels of fill >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>> matrix ordering: natural >>>>>> factor fill ratio given 1, needed 1 >>>>>> Factored matrix follows: >>>>>> Matrix Object: 1 MPI >>>>>> processes >>>>>> type: seqaij >>>>>> rows=8232, cols=8232 >>>>>> package used to perform factorization: petsc >>>>>> total: nonzeros=576000, allocated >>>>>> nonzeros=576000 >>>>>> total number of mallocs used during >>>>>> MatSetValues calls =0 >>>>>> using I-node routines: found 2744 nodes, >>>>>> limit used is 5 >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=8232, cols=8232 >>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> using I-node routines: found 2744 nodes, limit >>>>>> used is 5 >>>>>> A01 >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=8232, cols=2744 >>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>> total number of mallocs used during MatSetValues >>>>>> calls =0 >>>>>> using I-node routines: found 2744 nodes, limit used >>>>>> is 5 >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=2744, cols=2744 >>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> not using I-node routines >>>>>> linear system matrix = precond matrix: >>>>>> Matrix Object: 1 MPI processes >>>>>> type: seqaij >>>>>> rows=10976, cols=10976, bs=4 >>>>>> total: nonzeros=1024000, allocated nonzeros=1024000 >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>> >>>>>> >>>>>> >>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> I am testing a small problem with CONSTANT viscosity for grid size >>>>>>>> of 14^3 with the run time option: >>>>>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 
0,1,2 >>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>>>>> >>>>>>>> Here is my relevant code of the solve function: >>>>>>>> PetscErrorCode ierr; >>>>>>>> PetscFunctionBeginUser; >>>>>>>> ierr = >>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>> ierr = >>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa >>>>>>>> with dof = 4, vx,vy,vz and p. >>>>>>>> ierr = >>>>>>>> KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for the main >>>>>>>> system >>>>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>> //register the fieldsplits obtained from options. >>>>>>>> >>>>>>>> //Setting up user PC for Schur Complement >>>>>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>>>>> ierr = >>>>>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>> >>>>>>>> KSP *subKsp; >>>>>>>> PetscInt subKspPos = 0; >>>>>>>> //Set up nearNullspace for A00 block. >>>>>>>> ierr = >>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>> MatNullSpace rigidBodyModes; >>>>>>>> Vec coords; >>>>>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>>>>> ierr = >>>>>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>>>>> Mat matA00; >>>>>>>> ierr = >>>>>>>> KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>>>>> ierr = MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>>>>> >>>>>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>>>>> pressure space >>>>>>>> subKspPos = 1; >>>>>>>> ierr = >>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>> //Set up the null space of constant pressure. 
>>>>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>>>>> PetscBool isNull; >>>>>>>> Mat matSc; >>>>>>>> ierr = >>>>>>>> KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>>>>> if(!isNull) >>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>> pressure null space \n"); >>>>>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>>>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>>>>> if(!isNull) >>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid system >>>>>>>> null space \n"); >>>>>>>> >>>>>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>>>>> >>>>>>>> >>>>>>>> PetscFunctionReturn(0); >>>>>>>> >>>>>>>> >>>>>>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley wrote: >>>>>>>> >>>>>>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal < >>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal < >>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> > Now, I implemented two different approaches, each for >>>>>>>>>>>>>>>>>>> both 2D and 3D, in >>>>>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have >>>>>>>>>>>>>>>>>>> problems solving it for >>>>>>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>>>>>> > faces. Similar split up of K to cell center and faces >>>>>>>>>>>>>>>>>>> to account for the >>>>>>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem and >>>>>>>>>>>>>>>>>>> I recommend >>>>>>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement >>>>>>>>>>>>>>>>>>> reduction (make that >>>>>>>>>>>>>>>>>>> work first, then switch to block preconditioner). 
You >>>>>>>>>>>>>>>>>>> can use PCLSC or >>>>>>>>>>>>>>>>>>> (probably better for you), assemble a preconditioning >>>>>>>>>>>>>>>>>>> matrix containing >>>>>>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. >>>>>>>>>>>>>>>>>>> This diagonal >>>>>>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, >>>>>>>>>>>>>>>>>>> depending on >>>>>>>>>>>>>>>>>>> discretization) approximation of the Schur complement. >>>>>>>>>>>>>>>>>>> The velocity >>>>>>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know if >>>>>>>>>>>>>>>>>>> you get stuck. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal >>>>>>>>>>>>>>>>>> matrix to use as the preconditioner for the Schur complement solve step as >>>>>>>>>>>>>>>>>> you suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>>>>>>>>>> A naive approach that I can think of would be to create a >>>>>>>>>>>>>>>>>> vector with its components as reciprocal viscosities of the cell centers >>>>>>>>>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the >>>>>>>>>>>>>>>>>> Petsc distribution of the different rows of the main system matrix over >>>>>>>>>>>>>>>>>> different processors ? The main matrix was created using the DMDA structure >>>>>>>>>>>>>>>>>> with 4 dof as explained before. >>>>>>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but >>>>>>>>>>>>>>>>>> for the S_p matrix would correspond to only pressure space. Should the >>>>>>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks >>>>>>>>>>>>>>>>> blocks. The P distribution is the same as the entire problem divided by 4. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size >>>>>>>>>>>>>>>> but with dof=1 instead of 4, the vertices for this new DMDA will be >>>>>>>>>>>>>>>> identically distributed as for the original DMDA ? Or should I inform PETSc >>>>>>>>>>>>>>>> by calling a particular function to make these two DMDA have identical >>>>>>>>>>>>>>>> distribution of the vertices ? >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Yes. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>>>>>> presence of "fictitious pressure vertices". The system matrix (A) contains >>>>>>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. So the preconditioner >>>>>>>>>>>>>>>> S_p for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? 
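As a side note, here is one way the diagonal S_p suggested above could be assembled,
given a separate dof=1 DMDA for the pressure space. This is a sketch only: mDaPressure
is that DMDA (as it is named later in this thread), while viscosityAt(i,j,k) and
isFictitious(i,j,k) are hypothetical user helpers, not PETSc functions:

/* S_p = diag(1/eta) on the pressure grid; entries at fictitious
   P-nodes are set to 1.0 to match the identity rows of the system. */
Mat            Sp;
Vec            invEta;
PetscScalar ***a;
PetscInt       i, j, k, xs, ys, zs, xm, ym, zm;

ierr = DMCreateGlobalVector(mDaPressure, &invEta);CHKERRQ(ierr);
ierr = DMDAGetCorners(mDaPressure, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
ierr = DMDAVecGetArray(mDaPressure, invEta, &a);CHKERRQ(ierr);
for (k = zs; k < zs + zm; ++k)
  for (j = ys; j < ys + ym; ++j)
    for (i = xs; i < xs + xm; ++i)
      a[k][j][i] = isFictitious(i, j, k) ? 1.0 : 1.0/viscosityAt(i, j, k);
ierr = DMDAVecRestoreArray(mDaPressure, invEta, &a);CHKERRQ(ierr);

ierr = DMCreateMatrix(mDaPressure, MATAIJ, &Sp);CHKERRQ(ierr); /* petsc-3.4 signature */
ierr = MatDiagonalSet(Sp, invEta, INSERT_VALUES);CHKERRQ(ierr);
ierr = VecDestroy(&invEta);CHKERRQ(ierr);

Because the dof=1 DMDA is partitioned identically to the dof=4 one, the rows of Sp
line up with the pressure rows of the split, and Sp can then be passed as the user
matrix to PCFieldSplitSchurPrecondition().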
>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> How can I set this split in the code itself without giving it >>>>>>>>>>>>>> as a command line option when the system matrix is assembled from the DMDA >>>>>>>>>>>>>> for the whole system with 4 dofs. (i.e. *without* using the >>>>>>>>>>>>>> DMComposite or *without* using the nested block matrices to >>>>>>>>>>>>>> assemble different blocks separately and then combine them together). >>>>>>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in my >>>>>>>>>>>>>> code, because not using detect_saddle_point means I cannot use >>>>>>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> This is currently a real problem with the DMDA. In the >>>>>>>>>>>>> unstructured case, where we always need specialized spaces, you can >>>>>>>>>>>>> use something like >>>>>>>>>>>>> >>>>>>>>>>>>> PetscObject pressure; >>>>>>>>>>>>> MatNullSpace nullSpacePres; >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", >>>>>>>>>>>>> (PetscObject) nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> and then DMGetSubDM() uses this information to attach the null >>>>>>>>>>>>> space to the IS that is created using the information in the PetscSection. >>>>>>>>>>>>> If you use a PetscSection to set the data layout over the >>>>>>>>>>>>> DMDA, I think this works correctly, but this has not been tested at all and >>>>>>>>>>>>> is very >>>>>>>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>>>>>>> mechanism, but we are still working it out. >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>>>>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>>>>>>>> model with a quickfix even if it means a little dirty code!) >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Bottom line: For custom null spaces using the default layout >>>>>>>>>>>>> in DMDA, you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>>>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and then >>>>>>>>>>>>> reach in and get the PC, and the subKSPs. I don't like this at all, but we >>>>>>>>>>>>> have not reorganized that code (which could be very simple and >>>>>>>>>>>>> inflexible since its very structured). >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> So I tried to get this approach working but I could not succeed >>>>>>>>>>>> and encountered some errors. Here is a code snippet: >>>>>>>>>>>> >>>>>>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 dofs >>>>>>>>>>>> (3 velocity components and 1 pressure comp.) 
>>>>>>>>>>>> ierr = >>>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've the >>>>>>>>>>>> mNullSpaceSystem based on mDa, that contains a null space basis for the >>>>>>>>>>>> complete system. >>>>>>>>>>>> ierr = >>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>>>>>> >>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that >>>>>>>>>>>> was obtained from the options (fieldsplit) >>>>>>>>>>>> >>>>>>>>>>>> ierr = >>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>>>>>>> >>>>>>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>> ierr = >>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>>>>>> using the mDaPressure. >>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Sorry, you need to return to the old DMDA behavior, so you want >>>>>>>>>>> >>>>>>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Thanks, with this it seems I can attach the null space properly, >>>>>>>>>> but I have a question regarding whether the Schur complement ksp solver is >>>>>>>>>> actually using the preconditioner matrix I provide. >>>>>>>>>> When using -ksp_view, the outer level pc object of type >>>>>>>>>> fieldsplit does report that: "Preconditioner for the Schur complement >>>>>>>>>> formed from user provided matrix", but in the KSP solver for Schur >>>>>>>>>> complement S, the pc object (fieldsplit_1_) is of type ilu and doesn't say >>>>>>>>>> that it is using the matrix I provide. Am I missing something here ? >>>>>>>>>> Below are the relevant commented code snippet and the output of >>>>>>>>>> the -ksp_view >>>>>>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type >>>>>>>>>> schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>>>>>> >>>>>>>>> >>>>>>>>> If ILU does not error, it means it is using your matrix, because >>>>>>>>> the Schur complement matrix cannot be factored, and FS says it is using >>>>>>>>> your matrix. 
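This can also be verified in code after PCSetUp(). A sketch, assuming kspSchur was
obtained from PCFieldSplitGetSubKSP() as in the snippet above: the operator of the
Schur KSP should be the schurcomplement matrix, while its preconditioning matrix
should be the user-supplied mPcForSc.

Mat       Amat, Pmat;
PetscBool isSchur;
/* petsc-3.4 signature: KSPGetOperators(KSP,Mat*,Mat*,MatStructure*) */
ierr = KSPGetOperators(kspSchur[1], &Amat, &Pmat, NULL);CHKERRQ(ierr);
ierr = PetscObjectTypeCompare((PetscObject)Amat, MATSCHURCOMPLEMENT, &isSchur);CHKERRQ(ierr);
if (!isSchur || Pmat != mPcForSc)
  SETERRQ(PETSC_COMM_WORLD, PETSC_ERR_PLIB, "fieldsplit_1_ is not using the user matrix\n");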
>>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> Code snippet: >>>>>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>>>>> //The nullspace for the whole system >>>>>>>>>> ierr = >>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set >>>>>>>>>> up mKsp with the options provided with fieldsplit and the fields associated >>>>>>>>>> with the two splits. >>>>>>>>>> >>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>>>>>> //Get the fieldsplit pc set up from the options >>>>>>>>>> >>>>>>>>>> ierr = >>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>>>>>> >>>>>>>>>> KSP *kspSchur; >>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>> ierr = >>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>> ierr = >>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>> //Attach the null-space for the Schur complement ksp solver. >>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> the output of the -ksp_view >>>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>>> type: gmres >>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>>>>> Orthogonalization with no iterative refinement >>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>> left preconditioning >>>>>>>>>> has attached null space >>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>> PC Object: 1 MPI processes >>>>>>>>>> type: fieldsplit >>>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>>> factorization FULL >>>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>>> provided matrix >>>>>>>>>> Split info: >>>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>>> Split number 1 Fields 3 >>>>>>>>>> KSP solver for A00 block >>>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>> type: gmres >>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>> divergence=10000 >>>>>>>>>> left preconditioning >>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>> type: ilu >>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>> 0 levels of fill >>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>> matrix ordering: natural >>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>> Factored matrix follows: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>>> used is 5 >>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij 
>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>>> total number of mallocs used during MatSetValues calls >>>>>>>>>> =0 >>>>>>>>>> using I-node routines: found 729 nodes, limit used is >>>>>>>>>> 5 >>>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>> type: gmres >>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>> divergence=10000 >>>>>>>>>> left preconditioning >>>>>>>>>> has attached null space >>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>> type: ilu >>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>> 0 levels of fill >>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>> matrix ordering: natural >>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>> Factored matrix follows: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=729, cols=729 >>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> not using I-node routines >>>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: schurcomplement >>>>>>>>>> rows=729, cols=729 >>>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>>> A11 >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=729, cols=729 >>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> not using I-node routines >>>>>>>>>> A10 >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=729, cols=2187 >>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> not using I-node routines >>>>>>>>>> KSP of A00 >>>>>>>>>> KSP Object: >>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>> type: gmres >>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>> divergence=10000 >>>>>>>>>> left preconditioning >>>>>>>>>> using PRECONDITIONED norm type for convergence >>>>>>>>>> test >>>>>>>>>> PC Object: >>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>> type: ilu >>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>> 0 levels of fill >>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>> using diagonal shift on blocks to prevent zero >>>>>>>>>> pivot >>>>>>>>>> matrix ordering: natural >>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>> Factored matrix follows: >>>>>>>>>> Matrix Object: 1 MPI >>>>>>>>>> processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>> package used to perform factorization: >>>>>>>>>> petsc >>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>> nonzeros=140625 >>>>>>>>>> total number of 
mallocs used during >>>>>>>>>> MatSetValues calls =0 >>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>> limit used is 5 >>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>> nonzeros=140625 >>>>>>>>>> total number of mallocs used during >>>>>>>>>> MatSetValues calls =0 >>>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>>> used is 5 >>>>>>>>>> A01 >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2187, cols=729 >>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>>> used is 5 >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=729, cols=729 >>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>> total number of mallocs used during MatSetValues calls >>>>>>>>>> =0 >>>>>>>>>> not using I-node routines >>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2916, cols=2916, bs=4 >>>>>>>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> or >>>>>>>>>>> >>>>>>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> The errors I get when running with options: -pc_type fieldsplit >>>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>>>> type! >>>>>>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>> shooting. >>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named >>>>>>>>>>>> edwards by bkhanal Tue Aug 6 17:35:30 2013 >>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>>>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>>>>>>>>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>>>>>>>>> -with-clanguage=cxx --download-hypre=1 >>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>>>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>>>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>>>>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>>>>>>>>> WARNING! There are options you set that were not used! >>>>>>>>>>>> WARNING! could be spelling mistake, etc! >>>>>>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. 
>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 23 08:34:18 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 23 Aug 2013 08:34:18 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Fri, Aug 23, 2013 at 8:30 AM, Bishesh Khanal wrote: > > > > On Fri, Aug 23, 2013 at 3:16 PM, Matthew Knepley wrote: > >> On Fri, Aug 23, 2013 at 8:01 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Fri, Aug 23, 2013 at 2:53 PM, Matthew Knepley wrote: >>> >>>> On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote: >>>>> >>>>>> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal < >>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> Thanks Matt and Mark for comments in using near null space >>>>>>>>> [question I asked in the thread with subject: *problem >>>>>>>>> (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with >>>>>>>>> multiple nodes in a cluster*]. >>>>>>>>> So I understood that I have to set a nearNullSpace to A00 block >>>>>>>>> where the null space correspond to the rigid body motion. I tried it but >>>>>>>>> still the gamg just keeps on iterating and convergence is very very slow. I >>>>>>>>> am not sure what the problem is, right now gamg does not even work for the >>>>>>>>> constant viscosity case. >>>>>>>>> I have set up the following in my code: >>>>>>>>> 1. null space for the whole system A 2. null space for the Schur >>>>>>>>> complement S 3. Near null space for A00 4. a user preconditioner matrix of >>>>>>>>> inverse viscosity in the diagonal for S. >>>>>>>>> >>>>>>>> >>>>>>>> If you want to debug solvers, you HAVE to send -ksp_view. >>>>>>>> >>>>>>> >>>>>>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on so >>>>>>> didn't get to the end to get -ksp_view results. 
>>>>>>> Instead here I have put the -ksp_view output when running the
>>>>>>> program with following options: (In this case I get the results)
>>>>>>> -pc_type fieldsplit -pc_fieldsplit_type schur
>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2
>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view
>>>>>>>
>>>>>>
>>>>>> Okay, that looks fine. Does
>>>>>>
>>>>>> -fieldsplit_0_pc_type lu
>>>>>> - fieldsplit_1_ksp_rtol 1.0e-10
>>>>>>
>>>>>> converge in one Iterate?
>>>>>>
>>>>>> What matrix did you attach as the preconditioner matrix for
>>>>>> fieldsplit_1_?
>>>>>>
>>>>>
>>>>>
>>>>> I used a diagonal matrix with reciprocal of viscosity values of the
>>>>> corresponding cell centers as the preconditioner.
>>>>>
>>>>> with the options -fieldsplit_0_pc_type lu - fieldsplit_1_ksp_rtol
>>>>> 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason
>>>>> I get the following output which means the outer ksp did converge in
>>>>> one iterate I guess.
>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18
>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18
>>>>> Linear solve converged due to CONVERGED_RTOL iterations 1
>>>>>
>>>>
>>>> Okay, so A_00 is nonsingular, and the system seems to solve alright.
>>>> What do you get for
>>>>
>>>> -fieldsplit_0_ksp_max_it 30
>>>> -fieldsplit_0_pc_type gamg
>>>> -fieldsplit_0_ksp_converged_reason
>>>> -fieldsplit_1_ksp_converged_reason
>>>>
>>>>
>>>
>>> It fieldsplit_0_ does not converge in 30 iterations. It gives:
>>> Linear solve converged due to CONVERGED_ATOL iterations 0
>>> Linear solve did not converge due to DIVERGED_ITS iterations 30
>>>
>>> and continues with the same message.
>>>
>>
>> So what would you do? Give up?
>>
>> No, I don't want to give up :)
>
>
>> -fieldsplit_0_ksp_gmres_restart 200
>>
>> The idea is to figure out what is going on:
>>
>> -fieldsplit_0_ksp_monitor_true_residual
>>
>> I have tried these options before too, the residual is decreasing very
> very slowly, but I've not been able to figure out why. (using hypre does
> converge although slowly again, but I had problems using hypre with
> multiple nodes in a cluster with segmentation fault (we discussed that in
> another thread!) )
>

Put in the Laplacian instead of the operator you have now. It should
converge in a few iterates. If not, you have a problem
in the specification.

If so, put in linear elasticity. If it is slow, you have made a mistake
specifying the near null space. Also, you need to check
that the near null space made it to GAMG using the ksp_view output.

   Matt

> e.g a snapshot of the output:
>
>  Residual norms for fieldsplit_0_ solve.
>  0 KSP preconditioned resid norm 0.000000000000e+00 true resid norm
> 0.000000000000e+00 ||r(i)||/||b|| -nan
> Linear solve converged due to CONVERGED_ATOL iterations 0
>   Residual norms for fieldsplit_0_ solve.
>   0 KSP preconditioned resid norm 2.619231455875e-01 true resid norm
> 3.637306695895e+02 ||r(i)||/||b|| 1.000000000000e+00
>   1 KSP preconditioned resid norm 9.351491725479e-02 true resid norm
> 6.013334574957e+01 ||r(i)||/||b|| 1.653238255038e-01
>   2 KSP preconditioned resid norm 6.010357491087e-02 true resid norm
> 3.664473273769e+01 ||r(i)||/||b|| 1.007468871928e-01
>   3 KSP preconditioned resid norm 6.006968012944e-02 true resid norm
> 3.696451770148e+01 ||r(i)||/||b|| 1.016260678353e-01
>   4 KSP preconditioned resid norm 4.418407037098e-02 true resid norm
> 3.184810838034e+01 ||r(i)||/||b|| 8.755959022176e-02
> ...
> ...
>  93 KSP preconditioned resid norm 4.549506047737e-04 true resid norm
> 2.877594552685e+00 ||r(i)||/||b|| 7.911333283864e-03
>  94 KSP preconditioned resid norm 4.515424416235e-04 true resid norm
> 2.875249044668e+00 ||r(i)||/||b|| 7.904884809172e-03
>  95 KSP preconditioned resid norm 4.277647876573e-04 true resid norm
> 2.830418831358e+00 ||r(i)||/||b|| 7.781633686685e-03
>  96 KSP preconditioned resid norm 4.244529173876e-04 true resid norm
> 2.807041401408e+00 ||r(i)||/||b|| 7.717362422521e-03
>  97 KSP preconditioned resid norm 4.138326570674e-04 true resid norm
> 2.793663020386e+00 ||r(i)||/||b|| 7.680581413547e-03
>  98 KSP preconditioned resid norm 3.869979433609e-04 true resid norm
> 2.715150386650e+00 ||r(i)||/||b|| 7.464727650583e-03
>  99 KSP preconditioned resid norm 3.847873979265e-04 true resid norm
> 2.706008990336e+00 ||r(i)||/||b|| 7.439595328571e-03
>
> ....
> ....
> 294 KSP preconditioned resid norm 1.416482289961e-04 true resid norm
> 2.735750748819e+00 ||r(i)||/||b|| 7.521363958412e-03
> 295 KSP preconditioned resid norm 1.415389087364e-04 true resid norm
> 2.742638608355e+00 ||r(i)||/||b|| 7.540300661064e-03
> 296 KSP preconditioned resid norm 1.414967651105e-04 true resid norm
> 2.747224243968e+00 ||r(i)||/||b|| 7.552907889424e-03
> 297 KSP preconditioned resid norm 1.413843018303e-04 true resid norm
> 2.752574248710e+00 ||r(i)||/||b|| 7.567616587891e-03
> 298 KSP preconditioned resid norm 1.411747949695e-04 true resid norm
> 2.765459647367e+00 ||r(i)||/||b|| 7.603042246859e-03
> 299 KSP preconditioned resid norm 1.411609742082e-04 true resid norm
> 2.765900464868e+00 ||r(i)||/||b|| 7.604254180683e-03
> 300 KSP preconditioned resid norm 1.409844332838e-04 true resid norm
> 2.771790506811e+00 ||r(i)||/||b|| 7.620447596402e-03
> Linear solve did not converge due to DIVERGED_ITS iterations 300
>   Residual norms for fieldsplit_0_ solve.
>   0 KSP preconditioned resid norm 1.294272083271e-03 true resid norm
> 1.776945075651e+00 ||r(i)||/||b|| 1.000000000000e+00
> ...
> ...
>
>
>
>> Matt
>>
>>
>>>
>>>
>>>
>>>> This is the kind of investigation you must be comfortable with if you
>>>> want to experiment with these solvers.
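For reference, the diagnostic options suggested so far in this exchange, collected into
one command line (the executable path is the one from the error trace earlier in the
thread; this only restates options already discussed above, it is not a new recipe):

    ./src/AdLemMain -pc_type fieldsplit -pc_fieldsplit_type schur \
        -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3 \
        -fieldsplit_0_pc_type gamg -fieldsplit_0_ksp_gmres_restart 200 \
        -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason \
        -fieldsplit_1_ksp_converged_reason -ksp_converged_reason -ksp_view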
>>>> >>>> Matt >>>> >>>> >>>>> >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>>>>>> KSP Object: 1 MPI processes >>>>>>> type: gmres >>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>> Orthogonalization with no iterative refinement >>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>> maximum iterations=10000, initial guess is zero >>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>> left preconditioning >>>>>>> has attached null space >>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>> PC Object: 1 MPI processes >>>>>>> type: fieldsplit >>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>> factorization FULL >>>>>>> Preconditioner for the Schur complement formed from user >>>>>>> provided matrix >>>>>>> Split info: >>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>> Split number 1 Fields 3 >>>>>>> KSP solver for A00 block >>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>> type: gmres >>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>> maximum iterations=10000, initial guess is zero >>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>> left preconditioning >>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>> type: ilu >>>>>>> ILU: out-of-place factorization >>>>>>> 0 levels of fill >>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>> matrix ordering: natural >>>>>>> factor fill ratio given 1, needed 1 >>>>>>> Factored matrix follows: >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=8232, cols=8232 >>>>>>> package used to perform factorization: petsc >>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>> total number of mallocs used during MatSetValues >>>>>>> calls =0 >>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>> used is 5 >>>>>>> linear system matrix = precond matrix: >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=8232, cols=8232 >>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>> type: gmres >>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>> maximum iterations=10000, initial guess is zero >>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>> left preconditioning >>>>>>> has attached null space >>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>> type: ilu >>>>>>> ILU: out-of-place factorization >>>>>>> 0 levels of fill >>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>> matrix ordering: natural >>>>>>> factor fill ratio given 1, needed 1 >>>>>>> Factored matrix follows: >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=2744, cols=2744 >>>>>>> package used to perform factorization: petsc >>>>>>> total: nonzeros=64000, allocated nonzeros=64000 
>>>>>>> total number of mallocs used during MatSetValues >>>>>>> calls =0 >>>>>>> not using I-node routines >>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: schurcomplement >>>>>>> rows=2744, cols=2744 >>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>> A11 >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=2744, cols=2744 >>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>> total number of mallocs used during MatSetValues >>>>>>> calls =0 >>>>>>> not using I-node routines >>>>>>> A10 >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=2744, cols=8232 >>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>> total number of mallocs used during MatSetValues >>>>>>> calls =0 >>>>>>> not using I-node routines >>>>>>> KSP of A00 >>>>>>> KSP Object: (fieldsplit_0_) >>>>>>> 1 MPI processes >>>>>>> type: gmres >>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>> maximum iterations=10000, initial guess is zero >>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>> divergence=10000 >>>>>>> left preconditioning >>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>> PC Object: (fieldsplit_0_) >>>>>>> 1 MPI processes >>>>>>> type: ilu >>>>>>> ILU: out-of-place factorization >>>>>>> 0 levels of fill >>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>> using diagonal shift on blocks to prevent zero >>>>>>> pivot >>>>>>> matrix ordering: natural >>>>>>> factor fill ratio given 1, needed 1 >>>>>>> Factored matrix follows: >>>>>>> Matrix Object: 1 MPI >>>>>>> processes >>>>>>> type: seqaij >>>>>>> rows=8232, cols=8232 >>>>>>> package used to perform factorization: petsc >>>>>>> total: nonzeros=576000, allocated >>>>>>> nonzeros=576000 >>>>>>> total number of mallocs used during >>>>>>> MatSetValues calls =0 >>>>>>> using I-node routines: found 2744 nodes, >>>>>>> limit used is 5 >>>>>>> linear system matrix = precond matrix: >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=8232, cols=8232 >>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>> total number of mallocs used during MatSetValues >>>>>>> calls =0 >>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>> used is 5 >>>>>>> A01 >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=8232, cols=2744 >>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>> total number of mallocs used during MatSetValues >>>>>>> calls =0 >>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>> used is 5 >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=2744, cols=2744 >>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> not using I-node routines >>>>>>> linear system matrix = precond matrix: >>>>>>> Matrix Object: 1 MPI processes >>>>>>> type: seqaij >>>>>>> rows=10976, cols=10976, bs=4 >>>>>>> total: nonzeros=1024000, allocated nonzeros=1024000 >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> I am testing a small problem with CONSTANT viscosity for grid size >>>>>>>>> of 14^3 with the run time option: >>>>>>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur 
>>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>>>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>>>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>>>>>> >>>>>>>>> Here is my relevant code of the solve function: >>>>>>>>> PetscErrorCode ierr; >>>>>>>>> PetscFunctionBeginUser; >>>>>>>>> ierr = >>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>> ierr = >>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa >>>>>>>>> with dof = 4, vx,vy,vz and p. >>>>>>>>> ierr = >>>>>>>>> KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for the main >>>>>>>>> system >>>>>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>> //register the fieldsplits obtained from options. >>>>>>>>> >>>>>>>>> //Setting up user PC for Schur Complement >>>>>>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>>>>>> ierr = >>>>>>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>> >>>>>>>>> KSP *subKsp; >>>>>>>>> PetscInt subKspPos = 0; >>>>>>>>> //Set up nearNullspace for A00 block. >>>>>>>>> ierr = >>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>> MatNullSpace rigidBodyModes; >>>>>>>>> Vec coords; >>>>>>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>>>>>> ierr = >>>>>>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>> Mat matA00; >>>>>>>>> ierr = >>>>>>>>> KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>>>>>> ierr = >>>>>>>>> MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>>>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>> >>>>>>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>>>>>> pressure space >>>>>>>>> subKspPos = 1; >>>>>>>>> ierr = >>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>> //Set up the null space of constant pressure. 
>>>>>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>>>>>> PetscBool isNull; >>>>>>>>> Mat matSc; >>>>>>>>> ierr = >>>>>>>>> KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>>>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>>>>>> if(!isNull) >>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>> pressure null space \n"); >>>>>>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>>>>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>>>>>> if(!isNull) >>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>> system null space \n"); >>>>>>>>> >>>>>>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>>>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>>>>>> >>>>>>>>> >>>>>>>>> PetscFunctionReturn(0); >>>>>>>>> >>>>>>>>> >>>>>>>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley >>>>>>>> > wrote: >>>>>>>>> >>>>>>>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal < >>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal < >>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> > Now, I implemented two different approaches, each for >>>>>>>>>>>>>>>>>>>> both 2D and 3D, in >>>>>>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have >>>>>>>>>>>>>>>>>>>> problems solving it for >>>>>>>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>>>>>>> > faces. Similar split up of K to cell center and faces >>>>>>>>>>>>>>>>>>>> to account for the >>>>>>>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>>>>>>>> variable-viscosity Stokes. 
This is a common problem >>>>>>>>>>>>>>>>>>>> and I recommend >>>>>>>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement >>>>>>>>>>>>>>>>>>>> reduction (make that >>>>>>>>>>>>>>>>>>>> work first, then switch to block preconditioner). You >>>>>>>>>>>>>>>>>>>> can use PCLSC or >>>>>>>>>>>>>>>>>>>> (probably better for you), assemble a preconditioning >>>>>>>>>>>>>>>>>>>> matrix containing >>>>>>>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. >>>>>>>>>>>>>>>>>>>> This diagonal >>>>>>>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, >>>>>>>>>>>>>>>>>>>> depending on >>>>>>>>>>>>>>>>>>>> discretization) approximation of the Schur complement. >>>>>>>>>>>>>>>>>>>> The velocity >>>>>>>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read the >>>>>>>>>>>>>>>>>>>> PCFieldSplit >>>>>>>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know if >>>>>>>>>>>>>>>>>>>> you get stuck. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal >>>>>>>>>>>>>>>>>>> matrix to use as the preconditioner for the Schur complement solve step as >>>>>>>>>>>>>>>>>>> you suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>>>>>>>>>>> A naive approach that I can think of would be to create >>>>>>>>>>>>>>>>>>> a vector with its components as reciprocal viscosities of the cell centers >>>>>>>>>>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the >>>>>>>>>>>>>>>>>>> Petsc distribution of the different rows of the main system matrix over >>>>>>>>>>>>>>>>>>> different processors ? The main matrix was created using the DMDA structure >>>>>>>>>>>>>>>>>>> with 4 dof as explained before. >>>>>>>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but >>>>>>>>>>>>>>>>>>> for the S_p matrix would correspond to only pressure space. Should the >>>>>>>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks >>>>>>>>>>>>>>>>>> blocks. The P distribution is the same as the entire problem divided by 4. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid size >>>>>>>>>>>>>>>>> but with dof=1 instead of 4, the vertices for this new DMDA will be >>>>>>>>>>>>>>>>> identically distributed as for the original DMDA ? Or should I inform PETSc >>>>>>>>>>>>>>>>> by calling a particular function to make these two DMDA have identical >>>>>>>>>>>>>>>>> distribution of the vertices ? >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Yes. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>>>>>>> presence of "fictitious pressure vertices". The system matrix (A) contains >>>>>>>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. 
So the preconditioner >>>>>>>>>>>>>>>>> S_p for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> How can I set this split in the code itself without giving >>>>>>>>>>>>>>> it as a command line option when the system matrix is assembled from the >>>>>>>>>>>>>>> DMDA for the whole system with 4 dofs. (i.e. *without*using the DMComposite or >>>>>>>>>>>>>>> *without* using the nested block matrices to assemble >>>>>>>>>>>>>>> different blocks separately and then combine them together). >>>>>>>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in my >>>>>>>>>>>>>>> code, because not using detect_saddle_point means I cannot use >>>>>>>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> This is currently a real problem with the DMDA. In the >>>>>>>>>>>>>> unstructured case, where we always need specialized spaces, you can >>>>>>>>>>>>>> use something like >>>>>>>>>>>>>> >>>>>>>>>>>>>> PetscObject pressure; >>>>>>>>>>>>>> MatNullSpace nullSpacePres; >>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", >>>>>>>>>>>>>> (PetscObject) nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>>> and then DMGetSubDM() uses this information to attach the >>>>>>>>>>>>>> null space to the IS that is created using the information in the >>>>>>>>>>>>>> PetscSection. >>>>>>>>>>>>>> If you use a PetscSection to set the data layout over the >>>>>>>>>>>>>> DMDA, I think this works correctly, but this has not been tested at all and >>>>>>>>>>>>>> is very >>>>>>>>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>>>>>>>> mechanism, but we are still working it out. >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>>>>>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>>>>>>>>> model with a quickfix even if it means a little dirty code!) >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Bottom line: For custom null spaces using the default layout >>>>>>>>>>>>>> in DMDA, you need to take apart the PCFIELDSPLIT after it has been setup, >>>>>>>>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and >>>>>>>>>>>>>> then reach in and get the PC, and the subKSPs. I don't like this at all, >>>>>>>>>>>>>> but we >>>>>>>>>>>>>> have not reorganized that code (which could be very simple >>>>>>>>>>>>>> and inflexible since its very structured). 
>>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> So I tried to get this approach working but I could not >>>>>>>>>>>>> succeed and encountered some errors. Here is a code snippet: >>>>>>>>>>>>> >>>>>>>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 >>>>>>>>>>>>> dofs (3 velocity components and 1 pressure comp.) >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've the >>>>>>>>>>>>> mNullSpaceSystem based on mDa, that contains a null space basis for the >>>>>>>>>>>>> complete system. >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC that >>>>>>>>>>>>> was obtained from the options (fieldsplit) >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>>>>>>> using the mDaPressure. >>>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Sorry, you need to return to the old DMDA behavior, so you want >>>>>>>>>>>> >>>>>>>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Thanks, with this it seems I can attach the null space properly, >>>>>>>>>>> but I have a question regarding whether the Schur complement ksp solver is >>>>>>>>>>> actually using the preconditioner matrix I provide. >>>>>>>>>>> When using -ksp_view, the outer level pc object of type >>>>>>>>>>> fieldsplit does report that: "Preconditioner for the Schur complement >>>>>>>>>>> formed from user provided matrix", but in the KSP solver for Schur >>>>>>>>>>> complement S, the pc object (fieldsplit_1_) is of type ilu and doesn't say >>>>>>>>>>> that it is using the matrix I provide. Am I missing something here ? >>>>>>>>>>> Below are the relevant commented code snippet and the output of >>>>>>>>>>> the -ksp_view >>>>>>>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type >>>>>>>>>>> schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> If ILU does not error, it means it is using your matrix, because >>>>>>>>>> the Schur complement matrix cannot be factored, and FS says it is using >>>>>>>>>> your matrix. 
>>>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Code snippet: >>>>>>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>>>>>> //The nullspace for the whole system >>>>>>>>>>> ierr = >>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); //Set >>>>>>>>>>> up mKsp with the options provided with fieldsplit and the fields associated >>>>>>>>>>> with the two splits. >>>>>>>>>>> >>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>>>>>>> //Get the fieldsplit pc set up from the options >>>>>>>>>>> >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>>>>>>> >>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>> //Attach the null-space for the Schur complement ksp solver. >>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> the output of the -ksp_view >>>>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>>>> type: gmres >>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>>>>>> Orthogonalization with no iterative refinement >>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> has attached null space >>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>> PC Object: 1 MPI processes >>>>>>>>>>> type: fieldsplit >>>>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>>>> factorization FULL >>>>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>>>> provided matrix >>>>>>>>>>> Split info: >>>>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>>>> Split number 1 Fields 3 >>>>>>>>>>> KSP solver for A00 block >>>>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>> type: gmres >>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>> divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>> type: ilu >>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>> 0 levels of fill >>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>> matrix ordering: natural >>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>> Factored matrix follows: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>> calls =0 >>>>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>>>> used is 5 >>>>>>>>>>> linear system matrix = precond 
matrix: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>>>> total number of mallocs used during MatSetValues calls >>>>>>>>>>> =0 >>>>>>>>>>> using I-node routines: found 729 nodes, limit used >>>>>>>>>>> is 5 >>>>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>>> type: gmres >>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>> divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> has attached null space >>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>>> type: ilu >>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>> 0 levels of fill >>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>> matrix ordering: natural >>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>> Factored matrix follows: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>> calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: schurcomplement >>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>>>> A11 >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>> calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> A10 >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=729, cols=2187 >>>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>> calls =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> KSP of A00 >>>>>>>>>>> KSP Object: >>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>> type: gmres >>>>>>>>>>> GMRES: restart=30, using Classical >>>>>>>>>>> (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>> divergence=10000 >>>>>>>>>>> left preconditioning >>>>>>>>>>> using PRECONDITIONED norm type for convergence >>>>>>>>>>> test >>>>>>>>>>> PC Object: >>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>> type: ilu >>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>> 0 levels of fill >>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>> using diagonal shift on blocks to prevent zero >>>>>>>>>>> pivot >>>>>>>>>>> matrix ordering: natural >>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>> Factored matrix follows: >>>>>>>>>>> Matrix Object: 1 MPI >>>>>>>>>>> processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=2187, cols=2187 
>>>>>>>>>>> package used to perform factorization: >>>>>>>>>>> petsc >>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>> nonzeros=140625 >>>>>>>>>>> total number of mallocs used during >>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>> using I-node routines: found 729 >>>>>>>>>>> nodes, limit used is 5 >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>> nonzeros=140625 >>>>>>>>>>> total number of mallocs used during >>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>> limit used is 5 >>>>>>>>>>> A01 >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=2187, cols=729 >>>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>> calls =0 >>>>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>>>> used is 5 >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>> total number of mallocs used during MatSetValues calls >>>>>>>>>>> =0 >>>>>>>>>>> not using I-node routines >>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>> type: seqaij >>>>>>>>>>> rows=2916, cols=2916, bs=4 >>>>>>>>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> or >>>>>>>>>>>> >>>>>>>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> The errors I get when running with options: -pc_type >>>>>>>>>>>>> fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>>>>> type! >>>>>>>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>> shooting. >>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named >>>>>>>>>>>>> edwards by bkhanal Tue Aug 6 17:35:30 2013 >>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>>>>>>>>>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>>>>>>>>>> -with-clanguage=cxx --download-hypre=1 >>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>>>>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>>>>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>>>>>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>>>>>>>>>> WARNING! There are options you set that were not used! >>>>>>>>>>>>> WARNING! could be spelling mistake, etc! >>>>>>>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> -- >>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -- >>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>> their experiments lead. >>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bisheshkh at gmail.com Fri Aug 23 08:42:32 2013 From: bisheshkh at gmail.com (Bishesh Khanal) Date: Fri, 23 Aug 2013 15:42:32 +0200 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Fri, Aug 23, 2013 at 3:34 PM, Matthew Knepley wrote: > On Fri, Aug 23, 2013 at 8:30 AM, Bishesh Khanal wrote: > >> >> >> >> On Fri, Aug 23, 2013 at 3:16 PM, Matthew Knepley wrote: >> >>> On Fri, Aug 23, 2013 at 8:01 AM, Bishesh Khanal wrote: >>> >>>> >>>> >>>> >>>> On Fri, Aug 23, 2013 at 2:53 PM, Matthew Knepley wrote: >>>> >>>>> On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote: >>>>> >>>>>> >>>>>> >>>>>> >>>>>> On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote: >>>>>> >>>>>>> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal >>>>>> > wrote: >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley >>>>>>> > wrote: >>>>>>>> >>>>>>>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal < >>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> Thanks Matt and Mark for comments in using near null space >>>>>>>>>> [question I asked in the thread with subject: *problem >>>>>>>>>> (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with >>>>>>>>>> multiple nodes in a cluster*]. >>>>>>>>>> So I understood that I have to set a nearNullSpace to A00 block >>>>>>>>>> where the null space correspond to the rigid body motion. I tried it but >>>>>>>>>> still the gamg just keeps on iterating and convergence is very very slow. 
I
>>>>>>>>>> am not sure what the problem is, right now gamg does not even work for the
>>>>>>>>>> constant viscosity case.
>>>>>>>>>> I have set up the following in my code:
>>>>>>>>>> 1. null space for the whole system A 2. null space for the Schur
>>>>>>>>>> complement S 3. Near null space for A00 4. a user preconditioner matrix of
>>>>>>>>>> inverse viscosity in the diagonal for S.
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>> If you want to debug solvers, you HAVE to send -ksp_view.
>>>>>>>>>
>>>>>>>>
>>>>>>>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on so
>>>>>>>> didn't get to the end to get -ksp_view results.
>>>>>>>> Instead here I have put the -ksp_view output when running the
>>>>>>>> program with following options: (In this case I get the results)
>>>>>>>> -pc_type fieldsplit -pc_fieldsplit_type schur
>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2
>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view
>>>>>>>>
>>>>>>>
>>>>>>> Okay, that looks fine. Does
>>>>>>>
>>>>>>> -fieldsplit_0_pc_type lu
>>>>>>> - fieldsplit_1_ksp_rtol 1.0e-10
>>>>>>>
>>>>>>> converge in one Iterate?
>>>>>>>
>>>>>>> What matrix did you attach as the preconditioner matrix for
>>>>>>> fieldsplit_1_?
>>>>>>>
>>>>>>
>>>>>>
>>>>>> I used a diagonal matrix with reciprocal of viscosity values of the
>>>>>> corresponding cell centers as the preconditioner.
>>>>>>
>>>>>> with the options -fieldsplit_0_pc_type lu - fieldsplit_1_ksp_rtol
>>>>>> 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason
>>>>>> I get the following output which means the outer ksp did converge in
>>>>>> one iterate I guess.
>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18
>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18
>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 1
>>>>>>
>>>>>
>>>>> Okay, so A_00 is nonsingular, and the system seems to solve alright.
>>>>> What do you get for
>>>>>
>>>>> -fieldsplit_0_ksp_max_it 30
>>>>> -fieldsplit_0_pc_type gamg
>>>>> -fieldsplit_0_ksp_converged_reason
>>>>> -fieldsplit_1_ksp_converged_reason
>>>>>
>>>>>
>>>>
>>>> It fieldsplit_0_ does not converge in 30 iterations. It gives:
>>>> Linear solve converged due to CONVERGED_ATOL iterations 0
>>>> Linear solve did not converge due to DIVERGED_ITS iterations 30
>>>>
>>>> and continues with the same message.
>>>>
>>>
>>> So what would you do? Give up?
>>>
>>> No, I don't want to give up :)
>>
>>
>>> -fieldsplit_0_ksp_gmres_restart 200
>>>
>>> The idea is to figure out what is going on:
>>>
>>> -fieldsplit_0_ksp_monitor_true_residual
>>>
>>> I have tried these options before too, the residual is decreasing very
>> very slowly, but I've not been able to figure out why. (using hypre does
>> converge although slowly again, but I had problems using hypre with
>> multiple nodes in a cluster with segmentation fault (we discussed that in
>> another thread!) )
>>
>
> Put in the Laplacian instead of the operator you have now. It should
> converge in a few iterates. If not, you have a problem
> in the specification.
>
> If so, put in linear elasticity. If it is slow, you have made a mistake
> specifying the near null space. Also, you need to check
> that the near null space made it to GAMG using the ksp_view output.
>

Which operator are you referring to? The one in the A00 block? I'm currently
testing the constant viscosity case, which means the A00 block has
\mu div(grad(v)), which is a Laplacian.
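To make sure I run the test you mean: if "the Laplacian" is the plain 7-point
Laplacian applied to each velocity component, I could put that in directly
inside computeMatrixTaras3D. A rough sketch for one interior vx row, ignoring
boundaries, grid scaling and the pressure coupling (schematic only, not my
actual assembly code; A stands for the operator being assembled):

    MatStencil  row, col[7];
    PetscScalar v[7];
    row.i = i; row.j = j; row.k = k; row.c = 0;     //c = 0 => the vx component
    col[0] = row;                 v[0] =  6.0;      //diagonal of the (negative) Laplacian
    col[1] = row; col[1].i = i-1; v[1] = -1.0;
    col[2] = row; col[2].i = i+1; v[2] = -1.0;
    col[3] = row; col[3].j = j-1; v[3] = -1.0;
    col[4] = row; col[4].j = j+1; v[4] = -1.0;
    col[5] = row; col[5].k = k-1; v[5] = -1.0;
    col[6] = row; col[6].k = k+1; v[6] = -1.0;
    ierr = MatSetValuesStencil(A,1,&row,7,col,v,INSERT_VALUES);CHKERRQ(ierr);

and the same with row.c = 1 and row.c = 2 for vy and vz. Is that the test you
have in mind?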
And is it possible to view the ksp_view output before the solver actually
converges, to check whether GAMG took the near null space?
>
> Matt
>
>
>> e.g a snapshot of the output:
>>
>>  Residual norms for fieldsplit_0_ solve.
>>  0 KSP preconditioned resid norm 0.000000000000e+00 true resid norm
>> 0.000000000000e+00 ||r(i)||/||b|| -nan
>> Linear solve converged due to CONVERGED_ATOL iterations 0
>>   Residual norms for fieldsplit_0_ solve.
>>   0 KSP preconditioned resid norm 2.619231455875e-01 true resid norm
>> 3.637306695895e+02 ||r(i)||/||b|| 1.000000000000e+00
>>   1 KSP preconditioned resid norm 9.351491725479e-02 true resid norm
>> 6.013334574957e+01 ||r(i)||/||b|| 1.653238255038e-01
>>   2 KSP preconditioned resid norm 6.010357491087e-02 true resid norm
>> 3.664473273769e+01 ||r(i)||/||b|| 1.007468871928e-01
>>   3 KSP preconditioned resid norm 6.006968012944e-02 true resid norm
>> 3.696451770148e+01 ||r(i)||/||b|| 1.016260678353e-01
>>   4 KSP preconditioned resid norm 4.418407037098e-02 true resid norm
>> 3.184810838034e+01 ||r(i)||/||b|| 8.755959022176e-02
>> ...
>> ...
>>  93 KSP preconditioned resid norm 4.549506047737e-04 true resid norm
>> 2.877594552685e+00 ||r(i)||/||b|| 7.911333283864e-03
>>  94 KSP preconditioned resid norm 4.515424416235e-04 true resid norm
>> 2.875249044668e+00 ||r(i)||/||b|| 7.904884809172e-03
>>  95 KSP preconditioned resid norm 4.277647876573e-04 true resid norm
>> 2.830418831358e+00 ||r(i)||/||b|| 7.781633686685e-03
>>  96 KSP preconditioned resid norm 4.244529173876e-04 true resid norm
>> 2.807041401408e+00 ||r(i)||/||b|| 7.717362422521e-03
>>  97 KSP preconditioned resid norm 4.138326570674e-04 true resid norm
>> 2.793663020386e+00 ||r(i)||/||b|| 7.680581413547e-03
>>  98 KSP preconditioned resid norm 3.869979433609e-04 true resid norm
>> 2.715150386650e+00 ||r(i)||/||b|| 7.464727650583e-03
>>  99 KSP preconditioned resid norm 3.847873979265e-04 true resid norm
>> 2.706008990336e+00 ||r(i)||/||b|| 7.439595328571e-03
>>
>> ....
>> ....
>> 294 KSP preconditioned resid norm 1.416482289961e-04 true resid norm
>> 2.735750748819e+00 ||r(i)||/||b|| 7.521363958412e-03
>> 295 KSP preconditioned resid norm 1.415389087364e-04 true resid norm
>> 2.742638608355e+00 ||r(i)||/||b|| 7.540300661064e-03
>> 296 KSP preconditioned resid norm 1.414967651105e-04 true resid norm
>> 2.747224243968e+00 ||r(i)||/||b|| 7.552907889424e-03
>> 297 KSP preconditioned resid norm 1.413843018303e-04 true resid norm
>> 2.752574248710e+00 ||r(i)||/||b|| 7.567616587891e-03
>> 298 KSP preconditioned resid norm 1.411747949695e-04 true resid norm
>> 2.765459647367e+00 ||r(i)||/||b|| 7.603042246859e-03
>> 299 KSP preconditioned resid norm 1.411609742082e-04 true resid norm
>> 2.765900464868e+00 ||r(i)||/||b|| 7.604254180683e-03
>> 300 KSP preconditioned resid norm 1.409844332838e-04 true resid norm
>> 2.771790506811e+00 ||r(i)||/||b|| 7.620447596402e-03
>> Linear solve did not converge due to DIVERGED_ITS iterations 300
>>   Residual norms for fieldsplit_0_ solve.
>>   0 KSP preconditioned resid norm 1.294272083271e-03 true resid norm
>> 1.776945075651e+00 ||r(i)||/||b|| 1.000000000000e+00
>> ...
>> ...
>>
>>
>>
>>> Matt
>>>
>>>
>>>>
>>>>
>>>>
>>>>> This is the kind of investigation you must be comfortable with if you
>>>>> want to experiment with these solvers.
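If I understand the API correctly, one way could be to query the A00 operator
right after KSPSetUp() and before KSPSolve(), and to view the sub-KSP at that
point. A rough sketch, reusing the subKsp array from my solveModel() code
above (I am assuming MatGetNearNullSpace is available in this PETSc version,
and attachedNns is just a hypothetical variable name):

    Mat          matA00;
    MatNullSpace attachedNns;
    ierr = KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr);
    ierr = MatGetNearNullSpace(matA00,&attachedNns);CHKERRQ(ierr);
    if(!attachedNns)
        SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"near null space did not reach A00 \n");
    //Print the fieldsplit_0_ KSP/PC configuration without waiting for the solve:
    ierr = KSPView(subKsp[0],PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);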
>>>>> >>>>> Matt >>>>> >>>>> >>>>>> >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>>>>>>> KSP Object: 1 MPI processes >>>>>>>> type: gmres >>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>>> Orthogonalization with no iterative refinement >>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>> left preconditioning >>>>>>>> has attached null space >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: 1 MPI processes >>>>>>>> type: fieldsplit >>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>> factorization FULL >>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>> provided matrix >>>>>>>> Split info: >>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>> Split number 1 Fields 3 >>>>>>>> KSP solver for A00 block >>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>> type: gmres >>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>> divergence=10000 >>>>>>>> left preconditioning >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>> type: ilu >>>>>>>> ILU: out-of-place factorization >>>>>>>> 0 levels of fill >>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>> matrix ordering: natural >>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>> Factored matrix follows: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=8232, cols=8232 >>>>>>>> package used to perform factorization: petsc >>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>> used is 5 >>>>>>>> linear system matrix = precond matrix: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=8232, cols=8232 >>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>> type: gmres >>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>> divergence=10000 >>>>>>>> left preconditioning >>>>>>>> has attached null space >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>> type: ilu >>>>>>>> ILU: out-of-place factorization >>>>>>>> 0 levels of fill >>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>> matrix ordering: natural >>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>> Factored matrix follows: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2744, cols=2744 
>>>>>>>> package used to perform factorization: petsc >>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> not using I-node routines >>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: schurcomplement >>>>>>>> rows=2744, cols=2744 >>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>> A11 >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2744, cols=2744 >>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> not using I-node routines >>>>>>>> A10 >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2744, cols=8232 >>>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> not using I-node routines >>>>>>>> KSP of A00 >>>>>>>> KSP Object: >>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>> type: gmres >>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>> divergence=10000 >>>>>>>> left preconditioning >>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>> PC Object: (fieldsplit_0_) >>>>>>>> 1 MPI processes >>>>>>>> type: ilu >>>>>>>> ILU: out-of-place factorization >>>>>>>> 0 levels of fill >>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>> using diagonal shift on blocks to prevent zero >>>>>>>> pivot >>>>>>>> matrix ordering: natural >>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>> Factored matrix follows: >>>>>>>> Matrix Object: 1 MPI >>>>>>>> processes >>>>>>>> type: seqaij >>>>>>>> rows=8232, cols=8232 >>>>>>>> package used to perform factorization: petsc >>>>>>>> total: nonzeros=576000, allocated >>>>>>>> nonzeros=576000 >>>>>>>> total number of mallocs used during >>>>>>>> MatSetValues calls =0 >>>>>>>> using I-node routines: found 2744 nodes, >>>>>>>> limit used is 5 >>>>>>>> linear system matrix = precond matrix: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=8232, cols=8232 >>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>> used is 5 >>>>>>>> A01 >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=8232, cols=2744 >>>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>>> total number of mallocs used during MatSetValues >>>>>>>> calls =0 >>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>> used is 5 >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=2744, cols=2744 >>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> not using I-node routines >>>>>>>> linear system matrix = precond matrix: >>>>>>>> Matrix Object: 1 MPI processes >>>>>>>> type: seqaij >>>>>>>> rows=10976, cols=10976, bs=4 >>>>>>>> total: nonzeros=1024000, allocated nonzeros=1024000 >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> Matt 
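The "user provided matrix" reported in the view above is the diagonal inverse-viscosity matrix discussed earlier in the thread. A minimal sketch of assembling such a matrix on a matching dof=1 DMDA could look like the following; the function name buildSchurPC and the helper cellViscosity() are hypothetical, not code from this thread:

PetscErrorCode buildSchurPC(DM daPres, Mat Sp)
{
  PetscInt       i, j, k, xs, ys, zs, xm, ym, zm;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* Loop only over the locally owned part of the pressure grid. */
  ierr = DMDAGetCorners(daPres, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
  for (k = zs; k < zs + zm; ++k) {
    for (j = ys; j < ys + ym; ++j) {
      for (i = xs; i < xs + xm; ++i) {
        MatStencil  row;
        PetscScalar v = 1.0/cellViscosity(i, j, k); /* reciprocal viscosity at this cell center (hypothetical helper) */
        row.i = i; row.j = j; row.k = k; row.c = 0;
        ierr = MatSetValuesStencil(Sp, 1, &row, 1, &row, &v, INSERT_VALUES);CHKERRQ(ierr);
      }
    }
  }
  ierr = MatAssemblyBegin(Sp, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(Sp, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Creating Sp with DMCreateMatrix() on the dof=1 DMDA keeps its row distribution identical to the pressure layout of the full system, and also attaches the stencil information that MatSetValuesStencil() needs.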
>>>>>>>>> >>>>>>>>> >>>>>>>>>> I am testing a small problem with CONSTANT viscosity for grid >>>>>>>>>> size of 14^3 with the run time option: >>>>>>>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>>>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>>>>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>>>>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>>>>>>> >>>>>>>>>> Here is my relevant code of the solve function: >>>>>>>>>> PetscErrorCode ierr; >>>>>>>>>> PetscFunctionBeginUser; >>>>>>>>>> ierr = >>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>> ierr = >>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa >>>>>>>>>> with dof = 4, vx,vy,vz and p. >>>>>>>>>> ierr = >>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for the main >>>>>>>>>> system >>>>>>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>> //register the fieldsplits obtained from options. >>>>>>>>>> >>>>>>>>>> //Setting up user PC for Schur Complement >>>>>>>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>>>>>>> ierr = >>>>>>>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>>> KSP *subKsp; >>>>>>>>>> PetscInt subKspPos = 0; >>>>>>>>>> //Set up nearNullspace for A00 block. >>>>>>>>>> ierr = >>>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>>> MatNullSpace rigidBodyModes; >>>>>>>>>> Vec coords; >>>>>>>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>>>>>>> ierr = >>>>>>>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>> Mat matA00; >>>>>>>>>> ierr = >>>>>>>>>> KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>> ierr = >>>>>>>>>> MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>>>>>>> pressure space >>>>>>>>>> subKspPos = 1; >>>>>>>>>> ierr = >>>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>>> //Set up the null space of constant pressure. 
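>>>>>>>>>> /* A sketch (not the poster's code) of how a constant-pressure null space
>>>>>>>>>>    like mNullSpaceP could have been created beforehand:
>>>>>>>>>>      ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,NULL,&mNullSpaceP);CHKERRQ(ierr);
>>>>>>>>>>    PETSC_TRUE says the constant vector spans the null space, since the
>>>>>>>>>>    pressure is only determined up to a constant here. */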
>>>>>>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>>>>>>> PetscBool isNull; >>>>>>>>>> Mat matSc; >>>>>>>>>> ierr = >>>>>>>>>> KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>>>>>>> if(!isNull) >>>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>>> pressure null space \n"); >>>>>>>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>>>>>>> if(!isNull) >>>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>>> system null space \n"); >>>>>>>>>> >>>>>>>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>>>>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> PetscFunctionReturn(0); >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal < >>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal < >>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> > Now, I implemented two different approaches, each >>>>>>>>>>>>>>>>>>>>> for both 2D and 3D, in >>>>>>>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have >>>>>>>>>>>>>>>>>>>>> problems solving it for >>>>>>>>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>>>>>>>> > faces. Similar split up of K to cell center and >>>>>>>>>>>>>>>>>>>>> faces to account for the >>>>>>>>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>>>>>>>>> variable-viscosity Stokes. 
This is a common problem >>>>>>>>>>>>>>>>>>>>> and I recommend >>>>>>>>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement >>>>>>>>>>>>>>>>>>>>> reduction (make that >>>>>>>>>>>>>>>>>>>>> work first, then switch to block preconditioner). You >>>>>>>>>>>>>>>>>>>>> can use PCLSC or >>>>>>>>>>>>>>>>>>>>> (probably better for you), assemble a preconditioning >>>>>>>>>>>>>>>>>>>>> matrix containing >>>>>>>>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. >>>>>>>>>>>>>>>>>>>>> This diagonal >>>>>>>>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, >>>>>>>>>>>>>>>>>>>>> depending on >>>>>>>>>>>>>>>>>>>>> discretization) approximation of the Schur complement. >>>>>>>>>>>>>>>>>>>>> The velocity >>>>>>>>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read >>>>>>>>>>>>>>>>>>>>> the PCFieldSplit >>>>>>>>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know if >>>>>>>>>>>>>>>>>>>>> you get stuck. >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I was trying to assemble the inverse viscosity diagonal >>>>>>>>>>>>>>>>>>>> matrix to use as the preconditioner for the Schur complement solve step as >>>>>>>>>>>>>>>>>>>> you suggested. I've few questions about the ways to implement this in Petsc: >>>>>>>>>>>>>>>>>>>> A naive approach that I can think of would be to create >>>>>>>>>>>>>>>>>>>> a vector with its components as reciprocal viscosities of the cell centers >>>>>>>>>>>>>>>>>>>> corresponding to the pressure variables, and then create a diagonal matrix >>>>>>>>>>>>>>>>>>>> from this vector. However I'm not sure about: >>>>>>>>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to the >>>>>>>>>>>>>>>>>>>> Petsc distribution of the different rows of the main system matrix over >>>>>>>>>>>>>>>>>>>> different processors ? The main matrix was created using the DMDA structure >>>>>>>>>>>>>>>>>>>> with 4 dof as explained before. >>>>>>>>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but >>>>>>>>>>>>>>>>>>>> for the S_p matrix would correspond to only pressure space. Should the >>>>>>>>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never breaks >>>>>>>>>>>>>>>>>>> blocks. The P distribution is the same as the entire problem divided by 4. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid >>>>>>>>>>>>>>>>>> size but with dof=1 instead of 4, the vertices for this new DMDA will be >>>>>>>>>>>>>>>>>> identically distributed as for the original DMDA ? Or should I inform PETSc >>>>>>>>>>>>>>>>>> by calling a particular function to make these two DMDA have identical >>>>>>>>>>>>>>>>>> distribution of the vertices ? >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Yes. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>>>>>>>> presence of "fictitious pressure vertices". The system matrix (A) contains >>>>>>>>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. 
>>>>>>>>>>>>>>>>>> So the preconditioner S_p for the Schur complement outer solve with Sp = h'
>>>>>>>>>>>>>>>>>> will also need to correspond to only the non-fictitious P-nodes. This means
>>>>>>>>>>>>>>>>>> its size does not directly correspond to the DMDA grid defined for the
>>>>>>>>>>>>>>>>>> original problem. Could you please suggest an efficient way of assembling
>>>>>>>>>>>>>>>>>> this S_p matrix ?
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Don't use detect_saddle, but split it by fields
>>>>>>>>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> How can I set this split in the code itself without giving it as a command
>>>>>>>>>>>>>>>> line option when the system matrix is assembled from the DMDA for the whole
>>>>>>>>>>>>>>>> system with 4 dofs (i.e. *without* using the DMComposite or *without* using
>>>>>>>>>>>>>>>> the nested block matrices to assemble different blocks separately and then
>>>>>>>>>>>>>>>> combine them together)?
>>>>>>>>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in my code, because
>>>>>>>>>>>>>>>> not using detect_saddle_point means I cannot use
>>>>>>>>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of the identity
>>>>>>>>>>>>>>>> for the fictitious pressure nodes present in the fieldsplit_1_ block. I need
>>>>>>>>>>>>>>>> to use PCFieldSplitGetSubKSP() so that I can set a proper null-space basis.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> This is currently a real problem with the DMDA. In the unstructured case,
>>>>>>>>>>>>>>> where we always need specialized spaces, you can use something like
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>   PetscObject  pressure;
>>>>>>>>>>>>>>>   MatNullSpace nullSpacePres;
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>   ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr);
>>>>>>>>>>>>>>>   ierr = MatNullSpaceCreate(PetscObjectComm(pressure), PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr);
>>>>>>>>>>>>>>>   ierr = PetscObjectCompose(pressure, "nullspace", (PetscObject) nullSpacePres);CHKERRQ(ierr);
>>>>>>>>>>>>>>>   ierr = MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr);
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> and then DMGetSubDM() uses this information to attach the null space to the
>>>>>>>>>>>>>>> IS that is created using the information in the PetscSection.
>>>>>>>>>>>>>>> If you use a PetscSection to set the data layout over the DMDA, I think this
>>>>>>>>>>>>>>> works correctly, but this has not been tested at all and is very new code.
>>>>>>>>>>>>>>> Eventually, I think we want all DMs to use this mechanism, but we are still
>>>>>>>>>>>>>>> working it out.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Currently I do not use PetscSection. If this makes a cleaner approach, I'd
>>>>>>>>>>>>>> try it too, but maybe a bit later (right now I'd like to test my model with
>>>>>>>>>>>>>> a quickfix even if it means a little dirty code!)
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Bottom line: For custom null spaces using the default layout in DMDA, you
>>>>>>>>>>>>>>> need to take apart the PCFIELDSPLIT after it has been set up, which is
>>>>>>>>>>>>>>> somewhat subtle. You need to call KSPSetUp() and then reach in and get the
>>>>>>>>>>>>>>> PC, and the subKSPs. I don't like this at all, but we have not reorganized
>>>>>>>>>>>>>>> that code (which could be very simple and inflexible since it's very
>>>>>>>>>>>>>>> structured).
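A minimal sketch of the sequence just described (variable names are illustrative, not from this thread; the poster's snippet further below does essentially the same thing):

  KSP          *subKsp;
  PetscInt      nSplits;
  PC            pc;
  MatNullSpace  pressureNullSpace; /* assumed created earlier, e.g. with MatNullSpaceCreate() */

  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);              /* builds the PCFIELDSPLIT and its splits */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCFieldSplitGetSubKSP(pc, &nSplits, &subKsp);CHKERRQ(ierr);
  ierr = KSPSetNullSpace(subKsp[1], pressureNullSpace);CHKERRQ(ierr); /* split 1 is the Schur complement solve */
  ierr = PetscFree(subKsp);CHKERRQ(ierr);          /* PCFieldSplitGetSubKSP() allocates the array */
  ierr = KSPSolve(ksp, NULL, NULL);CHKERRQ(ierr);

Before KSPSetUp() the fieldsplit PC has not created its sub-KSPs yet, which is why the order above matters.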
>>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> So I tried to get this approach working but I could not >>>>>>>>>>>>>> succeed and encountered some errors. Here is a code snippet: >>>>>>>>>>>>>> >>>>>>>>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 >>>>>>>>>>>>>> dofs (3 velocity components and 1 pressure comp.) >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've the >>>>>>>>>>>>>> mNullSpaceSystem based on mDa, that contains a null space basis for the >>>>>>>>>>>>>> complete system. >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC >>>>>>>>>>>>>> that was obtained from the options (fieldsplit) >>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>>>>>>>> using the mDaPressure. >>>>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Sorry, you need to return to the old DMDA behavior, so you want >>>>>>>>>>>>> >>>>>>>>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Thanks, with this it seems I can attach the null space >>>>>>>>>>>> properly, but I have a question regarding whether the Schur complement ksp >>>>>>>>>>>> solver is actually using the preconditioner matrix I provide. >>>>>>>>>>>> When using -ksp_view, the outer level pc object of type >>>>>>>>>>>> fieldsplit does report that: "Preconditioner for the Schur complement >>>>>>>>>>>> formed from user provided matrix", but in the KSP solver for Schur >>>>>>>>>>>> complement S, the pc object (fieldsplit_1_) is of type ilu and doesn't say >>>>>>>>>>>> that it is using the matrix I provide. Am I missing something here ? 
>>>>>>>>>>>> Below are the relevant commented code snippet and the output of >>>>>>>>>>>> the -ksp_view >>>>>>>>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type >>>>>>>>>>>> schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> If ILU does not error, it means it is using your matrix, because >>>>>>>>>>> the Schur complement matrix cannot be factored, and FS says it is using >>>>>>>>>>> your matrix. >>>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> Code snippet: >>>>>>>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>>>>>>> //The nullspace for the whole system >>>>>>>>>>>> ierr = >>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>> //Set up mKsp with the options provided with fieldsplit and the fields >>>>>>>>>>>> associated with the two splits. >>>>>>>>>>>> >>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>> //Get the fieldsplit pc set up from the options >>>>>>>>>>>> >>>>>>>>>>>> ierr = >>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>>>>>>>> >>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>> ierr = >>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>> //Attach the null-space for the Schur complement ksp solver. >>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> the output of the -ksp_view >>>>>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>>>>> type: gmres >>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>> left preconditioning >>>>>>>>>>>> has attached null space >>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>> PC Object: 1 MPI processes >>>>>>>>>>>> type: fieldsplit >>>>>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>>>>> factorization FULL >>>>>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>>>>> provided matrix >>>>>>>>>>>> Split info: >>>>>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>>>>> Split number 1 Fields 3 >>>>>>>>>>>> KSP solver for A00 block >>>>>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>> type: gmres >>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>> divergence=10000 >>>>>>>>>>>> left preconditioning >>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>> type: ilu >>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 
>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>>>>> used is 5 >>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>> calls =0 >>>>>>>>>>>> using I-node routines: found 729 nodes, limit used >>>>>>>>>>>> is 5 >>>>>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>>>> type: gmres >>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>> divergence=10000 >>>>>>>>>>>> left preconditioning >>>>>>>>>>>> has attached null space >>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>>>> type: ilu >>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>> not using I-node routines >>>>>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: schurcomplement >>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>>>>> A11 >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>> not using I-node routines >>>>>>>>>>>> A10 >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=729, cols=2187 >>>>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>> not using I-node routines >>>>>>>>>>>> KSP of A00 >>>>>>>>>>>> KSP Object: >>>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>> type: gmres >>>>>>>>>>>> GMRES: restart=30, using Classical >>>>>>>>>>>> (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>> 
tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>> divergence=10000 >>>>>>>>>>>> left preconditioning >>>>>>>>>>>> using PRECONDITIONED norm type for convergence >>>>>>>>>>>> test >>>>>>>>>>>> PC Object: >>>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>> type: ilu >>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>> using diagonal shift on blocks to prevent >>>>>>>>>>>> zero pivot >>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>> Matrix Object: 1 >>>>>>>>>>>> MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>> package used to perform factorization: >>>>>>>>>>>> petsc >>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>> using I-node routines: found 729 >>>>>>>>>>>> nodes, limit used is 5 >>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>> limit used is 5 >>>>>>>>>>>> A01 >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=2187, cols=729 >>>>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>>>>> used is 5 >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>> calls =0 >>>>>>>>>>>> not using I-node routines >>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>> type: seqaij >>>>>>>>>>>> rows=2916, cols=2916, bs=4 >>>>>>>>>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> or >>>>>>>>>>>>> >>>>>>>>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> The errors I get when running with options: -pc_type >>>>>>>>>>>>>> fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this object >>>>>>>>>>>>>> type! >>>>>>>>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>> updates. >>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>> shooting. 
>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug >>>>>>>>>>>>>> named edwards by bkhanal Tue Aug 6 17:35:30 2013 >>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 >>>>>>>>>>>>>> --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>>>>>>>>>>> -with-clanguage=cxx --download-hypre=1 >>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>>>>>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>>>>>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>>>>>>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>>>>>>>>>>> WARNING! There are options you set that were not used! >>>>>>>>>>>>>> WARNING! could be spelling mistake, etc! >>>>>>>>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -- >>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>> their experiments lead. >>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -- >>>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>>> experiments lead. >>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 23 08:45:24 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 23 Aug 2013 08:45:24 -0500 Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid In-Reply-To: References: <87li5555oo.fsf@mcs.anl.gov> Message-ID: On Fri, Aug 23, 2013 at 8:42 AM, Bishesh Khanal wrote: > > > > On Fri, Aug 23, 2013 at 3:34 PM, Matthew Knepley wrote: > >> On Fri, Aug 23, 2013 at 8:30 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Fri, Aug 23, 2013 at 3:16 PM, Matthew Knepley wrote: >>> >>>> On Fri, Aug 23, 2013 at 8:01 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Fri, Aug 23, 2013 at 2:53 PM, Matthew Knepley wrote: >>>>> >>>>>> On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal < >>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal < >>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Thanks Matt and Mark for comments in using near null space >>>>>>>>>>> [question I asked in the thread with subject: *problem >>>>>>>>>>> (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with >>>>>>>>>>> multiple nodes in a cluster*]. 
>>>>>>>>>>> So I understood that I have to set a nearNullSpace to the A00 block, where
>>>>>>>>>>> the null space corresponds to the rigid body motion. I tried it, but still
>>>>>>>>>>> the gamg just keeps on iterating and convergence is very very slow. I am
>>>>>>>>>>> not sure what the problem is; right now gamg does not even work for the
>>>>>>>>>>> constant viscosity case.
>>>>>>>>>>> I have set up the following in my code:
>>>>>>>>>>> 1. null space for the whole system A
>>>>>>>>>>> 2. null space for the Schur complement S
>>>>>>>>>>> 3. near null space for A00
>>>>>>>>>>> 4. a user preconditioner matrix of inverse viscosity in the diagonal for S.
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> If you want to debug solvers, you HAVE to send -ksp_view.
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on, so I didn't
>>>>>>>>> get to the end to get -ksp_view results.
>>>>>>>>> Instead, here I have put the -ksp_view output when running the program with
>>>>>>>>> the following options (in this case I get the results):
>>>>>>>>> -pc_type fieldsplit -pc_fieldsplit_type schur
>>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2
>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view
>>>>>>>>>
>>>>>>>>
>>>>>>>> Okay, that looks fine. Does
>>>>>>>>
>>>>>>>> -fieldsplit_0_pc_type lu
>>>>>>>> -fieldsplit_1_ksp_rtol 1.0e-10
>>>>>>>>
>>>>>>>> converge in one iterate?
>>>>>>>>
>>>>>>>> What matrix did you attach as the preconditioner matrix for
>>>>>>>> fieldsplit_1_?
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I used a diagonal matrix with the reciprocal viscosity values of the
>>>>>>> corresponding cell centers as the preconditioner.
>>>>>>>
>>>>>>> With the options -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_rtol 1.0e-10
>>>>>>> -fieldsplit_1_ksp_converged_reason -ksp_converged_reason
>>>>>>> I get the following output, which means the outer ksp did converge in one
>>>>>>> iterate, I guess:
>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18
>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18
>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 1
>>>>>>>
>>>>>>
>>>>>> Okay, so A_00 is nonsingular, and the system seems to solve alright. What do
>>>>>> you get for
>>>>>>
>>>>>> -fieldsplit_0_ksp_max_it 30
>>>>>> -fieldsplit_0_pc_type gamg
>>>>>> -fieldsplit_0_ksp_converged_reason
>>>>>> -fieldsplit_1_ksp_converged_reason
>>>>>>
>>>>>>
>>>>>
>>>>> The fieldsplit_0_ solve does not converge in 30 iterations. It gives:
>>>>> Linear solve converged due to CONVERGED_ATOL iterations 0
>>>>> Linear solve did not converge due to DIVERGED_ITS iterations 30
>>>>>
>>>>> and continues with the same message.
>>>>>
>>>>
>>>> So what would you do? Give up?
>>>>
>>>> No, I don't want to give up :)
>>>
>>>
>>>> -fieldsplit_0_ksp_gmres_restart 200
>>>>
>>>> The idea is to figure out what is going on:
>>>>
>>>> -fieldsplit_0_ksp_monitor_true_residual
>>>>
>>> I have tried these options before too; the residual is decreasing very, very
>>> slowly, but I've not been able to figure out why. (Using hypre does converge,
>>> although slowly again, but I had problems using hypre with multiple nodes in
>>> a cluster with a segmentation fault; we discussed that in another thread!)
>>>
>>
>> Put in the Laplacian instead of the operator you have now. It should
>> converge in a few iterates. If not, you have a problem in the specification.
>>
>> If so, put in linear elasticity. If it is slow, you have made a mistake
>> specifying the near null space.
>> Also, you need to check that the near null space made it to GAMG using the
>> ksp_view output.
>>
>
> Which operator are you referring to ? The one in the A00 block ? I'm testing
> currently with the constant viscosity case, which means the A00 block has
> \mu div(grad(v)), which is a Laplacian.
> And is it possible to view the ksp_view output before the solver actually
> converges to check if GAMG took the near null space ?
>

1) Make mu 1.0

2) The nullspace does not matter at all for the Laplacian, so turn it off

If it does not take < 5 iterations, that is not the Laplacian. There are
plenty of FD Laplacians in PETSc, like SNES ex5, that you can run GAMG on to
test. You should consider getting an exact solution and testing with that as
well, since it appears your operator is not what you think it is.

   Matt

>
>> Matt
>>
>>
>>> e.g. a snapshot of the output:
>>>
>>> Residual norms for fieldsplit_0_ solve.
>>> 0 KSP preconditioned resid norm 0.000000000000e+00 true resid norm
>>> 0.000000000000e+00 ||r(i)||/||b|| -nan
>>> Linear solve converged due to CONVERGED_ATOL iterations 0
>>> Residual norms for fieldsplit_0_ solve.
>>> 0 KSP preconditioned resid norm 2.619231455875e-01 true resid norm
>>> 3.637306695895e+02 ||r(i)||/||b|| 1.000000000000e+00
>>> 1 KSP preconditioned resid norm 9.351491725479e-02 true resid norm
>>> 6.013334574957e+01 ||r(i)||/||b|| 1.653238255038e-01
>>> 2 KSP preconditioned resid norm 6.010357491087e-02 true resid norm
>>> 3.664473273769e+01 ||r(i)||/||b|| 1.007468871928e-01
>>> 3 KSP preconditioned resid norm 6.006968012944e-02 true resid norm
>>> 3.696451770148e+01 ||r(i)||/||b|| 1.016260678353e-01
>>> 4 KSP preconditioned resid norm 4.418407037098e-02 true resid norm
>>> 3.184810838034e+01 ||r(i)||/||b|| 8.755959022176e-02
>>> ...
>>> ...
>>> 93 KSP preconditioned resid norm 4.549506047737e-04 true resid norm
>>> 2.877594552685e+00 ||r(i)||/||b|| 7.911333283864e-03
>>> 94 KSP preconditioned resid norm 4.515424416235e-04 true resid norm
>>> 2.875249044668e+00 ||r(i)||/||b|| 7.904884809172e-03
>>> 95 KSP preconditioned resid norm 4.277647876573e-04 true resid norm
>>> 2.830418831358e+00 ||r(i)||/||b|| 7.781633686685e-03
>>> 96 KSP preconditioned resid norm 4.244529173876e-04 true resid norm
>>> 2.807041401408e+00 ||r(i)||/||b|| 7.717362422521e-03
>>> 97 KSP preconditioned resid norm 4.138326570674e-04 true resid norm
>>> 2.793663020386e+00 ||r(i)||/||b|| 7.680581413547e-03
>>> 98 KSP preconditioned resid norm 3.869979433609e-04 true resid norm
>>> 2.715150386650e+00 ||r(i)||/||b|| 7.464727650583e-03
>>> 99 KSP preconditioned resid norm 3.847873979265e-04 true resid norm
>>> 2.706008990336e+00 ||r(i)||/||b|| 7.439595328571e-03
>>>
>>> ....
>>> ....
>>> 294 KSP preconditioned resid norm 1.416482289961e-04 true resid norm >>> 2.735750748819e+00 ||r(i)||/||b|| 7.521363958412e-03 >>> 295 KSP preconditioned resid norm 1.415389087364e-04 true resid norm >>> 2.742638608355e+00 ||r(i)||/||b|| 7.540300661064e-03 >>> 296 KSP preconditioned resid norm 1.414967651105e-04 true resid norm >>> 2.747224243968e+00 ||r(i)||/||b|| 7.552907889424e-03 >>> 297 KSP preconditioned resid norm 1.413843018303e-04 true resid norm >>> 2.752574248710e+00 ||r(i)||/||b|| 7.567616587891e-03 >>> 298 KSP preconditioned resid norm 1.411747949695e-04 true resid norm >>> 2.765459647367e+00 ||r(i)||/||b|| 7.603042246859e-03 >>> 299 KSP preconditioned resid norm 1.411609742082e-04 true resid norm >>> 2.765900464868e+00 ||r(i)||/||b|| 7.604254180683e-03 >>> 300 KSP preconditioned resid norm 1.409844332838e-04 true resid norm >>> 2.771790506811e+00 ||r(i)||/||b|| 7.620447596402e-03 >>> Linear solve did not converge due to DIVERGED_ITS iterations 300 >>> Residual norms for fieldsplit_0_ solve. >>> 0 KSP preconditioned resid norm 1.294272083271e-03 true resid norm >>> 1.776945075651e+00 ||r(i)||/||b|| 1.000000000000e+00 >>> ... >>> ... >>> >>> >>> >>> >>>> Matt >>>> >>>> >>>>> >>>>> >>>>> >>>>>> This is the kind of investigation you msut be comfortable with if you >>>>>> want to experiment with these solvers. >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>>>> Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> has attached null space >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: 1 MPI processes >>>>>>>>> type: fieldsplit >>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>> factorization FULL >>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>> provided matrix >>>>>>>>> Split info: >>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>> Split number 1 Fields 3 >>>>>>>>> KSP solver for A00 block >>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=8232 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using 
I-node routines: found 2744 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=8232 >>>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit used is >>>>>>>>> 5 >>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> has attached null space >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2744, cols=2744 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: schurcomplement >>>>>>>>> rows=2744, cols=2744 >>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>> A11 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2744, cols=2744 >>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> A10 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2744, cols=8232 >>>>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> KSP of A00 >>>>>>>>> KSP Object: >>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: >>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero >>>>>>>>> pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI >>>>>>>>> processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=8232 >>>>>>>>> package used to perform factorization: >>>>>>>>> petsc >>>>>>>>> total: 
nonzeros=576000, allocated >>>>>>>>> nonzeros=576000 >>>>>>>>> total number of mallocs used during >>>>>>>>> MatSetValues calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, >>>>>>>>> limit used is 5 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=8232 >>>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> A01 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=2744 >>>>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2744, cols=2744 >>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=10976, cols=10976, bs=4 >>>>>>>>> total: nonzeros=1024000, allocated nonzeros=1024000 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> I am testing a small problem with CONSTANT viscosity for grid >>>>>>>>>>> size of 14^3 with the run time option: >>>>>>>>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>>>>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>>>>>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>>>>>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>>>>>>>> >>>>>>>>>>> Here is my relevant code of the solve function: >>>>>>>>>>> PetscErrorCode ierr; >>>>>>>>>>> PetscFunctionBeginUser; >>>>>>>>>>> ierr = >>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa >>>>>>>>>>> with dof = 4, vx,vy,vz and p. >>>>>>>>>>> ierr = >>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for the main >>>>>>>>>>> system >>>>>>>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>> //register the fieldsplits obtained from options. >>>>>>>>>>> >>>>>>>>>>> //Setting up user PC for Schur Complement >>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> KSP *subKsp; >>>>>>>>>>> PetscInt subKspPos = 0; >>>>>>>>>>> //Set up nearNullspace for A00 block. 
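>>>>>>>>>>> /* Note (added comment, not the poster's code): MatNullSpaceCreateRigidBody()
>>>>>>>>>>>    below builds the rigid-body modes in 3D (3 translations + 3 rotations)
>>>>>>>>>>>    from the coordinate Vec, and MatSetNearNullSpace() hands them to GAMG,
>>>>>>>>>>>    which uses them to construct its coarse spaces. */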
>>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>>>> MatNullSpace rigidBodyModes; >>>>>>>>>>> Vec coords; >>>>>>>>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>> Mat matA00; >>>>>>>>>>> ierr = >>>>>>>>>>> KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>>>>>>>> pressure space >>>>>>>>>>> subKspPos = 1; >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>>>> //Set up the null space of constant pressure. >>>>>>>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>>>>>>>> PetscBool isNull; >>>>>>>>>>> Mat matSc; >>>>>>>>>>> ierr = >>>>>>>>>>> KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>>>>>>>> if(!isNull) >>>>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>>>> pressure null space \n"); >>>>>>>>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>>>>>>>> if(!isNull) >>>>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>>>> system null space \n"); >>>>>>>>>>> >>>>>>>>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> PetscFunctionReturn(0); >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal < >>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal < >>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>>>>>>>>> jedbrown at mcs.anl.gov> 
wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> > Now, I implemented two different approaches, each >>>>>>>>>>>>>>>>>>>>>> for both 2D and 3D, in >>>>>>>>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have >>>>>>>>>>>>>>>>>>>>>> problems solving it for >>>>>>>>>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>>>>>>>>> > faces. Similar split up of K to cell center and >>>>>>>>>>>>>>>>>>>>>> faces to account for the >>>>>>>>>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem >>>>>>>>>>>>>>>>>>>>>> and I recommend >>>>>>>>>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement >>>>>>>>>>>>>>>>>>>>>> reduction (make that >>>>>>>>>>>>>>>>>>>>>> work first, then switch to block preconditioner). >>>>>>>>>>>>>>>>>>>>>> You can use PCLSC or >>>>>>>>>>>>>>>>>>>>>> (probably better for you), assemble a preconditioning >>>>>>>>>>>>>>>>>>>>>> matrix containing >>>>>>>>>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. >>>>>>>>>>>>>>>>>>>>>> This diagonal >>>>>>>>>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, >>>>>>>>>>>>>>>>>>>>>> depending on >>>>>>>>>>>>>>>>>>>>>> discretization) approximation of the Schur >>>>>>>>>>>>>>>>>>>>>> complement. The velocity >>>>>>>>>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read >>>>>>>>>>>>>>>>>>>>>> the PCFieldSplit >>>>>>>>>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know >>>>>>>>>>>>>>>>>>>>>> if you get stuck. >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I was trying to assemble the inverse viscosity >>>>>>>>>>>>>>>>>>>>> diagonal matrix to use as the preconditioner for the Schur complement solve >>>>>>>>>>>>>>>>>>>>> step as you suggested. I've few questions about the ways to implement this >>>>>>>>>>>>>>>>>>>>> in Petsc: >>>>>>>>>>>>>>>>>>>>> A naive approach that I can think of would be to >>>>>>>>>>>>>>>>>>>>> create a vector with its components as reciprocal viscosities of the cell >>>>>>>>>>>>>>>>>>>>> centers corresponding to the pressure variables, and then create a diagonal >>>>>>>>>>>>>>>>>>>>> matrix from this vector. However I'm not sure about: >>>>>>>>>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to >>>>>>>>>>>>>>>>>>>>> the Petsc distribution of the different rows of the main system matrix over >>>>>>>>>>>>>>>>>>>>> different processors ? The main matrix was created using the DMDA structure >>>>>>>>>>>>>>>>>>>>> with 4 dof as explained before. >>>>>>>>>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but >>>>>>>>>>>>>>>>>>>>> for the S_p matrix would correspond to only pressure space. Should the >>>>>>>>>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never >>>>>>>>>>>>>>>>>>>> breaks blocks. The P distribution is the same as the entire problem divided >>>>>>>>>>>>>>>>>>>> by 4. 
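
As an illustration, a minimal sketch of assembling such a diagonal S_p over a companion dof=1 DMDA (the names mDaPressure, Sp, invEta and etaCenter() are all hypothetical, and the call signatures assume the petsc-3.4 API used elsewhere in this thread; the DMDACreate3d arguments assume the same global sizes M, N, P and the default layout as the dof=4 DMDA):

    DM          mDaPressure;
    Mat         Sp;
    Vec         invEta;
    PetscScalar ***arr;
    PetscInt    i, j, k, xs, ys, zs, xm, ym, zm;

    /* dof=1 DMDA with the same global sizes and default layout as the dof=4 one */
    ierr = DMDACreate3d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                        DMDA_BOUNDARY_NONE, DMDA_STENCIL_STAR, M, N, P,
                        PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1,
                        NULL, NULL, NULL, &mDaPressure);CHKERRQ(ierr);
    ierr = DMCreateMatrix(mDaPressure, MATAIJ, &Sp);CHKERRQ(ierr);
    ierr = DMCreateGlobalVector(mDaPressure, &invEta);CHKERRQ(ierr);
    /* fill the locally owned part with reciprocal cell-center viscosities */
    ierr = DMDAVecGetArray(mDaPressure, invEta, &arr);CHKERRQ(ierr);
    ierr = DMDAGetCorners(mDaPressure, &xs, &ys, &zs, &xm, &ym, &zm);CHKERRQ(ierr);
    for (k = zs; k < zs+zm; ++k)
      for (j = ys; j < ys+ym; ++j)
        for (i = xs; i < xs+xm; ++i)
          arr[k][j][i] = 1.0/etaCenter(i, j, k); /* etaCenter(): hypothetical accessor for the cell-center viscosity */
    ierr = DMDAVecRestoreArray(mDaPressure, invEta, &arr);CHKERRQ(ierr);
    ierr = MatDiagonalSet(Sp, invEta, INSERT_VALUES);CHKERRQ(ierr);

Because both DMDAs use PETSC_DECIDE with identical global sizes, the rows of Sp should line up with the pressure rows of the split, which is what the rest of the exchange below relies on.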
>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid >>>>>>>>>>>>>>>>>>> size but with dof=1 instead of 4, the vertices for this new DMDA will be >>>>>>>>>>>>>>>>>>> identically distributed as for the original DMDA ? Or should I inform PETSc >>>>>>>>>>>>>>>>>>> by calling a particular function to make these two DMDA have identical >>>>>>>>>>>>>>>>>>> distribution of the vertices ? >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Yes. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>>>>>>>>> presence of "fictitious pressure vertices". The system matrix (A) contains >>>>>>>>>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. So the preconditioner >>>>>>>>>>>>>>>>>>> S_p for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> How can I set this split in the code itself without giving >>>>>>>>>>>>>>>>> it as a command line option when the system matrix is assembled from the >>>>>>>>>>>>>>>>> DMDA for the whole system with 4 dofs. (i.e. *without*using the DMComposite or >>>>>>>>>>>>>>>>> *without* using the nested block matrices to assemble >>>>>>>>>>>>>>>>> different blocks separately and then combine them together). >>>>>>>>>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in >>>>>>>>>>>>>>>>> my code, because not using detect_saddle_point means I cannot use >>>>>>>>>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>>>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>>>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> This is currently a real problem with the DMDA. In the >>>>>>>>>>>>>>>> unstructured case, where we always need specialized spaces, you can >>>>>>>>>>>>>>>> use something like >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> PetscObject pressure; >>>>>>>>>>>>>>>> MatNullSpace nullSpacePres; >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>>>>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", >>>>>>>>>>>>>>>> (PetscObject) nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> and then DMGetSubDM() uses this information to attach the >>>>>>>>>>>>>>>> null space to the IS that is created using the information in the >>>>>>>>>>>>>>>> PetscSection. 
>>>>>>>>>>>>>>>> If you use a PetscSection to set the data layout over the >>>>>>>>>>>>>>>> DMDA, I think this works correctly, but this has not been tested at all and >>>>>>>>>>>>>>>> is very >>>>>>>>>>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>>>>>>>>>> mechanism, but we are still working it out. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>>>>>>>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>>>>>>>>>>> model with a quickfix even if it means a little dirty code!) >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Bottom line: For custom null spaces using the default >>>>>>>>>>>>>>>> layout in DMDA, you need to take apart the PCFIELDSPLIT after it has been >>>>>>>>>>>>>>>> setup, >>>>>>>>>>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and >>>>>>>>>>>>>>>> then reach in and get the PC, and the subKSPs. I don't like this at all, >>>>>>>>>>>>>>>> but we >>>>>>>>>>>>>>>> have not reorganized that code (which could be very simple >>>>>>>>>>>>>>>> and inflexible since its very structured). >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> So I tried to get this approach working but I could not >>>>>>>>>>>>>>> succeed and encountered some errors. Here is a code snippet: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 >>>>>>>>>>>>>>> dofs (3 velocity components and 1 pressure comp.) >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've the >>>>>>>>>>>>>>> mNullSpaceSystem based on mDa, that contains a null space basis for the >>>>>>>>>>>>>>> complete system. >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC >>>>>>>>>>>>>>> that was obtained from the options (fieldsplit) >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>>>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>>>>>>>>> using the mDaPressure. 
>>>>>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Sorry, you need to return to the old DMDA behavior, so you >>>>>>>>>>>>>> want >>>>>>>>>>>>>> >>>>>>>>>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, with this it seems I can attach the null space >>>>>>>>>>>>> properly, but I have a question regarding whether the Schur complement ksp >>>>>>>>>>>>> solver is actually using the preconditioner matrix I provide. >>>>>>>>>>>>> When using -ksp_view, the outer level pc object of type >>>>>>>>>>>>> fieldsplit does report that: "Preconditioner for the Schur complement >>>>>>>>>>>>> formed from user provided matrix", but in the KSP solver for Schur >>>>>>>>>>>>> complement S, the pc object (fieldsplit_1_) is of type ilu and doesn't say >>>>>>>>>>>>> that it is using the matrix I provide. Am I missing something here ? >>>>>>>>>>>>> Below are the relevant commented code snippet and the output >>>>>>>>>>>>> of the -ksp_view >>>>>>>>>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type >>>>>>>>>>>>> schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> If ILU does not error, it means it is using your matrix, >>>>>>>>>>>> because the Schur complement matrix cannot be factored, and FS says it is >>>>>>>>>>>> using your matrix. >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Code snippet: >>>>>>>>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>>>>>>>> //The nullspace for the whole system >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>> //Set up mKsp with the options provided with fieldsplit and the fields >>>>>>>>>>>>> associated with the two splits. >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>>> //Get the fieldsplit pc set up from the options >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>>>>>>>>> >>>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>>> //Attach the null-space for the Schur complement ksp solver. 
>>>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> the output of the -ksp_view >>>>>>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> has attached null space >>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>> PC Object: 1 MPI processes >>>>>>>>>>>>> type: fieldsplit >>>>>>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>>>>>> factorization FULL >>>>>>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>>>>>> provided matrix >>>>>>>>>>>>> Split info: >>>>>>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>>>>>> Split number 1 Fields 3 >>>>>>>>>>>>> KSP solver for A00 block >>>>>>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>> type: ilu >>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>>> calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, limit used >>>>>>>>>>>>> is 5 >>>>>>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> has attached null space >>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>> PC Object: 
(fieldsplit_1_) 1 MPI processes >>>>>>>>>>>>> type: ilu >>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: schurcomplement >>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>>>>>> A11 >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> A10 >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=729, cols=2187 >>>>>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> KSP of A00 >>>>>>>>>>>>> KSP Object: >>>>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical >>>>>>>>>>>>> (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using PRECONDITIONED norm type for convergence >>>>>>>>>>>>> test >>>>>>>>>>>>> PC Object: >>>>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>> type: ilu >>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent >>>>>>>>>>>>> zero pivot >>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: 1 >>>>>>>>>>>>> MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>> package used to perform factorization: >>>>>>>>>>>>> petsc >>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 >>>>>>>>>>>>> nodes, limit used is 5 >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>> A01 >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij 
>>>>>>>>>>>>> rows=2187, cols=729 >>>>>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>>> calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2916, cols=2916, bs=4 >>>>>>>>>>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> or >>>>>>>>>>>>>> >>>>>>>>>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> The errors I get when running with options: -pc_type >>>>>>>>>>>>>>> fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug >>>>>>>>>>>>>>> named edwards by bkhanal Tue Aug 6 17:35:30 2013 >>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from >>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013 >>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc >>>>>>>>>>>>>>> --with-fc=g77 --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 >>>>>>>>>>>>>>> -with-clanguage=cxx --download-hypre=1 >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in >>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c >>>>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in >>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c >>>>>>>>>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in >>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in >>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>>>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in >>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c >>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in >>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c >>>>>>>>>>>>>>> [0]PETSC ERROR: solveModel() line 181 in >>>>>>>>>>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx >>>>>>>>>>>>>>> WARNING! There are options you set that were not used! >>>>>>>>>>>>>>> WARNING! could be spelling mistake, etc! >>>>>>>>>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>>>> What most experimenters take for granted before they >>>>>>>>>>>>>>>>>> begin their experiments is infinitely more interesting than any results to >>>>>>>>>>>>>>>>>> which their experiments lead. >>>>>>>>>>>>>>>>>> -- Norbert Wiener >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> -- >>>>>>>>>>>>>>>> What most experimenters take for granted before they begin >>>>>>>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>>>>>>> their experiments lead. 
>>>>>>>>>>>>>>>> -- Norbert Wiener

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From dave.mayhem23 at gmail.com  Fri Aug 23 08:47:25 2013
From: dave.mayhem23 at gmail.com (Dave May)
Date: Fri, 23 Aug 2013 15:47:25 +0200
Subject: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid
In-Reply-To: 
References: <87li5555oo.fsf@mcs.anl.gov>
Message-ID: 

Yes of course. Just shut all the other sub-solvers off.
As Matt says, check check check your GAMG configuration.

-ksp_max_it 1
-pc_fieldsplit_0_ksp_max_it 4
-pc_fieldsplit_1_ksp_max_it 2
-ksp_view

Or if you just want to see the GAMG configuration, do this

-ksp_max_it 1
-pc_fieldsplit_0_ksp_max_it 4
-pc_fieldsplit_1_ksp_max_it 2
-pc_fieldsplit_0_ksp_view

For your testing, I'd use FGMRES on A00 as this gives you the most
flexibility with the choice of smoother.

-pc_fieldsplit_0_ksp_type fgmres

and then do some seriously heavy smoothing on each level to see if you
can make something converge.
By "heavy" I mean something like gmres(20)+ILU(0) On 23 August 2013 15:42, Bishesh Khanal wrote: > > > > On Fri, Aug 23, 2013 at 3:34 PM, Matthew Knepley wrote: > >> On Fri, Aug 23, 2013 at 8:30 AM, Bishesh Khanal wrote: >> >>> >>> >>> >>> On Fri, Aug 23, 2013 at 3:16 PM, Matthew Knepley wrote: >>> >>>> On Fri, Aug 23, 2013 at 8:01 AM, Bishesh Khanal wrote: >>>> >>>>> >>>>> >>>>> >>>>> On Fri, Aug 23, 2013 at 2:53 PM, Matthew Knepley wrote: >>>>> >>>>>> On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote: >>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote: >>>>>>> >>>>>>>> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal < >>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> >>>>>>>>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal < >>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Thanks Matt and Mark for comments in using near null space >>>>>>>>>>> [question I asked in the thread with subject: *problem >>>>>>>>>>> (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with >>>>>>>>>>> multiple nodes in a cluster*]. >>>>>>>>>>> So I understood that I have to set a nearNullSpace to A00 block >>>>>>>>>>> where the null space correspond to the rigid body motion. I tried it but >>>>>>>>>>> still the gamg just keeps on iterating and convergence is very very slow. I >>>>>>>>>>> am not sure what the problem is, right now gamg does not even work for the >>>>>>>>>>> constant viscosity case. >>>>>>>>>>> I have set up the following in my code: >>>>>>>>>>> 1. null space for the whole system A 2. null space for the Schur >>>>>>>>>>> complement S 3. Near null space for A00 4. a user preconditioner matrix of >>>>>>>>>>> inverse viscosity in the diagonal for S. >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> If you want to debug solvers, you HAVE to send -ksp_view. >>>>>>>>>> >>>>>>>>> >>>>>>>>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on so >>>>>>>>> didn't get to the end to get -ksp_view results. >>>>>>>>> Instead here I have put the -ksp_view output when running the >>>>>>>>> program with following options: (In this case I get the results) >>>>>>>>> -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>>>>> >>>>>>>> >>>>>>>> Okay, that looks fine. Does >>>>>>>> >>>>>>>> -fieldsplit_0_pc_type lu >>>>>>>> - fieldsplit_1_ksp_rtol 1.0e-10 >>>>>>>> >>>>>>>> converge in one Iterate? >>>>>>>> >>>>>>>> What matrix did you attach as the preconditioner matrix for >>>>>>>> fieldsplit_1_? >>>>>>>> >>>>>>> >>>>>>> >>>>>>> I used a diagonal matrix with reciprocal of viscosity values of the >>>>>>> corresponding cell centers as the preconditioner. >>>>>>> >>>>>>> with the options -fieldsplit_0_pc_type lu - fieldsplit_1_ksp_rtol >>>>>>> 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason >>>>>>> I get the following output which means the outer ksp did converge in >>>>>>> one iterate I guess. >>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18 >>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18 >>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 1 >>>>>>> >>>>>> >>>>>> Okay, so A_00 is nonsingular, and the system seems to solve alright. 
>>>>>> What do you get for >>>>>> >>>>>> -fieldsplit_0_ksp_max_it 30 >>>>>> -fieldsplit_0_pc_type gamg >>>>>> -fieldsplit_0_ksp_converged_reason >>>>>> -fieldsplit_1_ksp_converged_reason >>>>>> >>>>>> >>>>> >>>>> It fieldsplit_0_ does not converge in 30 iterations. It gives: >>>>> Linear solve converged due to CONVERGED_ATOL iterations 0 >>>>> Linear solve did not converge due to DIVERGED_ITS iterations 30 >>>>> >>>>> and continues with the same message. >>>>> >>>> >>>> So what would you do? Give up? >>>> >>>> No, I don't want to give up :) >>> >>> >>>> -fieldsplit_0_ksp_gmres_restart 200 >>>> >>>> The idea is to figure out what is going on: >>>> >>>> -fieldsplit_0_ksp_monitor_true_residual >>>> >>>> I have tried these options before too, the residual is decreasing very >>> very slowly, but I've not been able to figure out why. (using hypre does >>> converge although slowly again, but I had problems using hypre with >>> multiple nodes in a cluster with segmentation fault (we discussed that in >>> another thread!) ) >>> >> >> Put in the Laplacian instead of the operator you have now. It should >> converge in a few iterates. If not, you have a problem >> in the specification. >> >> If so, put in linear elasticity. If it is slow, you have made a mistake >> specifiying the near null space. Also, you need to check >> that the near null space made it to GAMG using the ksp_view output. >> > > Which operator are you referring to ? The one in A00 block ? I'm testing > currently with the constant viscosity case which means the A00 block has > \mu div(grad(v)) which is a Laplacian. > And Is it possible to view the ksp_view output before the solver actually > converges to check if GAMG took the near null space ? > > >> >> Matt >> >> >>> e.g a snapshot of the output: >>> >>> Residual norms for fieldsplit_0_ solve. >>> 0 KSP preconditioned resid norm 0.000000000000e+00 true resid norm >>> 0.000000000000e+00 ||r(i)||/||b|| -nan >>> Linear solve converged due to CONVERGED_ATOL iterations 0 >>> Residual norms for fieldsplit_0_ solve. >>> 0 KSP preconditioned resid norm 2.619231455875e-01 true resid norm >>> 3.637306695895e+02 ||r(i)||/||b|| 1.000000000000e+00 >>> 1 KSP preconditioned resid norm 9.351491725479e-02 true resid norm >>> 6.013334574957e+01 ||r(i)||/||b|| 1.653238255038e-01 >>> 2 KSP preconditioned resid norm 6.010357491087e-02 true resid norm >>> 3.664473273769e+01 ||r(i)||/||b|| 1.007468871928e-01 >>> 3 KSP preconditioned resid norm 6.006968012944e-02 true resid norm >>> 3.696451770148e+01 ||r(i)||/||b|| 1.016260678353e-01 >>> 4 KSP preconditioned resid norm 4.418407037098e-02 true resid norm >>> 3.184810838034e+01 ||r(i)||/||b|| 8.755959022176e-02 >>> ... >>> ... 
>>> 93 KSP preconditioned resid norm 4.549506047737e-04 true resid norm >>> 2.877594552685e+00 ||r(i)||/||b|| 7.911333283864e-03 >>> 94 KSP preconditioned resid norm 4.515424416235e-04 true resid norm >>> 2.875249044668e+00 ||r(i)||/||b|| 7.904884809172e-03 >>> 95 KSP preconditioned resid norm 4.277647876573e-04 true resid norm >>> 2.830418831358e+00 ||r(i)||/||b|| 7.781633686685e-03 >>> 96 KSP preconditioned resid norm 4.244529173876e-04 true resid norm >>> 2.807041401408e+00 ||r(i)||/||b|| 7.717362422521e-03 >>> 97 KSP preconditioned resid norm 4.138326570674e-04 true resid norm >>> 2.793663020386e+00 ||r(i)||/||b|| 7.680581413547e-03 >>> 98 KSP preconditioned resid norm 3.869979433609e-04 true resid norm >>> 2.715150386650e+00 ||r(i)||/||b|| 7.464727650583e-03 >>> 99 KSP preconditioned resid norm 3.847873979265e-04 true resid norm >>> 2.706008990336e+00 ||r(i)||/||b|| 7.439595328571e-03 >>> >>> .... >>> .... >>> 294 KSP preconditioned resid norm 1.416482289961e-04 true resid norm >>> 2.735750748819e+00 ||r(i)||/||b|| 7.521363958412e-03 >>> 295 KSP preconditioned resid norm 1.415389087364e-04 true resid norm >>> 2.742638608355e+00 ||r(i)||/||b|| 7.540300661064e-03 >>> 296 KSP preconditioned resid norm 1.414967651105e-04 true resid norm >>> 2.747224243968e+00 ||r(i)||/||b|| 7.552907889424e-03 >>> 297 KSP preconditioned resid norm 1.413843018303e-04 true resid norm >>> 2.752574248710e+00 ||r(i)||/||b|| 7.567616587891e-03 >>> 298 KSP preconditioned resid norm 1.411747949695e-04 true resid norm >>> 2.765459647367e+00 ||r(i)||/||b|| 7.603042246859e-03 >>> 299 KSP preconditioned resid norm 1.411609742082e-04 true resid norm >>> 2.765900464868e+00 ||r(i)||/||b|| 7.604254180683e-03 >>> 300 KSP preconditioned resid norm 1.409844332838e-04 true resid norm >>> 2.771790506811e+00 ||r(i)||/||b|| 7.620447596402e-03 >>> Linear solve did not converge due to DIVERGED_ITS iterations 300 >>> Residual norms for fieldsplit_0_ solve. >>> 0 KSP preconditioned resid norm 1.294272083271e-03 true resid norm >>> 1.776945075651e+00 ||r(i)||/||b|| 1.000000000000e+00 >>> ... >>> ... >>> >>> >>> >>> >>>> Matt >>>> >>>> >>>>> >>>>> >>>>> >>>>>> This is the kind of investigation you msut be comfortable with if you >>>>>> want to experiment with these solvers. 
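
On the question above of seeing the -ksp_view output before the solve converges: the setup can be inspected without waiting for a full solve at all. A small sketch, assuming mKsp is the outer KSP from the code earlier in the thread:

    ierr = KSPSetUp(mKsp);CHKERRQ(ierr);  /* builds the PC, including the GAMG hierarchy */
    ierr = KSPView(mKsp,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);  /* prints the same information as -ksp_view */

Alternatively, capping the iterations on the command line (e.g. -ksp_max_it 1 -ksp_view, as suggested above) prints the view after a single iterate; the fieldsplit_0_ part of that output is where one can check whether GAMG actually picked up the near null space.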
>>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>>>> Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> has attached null space >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: 1 MPI processes >>>>>>>>> type: fieldsplit >>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>> factorization FULL >>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>> provided matrix >>>>>>>>> Split info: >>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>> Split number 1 Fields 3 >>>>>>>>> KSP solver for A00 block >>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=8232 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=8232 >>>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit used is >>>>>>>>> 5 >>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> has attached null space >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: 
>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2744, cols=2744 >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: schurcomplement >>>>>>>>> rows=2744, cols=2744 >>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>> A11 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2744, cols=2744 >>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> A10 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2744, cols=8232 >>>>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> KSP of A00 >>>>>>>>> KSP Object: >>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: gmres >>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>> divergence=10000 >>>>>>>>> left preconditioning >>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>> PC Object: >>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>> type: ilu >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> 0 levels of fill >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> using diagonal shift on blocks to prevent zero >>>>>>>>> pivot >>>>>>>>> matrix ordering: natural >>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>> Factored matrix follows: >>>>>>>>> Matrix Object: 1 MPI >>>>>>>>> processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=8232 >>>>>>>>> package used to perform factorization: >>>>>>>>> petsc >>>>>>>>> total: nonzeros=576000, allocated >>>>>>>>> nonzeros=576000 >>>>>>>>> total number of mallocs used during >>>>>>>>> MatSetValues calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, >>>>>>>>> limit used is 5 >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=8232 >>>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> A01 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=8232, cols=2744 >>>>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>> calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>>> used is 5 >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=2744, cols=2744 >>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> not using I-node routines >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>> type: seqaij >>>>>>>>> rows=10976, cols=10976, bs=4 >>>>>>>>> total: nonzeros=1024000, allocated 
nonzeros=1024000 >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> I am testing a small problem with CONSTANT viscosity for grid >>>>>>>>>>> size of 14^3 with the run time option: >>>>>>>>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>>>>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>>>>>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>>>>>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>>>>>>>> >>>>>>>>>>> Here is my relevant code of the solve function: >>>>>>>>>>> PetscErrorCode ierr; >>>>>>>>>>> PetscFunctionBeginUser; >>>>>>>>>>> ierr = >>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa >>>>>>>>>>> with dof = 4, vx,vy,vz and p. >>>>>>>>>>> ierr = >>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for the main >>>>>>>>>>> system >>>>>>>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>> //register the fieldsplits obtained from options. >>>>>>>>>>> >>>>>>>>>>> //Setting up user PC for Schur Complement >>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> KSP *subKsp; >>>>>>>>>>> PetscInt subKspPos = 0; >>>>>>>>>>> //Set up nearNullspace for A00 block. >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>>>> MatNullSpace rigidBodyModes; >>>>>>>>>>> Vec coords; >>>>>>>>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>> Mat matA00; >>>>>>>>>>> ierr = >>>>>>>>>>> KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> ierr = >>>>>>>>>>> MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>>>>>>>> pressure space >>>>>>>>>>> subKspPos = 1; >>>>>>>>>>> ierr = >>>>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>>>> //Set up the null space of constant pressure. 
>>>>>>>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>>>>>>>> PetscBool isNull; >>>>>>>>>>> Mat matSc; >>>>>>>>>>> ierr = >>>>>>>>>>> KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>>>>>>>> if(!isNull) >>>>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>>>> pressure null space \n"); >>>>>>>>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> ierr = MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>>>>>>>> if(!isNull) >>>>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>>>> system null space \n"); >>>>>>>>>>> >>>>>>>>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>>>>>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> PetscFunctionReturn(0); >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> >>>>>>>>>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal < >>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley < >>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal < >>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley < >>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> > Now, I implemented two different approaches, each >>>>>>>>>>>>>>>>>>>>>> for both 2D and 3D, in >>>>>>>>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have >>>>>>>>>>>>>>>>>>>>>> problems solving it for >>>>>>>>>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>>>>>>>>> > faces. 
Similar split up of K to cell center and >>>>>>>>>>>>>>>>>>>>>> faces to account for the >>>>>>>>>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite difference >>>>>>>>>>>>>>>>>>>>>> discretization of >>>>>>>>>>>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem >>>>>>>>>>>>>>>>>>>>>> and I recommend >>>>>>>>>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement >>>>>>>>>>>>>>>>>>>>>> reduction (make that >>>>>>>>>>>>>>>>>>>>>> work first, then switch to block preconditioner). >>>>>>>>>>>>>>>>>>>>>> You can use PCLSC or >>>>>>>>>>>>>>>>>>>>>> (probably better for you), assemble a preconditioning >>>>>>>>>>>>>>>>>>>>>> matrix containing >>>>>>>>>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure block. >>>>>>>>>>>>>>>>>>>>>> This diagonal >>>>>>>>>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, >>>>>>>>>>>>>>>>>>>>>> depending on >>>>>>>>>>>>>>>>>>>>>> discretization) approximation of the Schur >>>>>>>>>>>>>>>>>>>>>> complement. The velocity >>>>>>>>>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read >>>>>>>>>>>>>>>>>>>>>> the PCFieldSplit >>>>>>>>>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know >>>>>>>>>>>>>>>>>>>>>> if you get stuck. >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I was trying to assemble the inverse viscosity >>>>>>>>>>>>>>>>>>>>> diagonal matrix to use as the preconditioner for the Schur complement solve >>>>>>>>>>>>>>>>>>>>> step as you suggested. I've few questions about the ways to implement this >>>>>>>>>>>>>>>>>>>>> in Petsc: >>>>>>>>>>>>>>>>>>>>> A naive approach that I can think of would be to >>>>>>>>>>>>>>>>>>>>> create a vector with its components as reciprocal viscosities of the cell >>>>>>>>>>>>>>>>>>>>> centers corresponding to the pressure variables, and then create a diagonal >>>>>>>>>>>>>>>>>>>>> matrix from this vector. However I'm not sure about: >>>>>>>>>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to >>>>>>>>>>>>>>>>>>>>> the Petsc distribution of the different rows of the main system matrix over >>>>>>>>>>>>>>>>>>>>> different processors ? The main matrix was created using the DMDA structure >>>>>>>>>>>>>>>>>>>>> with 4 dof as explained before. >>>>>>>>>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs but >>>>>>>>>>>>>>>>>>>>> for the S_p matrix would correspond to only pressure space. Should the >>>>>>>>>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never >>>>>>>>>>>>>>>>>>>> breaks blocks. The P distribution is the same as the entire problem divided >>>>>>>>>>>>>>>>>>>> by 4. >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid >>>>>>>>>>>>>>>>>>> size but with dof=1 instead of 4, the vertices for this new DMDA will be >>>>>>>>>>>>>>>>>>> identically distributed as for the original DMDA ? Or should I inform PETSc >>>>>>>>>>>>>>>>>>> by calling a particular function to make these two DMDA have identical >>>>>>>>>>>>>>>>>>> distribution of the vertices ? >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Yes. 
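
(To make the matching layout explicit rather than relying on the default decomposition, one could reuse the ownership ranges of the dof=4 DMDA when creating the dof=1 one -- a sketch, variable names hypothetical:

    const PetscInt *lx, *ly, *lz;
    PetscInt        M, N, P, m, n, p;
    ierr = DMDAGetInfo(mDa, NULL, &M, &N, &P, &m, &n, &p,
                       NULL, NULL, NULL, NULL, NULL, NULL);CHKERRQ(ierr);
    ierr = DMDAGetOwnershipRanges(mDa, &lx, &ly, &lz);CHKERRQ(ierr);
    ierr = DMDACreate3d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                        DMDA_BOUNDARY_NONE, DMDA_STENCIL_STAR, M, N, P, m, n, p,
                        1, 1, lx, ly, lz, &mDaPressure);CHKERRQ(ierr);

so the per-process vertex counts are copied verbatim from mDa.)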
>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>>>>>>>>> presence of "fictitious pressure vertices". The system matrix (A) contains >>>>>>>>>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. So the preconditioner >>>>>>>>>>>>>>>>>>> S_p for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> How can I set this split in the code itself without giving >>>>>>>>>>>>>>>>> it as a command line option when the system matrix is assembled from the >>>>>>>>>>>>>>>>> DMDA for the whole system with 4 dofs. (i.e. *without*using the DMComposite or >>>>>>>>>>>>>>>>> *without* using the nested block matrices to assemble >>>>>>>>>>>>>>>>> different blocks separately and then combine them together). >>>>>>>>>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in >>>>>>>>>>>>>>>>> my code, because not using detect_saddle_point means I cannot use >>>>>>>>>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>>>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>>>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> This is currently a real problem with the DMDA. In the >>>>>>>>>>>>>>>> unstructured case, where we always need specialized spaces, you can >>>>>>>>>>>>>>>> use something like >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> PetscObject pressure; >>>>>>>>>>>>>>>> MatNullSpace nullSpacePres; >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>>>>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", >>>>>>>>>>>>>>>> (PetscObject) nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> and then DMGetSubDM() uses this information to attach the >>>>>>>>>>>>>>>> null space to the IS that is created using the information in the >>>>>>>>>>>>>>>> PetscSection. >>>>>>>>>>>>>>>> If you use a PetscSection to set the data layout over the >>>>>>>>>>>>>>>> DMDA, I think this works correctly, but this has not been tested at all and >>>>>>>>>>>>>>>> is very >>>>>>>>>>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>>>>>>>>>> mechanism, but we are still working it out. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Currently I do not use PetscSection. If this makes a cleaner >>>>>>>>>>>>>>> approach, I'd try it too but may a bit later (right now I'd like test my >>>>>>>>>>>>>>> model with a quickfix even if it means a little dirty code!) 
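
For reference, the constant-pressure null space referred to throughout this thread (mNullSpaceP above) is the simplest kind to build -- a one-line sketch, assuming the communicator of the pressure split:

    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &mNullSpaceP);CHKERRQ(ierr);

i.e. just the constant vector, which is what -fieldsplit_1_ksp_constant_null_space would otherwise provide in the cases where that option is usable.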
>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Bottom line: For custom null spaces using the default >>>>>>>>>>>>>>>> layout in DMDA, you need to take apart the PCFIELDSPLIT after it has been >>>>>>>>>>>>>>>> setup, >>>>>>>>>>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and >>>>>>>>>>>>>>>> then reach in and get the PC, and the subKSPs. I don't like this at all, >>>>>>>>>>>>>>>> but we >>>>>>>>>>>>>>>> have not reorganized that code (which could be very simple >>>>>>>>>>>>>>>> and inflexible since its very structured). >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> So I tried to get this approach working but I could not >>>>>>>>>>>>>>> succeed and encountered some errors. Here is a code snippet: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 >>>>>>>>>>>>>>> dofs (3 velocity components and 1 pressure comp.) >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've the >>>>>>>>>>>>>>> mNullSpaceSystem based on mDa, that contains a null space basis for the >>>>>>>>>>>>>>> complete system. >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC >>>>>>>>>>>>>>> that was obtained from the options (fieldsplit) >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>>>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>>>>>>>>> using the mDaPressure. >>>>>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Sorry, you need to return to the old DMDA behavior, so you >>>>>>>>>>>>>> want >>>>>>>>>>>>>> >>>>>>>>>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, with this it seems I can attach the null space >>>>>>>>>>>>> properly, but I have a question regarding whether the Schur complement ksp >>>>>>>>>>>>> solver is actually using the preconditioner matrix I provide. 
>>>>>>>>>>>>> When using -ksp_view, the outer level pc object of type >>>>>>>>>>>>> fieldsplit does report that: "Preconditioner for the Schur complement >>>>>>>>>>>>> formed from user provided matrix", but in the KSP solver for Schur >>>>>>>>>>>>> complement S, the pc object (fieldsplit_1_) is of type ilu and doesn't say >>>>>>>>>>>>> that it is using the matrix I provide. Am I missing something here ? >>>>>>>>>>>>> Below are the relevant commented code snippet and the output >>>>>>>>>>>>> of the -ksp_view >>>>>>>>>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type >>>>>>>>>>>>> schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> If ILU does not error, it means it is using your matrix, >>>>>>>>>>>> because the Schur complement matrix cannot be factored, and FS says it is >>>>>>>>>>>> using your matrix. >>>>>>>>>>>> >>>>>>>>>>>> Matt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> Code snippet: >>>>>>>>>>>>> ierr = KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); >>>>>>>>>>>>> //The nullspace for the whole system >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>> //Set up mKsp with the options provided with fieldsplit and the fields >>>>>>>>>>>>> associated with the two splits. >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>>> //Get the fieldsplit pc set up from the options >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>>>>>>>>> >>>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = >>>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>>> //Attach the null-space for the Schur complement ksp solver. 
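>>>>>>>>>>>>> //(Note: PCFieldSplitGetSubKSP() allocates the kspSchur array,
>>>>>>>>>>>>> //hence the PetscFree() below; with the two splits set up above,
>>>>>>>>>>>>> //index 1 is the KSP of the Schur complement solve.)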
>>>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> the output of the -ksp_view >>>>>>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> has attached null space >>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>> PC Object: 1 MPI processes >>>>>>>>>>>>> type: fieldsplit >>>>>>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>>>>>> factorization FULL >>>>>>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>>>>>> provided matrix >>>>>>>>>>>>> Split info: >>>>>>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>>>>>> Split number 1 Fields 3 >>>>>>>>>>>>> KSP solver for A00 block >>>>>>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>> type: ilu >>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>>> calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, limit used >>>>>>>>>>>>> is 5 >>>>>>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> has attached null space >>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>> PC Object: 
(fieldsplit_1_) 1 MPI processes >>>>>>>>>>>>> type: ilu >>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: schurcomplement >>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>>>>>> A11 >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> A10 >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=729, cols=2187 >>>>>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> KSP of A00 >>>>>>>>>>>>> KSP Object: >>>>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>> type: gmres >>>>>>>>>>>>> GMRES: restart=30, using Classical >>>>>>>>>>>>> (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>> using PRECONDITIONED norm type for convergence >>>>>>>>>>>>> test >>>>>>>>>>>>> PC Object: >>>>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>> type: ilu >>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>> using diagonal shift on blocks to prevent >>>>>>>>>>>>> zero pivot >>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>> Matrix Object: 1 >>>>>>>>>>>>> MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>> package used to perform factorization: >>>>>>>>>>>>> petsc >>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 >>>>>>>>>>>>> nodes, limit used is 5 >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>> A01 >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij 
>>>>>>>>>>>>> rows=2187, cols=729 >>>>>>>>>>>>> total: nonzeros=46875, allocated nonzeros=46875 >>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>>> calls =0 >>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>> rows=2916, cols=2916, bs=4 >>>>>>>>>>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> or >>>>>>>>>>>>>> >>>>>>>>>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matt >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> The errors I get when running with options: -pc_type >>>>>>>>>>>>>>> fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug
>>>>>>>>>>>>>>> named edwards by bkhanal Tue Aug 6 17:35:30 2013
>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from
>>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib
>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013
>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc
>>>>>>>>>>>>>>> --with-fc=g77 --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1
>>>>>>>>>>>>>>> -with-clanguage=cxx --download-hypre=1
>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in
>>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c
>>>>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in
>>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c
>>>>>>>>>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in
>>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in
>>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in
>>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c
>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in
>>>>>>>>>>>>>>> /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c
>>>>>>>>>>>>>>> [0]PETSC ERROR: solveModel() line 181 in
>>>>>>>>>>>>>>> "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx
>>>>>>>>>>>>>>> WARNING! There are options you set that were not used!
>>>>>>>>>>>>>>> WARNING! could be spelling mistake, etc!
>>>>>>>>>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Matt
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>> What most experimenters take for granted before they begin their
>>>>>>>>>>>>>>>> experiments is infinitely more interesting than any results to which their
>>>>>>>>>>>>>>>> experiments lead.
>>>>>>>>>>>>>>>> -- Norbert Wiener
From olivier.bonnefon at avignon.inra.fr Fri Aug 23 09:35:13 2013
From: olivier.bonnefon at avignon.inra.fr (Olivier Bonnefon)
Date: Fri, 23 Aug 2013 16:35:13 +0200
Subject: [petsc-users] distribute and cells mapping.
Message-ID: <52177321.4080900@avignon.inra.fr>

Hello,

Thanks for your answers, I'm now able to import and distribute a mesh:

if (!rank){
    ierr = DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
    for (i=0;i<...;i++){
       ierr = DMPlexSetLabelValue(*dm, "marker", obBoundary[i]+obNbCells, 1);CHKERRQ(ierr);
    }
}else {
    ierr = DMPlexCreateFromCellList(comm,dim,0,0,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
}

ierr = DMPlexDistribute(*dm, partitioner, 0, &distributedMesh);CHKERRQ(ierr);
if (distributedMesh) {
    ierr = DMDestroy(dm);CHKERRQ(ierr);
    *dm = distributedMesh;
}

Is it possible to know the resulting partition? I.e., what is the mapping
between the initial cell number and the local cell (used in
DMPlexComputeResidualFEM)?
I need this to write an efficient implementation of the FEM struct
functions f0 and g0, which depend on the position in space.

Regards,

Olivier B

From knepley at gmail.com Fri Aug 23 2013
From: knepley at gmail.com (Matthew Knepley)
Subject: Re: [petsc-users] distribute and cells mapping.
In-Reply-To: <52177321.4080900@avignon.inra.fr>
References: <52177321.4080900@avignon.inra.fr>
Message-ID:

On Fri, Aug 23, 2013 at 9:35 AM, Olivier Bonnefon <
olivier.bonnefon at avignon.inra.fr> wrote:

> Hello,
>
> Thanks for your answers, I'm now able to import and distribute a mesh:
>

You might simplify this to

  if (rank) {obNbCells = 0; obNbVertex = 0;}
  ierr = DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);

> if (!rank){
>     ierr = DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
>     for (i=0;i<...;i++){
>        ierr = DMPlexSetLabelValue(*dm, "marker", obBoundary[i]+obNbCells, 1);CHKERRQ(ierr);
>     }
> }else {
>     ierr = DMPlexCreateFromCellList(comm,dim,0,0,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
> }
>
> ierr = DMPlexDistribute(*dm, partitioner, 0, &distributedMesh);CHKERRQ(ierr);
> if (distributedMesh) {
>     ierr = DMDestroy(dm);CHKERRQ(ierr);
>     *dm = distributedMesh;
> }
>
> Is it possible to know the resulting partition? I.e., what is the mapping
> between the initial cell number and the local cell (used in
> DMPlexComputeResidualFEM)?
> I need this to write an efficient implementation of the FEM struct
> functions f0 and g0, which depend on the position in space.
>

Yes, but I really do not think you want to do things that way. I am
assuming you want different material models or something in different
places. The way I envision that is using a DMLabel to mark up parts of the
domain. All labels are automatically distributed with the mesh. Is that
what you want?
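For example, reading a label back after distribution could look something
like this (an untested sketch; "material" is a placeholder for a label you
would create and set before DMPlexDistribute(), the same way "marker" is
set above):

  PetscInt c, cStart, cEnd, matId;

  ierr = DMPlexGetHeightStratum(*dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* height 0 = cells */
  for (c = cStart; c < cEnd; ++c) {
    ierr = DMPlexGetLabelValue(*dm, "material", c, &matId);CHKERRQ(ierr);
    /* pick the coefficients for f0/g0 on this cell based on matId */
  }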
  Thanks,

     Matt

> Regards,
>
> Olivier B
>
> --
> Olivier Bonnefon
> INRA PACA-Avignon, Unité BioSP
> Tel: +33 (0)4 32 72 21 58
>

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

From bisheshkh at gmail.com Fri Aug 23 11:44:20 2013
From: bisheshkh at gmail.com (Bishesh Khanal)
Date: Fri, 23 Aug 2013 18:44:20 +0200
Subject: Re: [petsc-users] discontinuous viscosity stokes equation 3D staggered grid
In-Reply-To:
References: <87li5555oo.fsf@mcs.anl.gov>
Message-ID:

On Fri, Aug 23, 2013 at 3:45 PM, Matthew Knepley wrote:

> On Fri, Aug 23, 2013 at 8:42 AM, Bishesh Khanal wrote:
>
>> On Fri, Aug 23, 2013 at 3:34 PM, Matthew Knepley wrote:
>>
>>> On Fri, Aug 23, 2013 at 8:30 AM, Bishesh Khanal wrote:
>>>
>>>> On Fri, Aug 23, 2013 at 3:16 PM, Matthew Knepley wrote:
>>>>
>>>>> On Fri, Aug 23, 2013 at 8:01 AM, Bishesh Khanal wrote:
>>>>>
>>>>>> On Fri, Aug 23, 2013 at 2:53 PM, Matthew Knepley wrote:
>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 7:46 AM, Bishesh Khanal wrote:
>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:33 PM, Matthew Knepley wrote:
>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 7:25 AM, Bishesh Khanal <
>>>>>>>>> bisheshkh at gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:09 PM, Matthew Knepley <
>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 4:31 AM, Bishesh Khanal <
>>>>>>>>>>> bisheshkh at gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Thanks Matt and Mark for the comments on using the near null
>>>>>>>>>>>> space [the question I asked in the thread with subject: *problem
>>>>>>>>>>>> (Segmentation voilation) using -pc_type hypre -pc_hypre_type -pilut with
>>>>>>>>>>>> multiple nodes in a cluster*].
>>>>>>>>>>>> So I understood that I have to set a near null space on the A00 block,
>>>>>>>>>>>> where the null space corresponds to the rigid body motion. I tried it, but
>>>>>>>>>>>> gamg still just keeps on iterating and convergence is very very slow. I
>>>>>>>>>>>> am not sure what the problem is; right now gamg does not even work for the
>>>>>>>>>>>> constant viscosity case.
>>>>>>>>>>>> I have set up the following in my code:
>>>>>>>>>>>> 1. null space for the whole system A
>>>>>>>>>>>> 2. null space for the Schur complement S
>>>>>>>>>>>> 3. near null space for A00
>>>>>>>>>>>> 4. a user preconditioner matrix with the inverse viscosity on the diagonal for S.
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> If you want to debug solvers, you HAVE to send -ksp_view.
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> When I use gamg, the -fieldsplit_0_ksp was iterating on and on, so I
>>>>>>>>>> didn't get to the end to get the -ksp_view results.
>>>>>>>>>> Instead here I have put the -ksp_view output when running the
>>>>>>>>>> program with the following options (in this case I get the results):
>>>>>>>>>> -pc_type fieldsplit -pc_fieldsplit_type schur
>>>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2
>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Okay, that looks fine. Does
>>>>>>>>>
>>>>>>>>>   -fieldsplit_0_pc_type lu
>>>>>>>>>   -fieldsplit_1_ksp_rtol 1.0e-10
>>>>>>>>>
>>>>>>>>> converge in one iterate?
>>>>>>>>>
>>>>>>>>> What matrix did you attach as the preconditioner matrix for
>>>>>>>>> fieldsplit_1_?
>>>>>>>>
>>>>>>>>
>>>>>>>> I used a diagonal matrix with the reciprocal of the viscosity values at the
>>>>>>>> corresponding cell centers as the preconditioner.
>>>>>>>>
>>>>>>>> with the options -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_rtol
>>>>>>>> 1.0e-10 -fieldsplit_1_ksp_converged_reason -ksp_converged_reason
>>>>>>>> I get the following output, which means the outer ksp did converge
>>>>>>>> in one iterate I guess.
>>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18
>>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 18
>>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 1
>>>>>>>
>>>>>>>
>>>>>>> Okay, so A_00 is nonsingular, and the system seems to solve alright.
>>>>>>> What do you get for
>>>>>>>
>>>>>>> -fieldsplit_0_ksp_max_it 30
>>>>>>> -fieldsplit_0_pc_type gamg
>>>>>>> -fieldsplit_0_ksp_converged_reason
>>>>>>> -fieldsplit_1_ksp_converged_reason
>>>>>>>
>>>>>>
>>>>>> The fieldsplit_0_ solve does not converge in 30 iterations. It gives:
>>>>>> Linear solve converged due to CONVERGED_ATOL iterations 0
>>>>>> Linear solve did not converge due to DIVERGED_ITS iterations 30
>>>>>>
>>>>>> and continues with the same message.
>>>>>>
>>>>>
>>>>> So what would you do? Give up?
>>>>>
>>>> No, I don't want to give up :)
>>>>
>>>>> -fieldsplit_0_ksp_gmres_restart 200
>>>>>
>>>>> The idea is to figure out what is going on:
>>>>>
>>>>> -fieldsplit_0_ksp_monitor_true_residual
>>>>>
>>>> I have tried these options before too; the residual is decreasing very
>>>> very slowly, but I've not been able to figure out why. (Using hypre does
>>>> converge, although slowly again, but I had problems using hypre with
>>>> multiple nodes in a cluster with a segmentation fault; we discussed that in
>>>> another thread!)
>>>
>>> Put in the Laplacian instead of the operator you have now. It should
>>> converge in a few iterates. If not, you have a problem
>>> in the specification.
>>>
>>> If so, put in linear elasticity. If it is slow, you have made a mistake
>>> specifying the near null space. Also, you need to check
>>> that the near null space made it to GAMG using the ksp_view output.
>>
>> Which operator are you referring to? The one in the A00 block? I'm testing
>> currently with the constant viscosity case, which means the A00 block has
>> \mu div(grad(v)), which is a Laplacian.
>> And is it possible to view the ksp_view output before the solver actually
>> converges, to check if GAMG took the near null space?
>
> 1) Make mu 1.0
>
> 2) The nullspace does not matter at all for the Laplacian, so turn it off.
> If it does not take < 5 iterations, that is not the Laplacian.
>
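> One way to check the near null space programmatically (an untested sketch;
> pc, n, and subksp are just local placeholder variables) is to pull the A00
> operator back out after KSPSetUp() and ask for what is attached:
>
>   KSP          *subksp;
>   PetscInt     n;
>   Mat          A00;
>   MatNullSpace nearnull;
>
>   ierr = PCFieldSplitGetSubKSP(pc,&n,&subksp);CHKERRQ(ierr);
>   ierr = KSPGetOperators(subksp[0],&A00,NULL,NULL);CHKERRQ(ierr);
>   ierr = MatGetNearNullSpace(A00,&nearnull);CHKERRQ(ierr);
>   if (!nearnull) { /* the rigid body modes never made it to the A00 block */ }
>   ierr = PetscFree(subksp);CHKERRQ(ierr);
>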
When making mu 1.0, the number of iterations for fieldsplit_0_ depended on
the scaling of the constant (K) I use for the zero Dirichlet boundary
condition on velocity.
Since I'm using a staggered grid, the rows corresponding to the Dirichlet
boundary on velocity have either a single entry K on the diagonal, or two
entries 3K and -K, depending on whether the v-component lies exactly on the
boundary face or interior to it. E.g. for the x=0 face, the boundary
conditions are:
K*vx(0,j,k) = 0;
3K*vy(0,j,k) - K*vy(1,j,k) = 0;
After playing around a bit, I could find a suitable K that gets the
fieldsplit_0_ solve to converge in about 10 iterations, but other values of
K dramatically increase the number of iterations.

And I also checked the -fieldsplit_0_ksp_view as Dave and you suggested,
and I did not see anywhere the information on whether it got the near null
space or not. I searched for "null" in the attached file of the
-fieldsplit_0_ksp_view output for that. I don't know why it does not take
the near null space!

> There are plenty of FD Laplacians in PETSc, like SNES ex5, that you can
> run GAMG on to test. You should consider getting an exact solution and
> testing with that as well, since it appears your operator is
> not what you think it is.
>
>    Matt
>
>>
>>> Matt
>>>
>>>> e.g. a snapshot of the output:
>>>> Residual norms for fieldsplit_0_ solve.
>>>> 0 KSP preconditioned resid norm 0.000000000000e+00 true resid norm
>>>> 0.000000000000e+00 ||r(i)||/||b|| -nan
>>>> Linear solve converged due to CONVERGED_ATOL iterations 0
>>>> Residual norms for fieldsplit_0_ solve.
>>>> 0 KSP preconditioned resid norm 2.619231455875e-01 true resid norm
>>>> 3.637306695895e+02 ||r(i)||/||b|| 1.000000000000e+00
>>>> 1 KSP preconditioned resid norm 9.351491725479e-02 true resid norm
>>>> 6.013334574957e+01 ||r(i)||/||b|| 1.653238255038e-01
>>>> 2 KSP preconditioned resid norm 6.010357491087e-02 true resid norm
>>>> 3.664473273769e+01 ||r(i)||/||b|| 1.007468871928e-01
>>>> 3 KSP preconditioned resid norm 6.006968012944e-02 true resid norm
>>>> 3.696451770148e+01 ||r(i)||/||b|| 1.016260678353e-01
>>>> 4 KSP preconditioned resid norm 4.418407037098e-02 true resid norm
>>>> 3.184810838034e+01 ||r(i)||/||b|| 8.755959022176e-02
>>>> ...
>>>> ...
>>>> 93 KSP preconditioned resid norm 4.549506047737e-04 true resid norm
>>>> 2.877594552685e+00 ||r(i)||/||b|| 7.911333283864e-03
>>>> 94 KSP preconditioned resid norm 4.515424416235e-04 true resid norm
>>>> 2.875249044668e+00 ||r(i)||/||b|| 7.904884809172e-03
>>>> 95 KSP preconditioned resid norm 4.277647876573e-04 true resid norm
>>>> 2.830418831358e+00 ||r(i)||/||b|| 7.781633686685e-03
>>>> 96 KSP preconditioned resid norm 4.244529173876e-04 true resid norm
>>>> 2.807041401408e+00 ||r(i)||/||b|| 7.717362422521e-03
>>>> 97 KSP preconditioned resid norm 4.138326570674e-04 true resid norm
>>>> 2.793663020386e+00 ||r(i)||/||b|| 7.680581413547e-03
>>>> 98 KSP preconditioned resid norm 3.869979433609e-04 true resid norm
>>>> 2.715150386650e+00 ||r(i)||/||b|| 7.464727650583e-03
>>>> 99 KSP preconditioned resid norm 3.847873979265e-04 true resid norm
>>>> 2.706008990336e+00 ||r(i)||/||b|| 7.439595328571e-03
>>>>
>>>> ....
>>>> ....
>>>> 294 KSP preconditioned resid norm 1.416482289961e-04 true resid norm >>>> 2.735750748819e+00 ||r(i)||/||b|| 7.521363958412e-03 >>>> 295 KSP preconditioned resid norm 1.415389087364e-04 true resid norm >>>> 2.742638608355e+00 ||r(i)||/||b|| 7.540300661064e-03 >>>> 296 KSP preconditioned resid norm 1.414967651105e-04 true resid norm >>>> 2.747224243968e+00 ||r(i)||/||b|| 7.552907889424e-03 >>>> 297 KSP preconditioned resid norm 1.413843018303e-04 true resid norm >>>> 2.752574248710e+00 ||r(i)||/||b|| 7.567616587891e-03 >>>> 298 KSP preconditioned resid norm 1.411747949695e-04 true resid norm >>>> 2.765459647367e+00 ||r(i)||/||b|| 7.603042246859e-03 >>>> 299 KSP preconditioned resid norm 1.411609742082e-04 true resid norm >>>> 2.765900464868e+00 ||r(i)||/||b|| 7.604254180683e-03 >>>> 300 KSP preconditioned resid norm 1.409844332838e-04 true resid norm >>>> 2.771790506811e+00 ||r(i)||/||b|| 7.620447596402e-03 >>>> Linear solve did not converge due to DIVERGED_ITS iterations 300 >>>> Residual norms for fieldsplit_0_ solve. >>>> 0 KSP preconditioned resid norm 1.294272083271e-03 true resid norm >>>> 1.776945075651e+00 ||r(i)||/||b|| 1.000000000000e+00 >>>> ... >>>> ... >>>> >>>> >>>> >>>> >>>>> Matt >>>>> >>>>> >>>>>> >>>>>> >>>>>> >>>>>>> This is the kind of investigation you msut be comfortable with if >>>>>>> you want to experiment with these solvers. >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> Linear solve converged due to CONVERGED_RTOL iterations 2 >>>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>>> type: gmres >>>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>>>>>>>>> Orthogonalization with no iterative refinement >>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000 >>>>>>>>>> left preconditioning >>>>>>>>>> has attached null space >>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>> PC Object: 1 MPI processes >>>>>>>>>> type: fieldsplit >>>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>>> factorization FULL >>>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>>> provided matrix >>>>>>>>>> Split info: >>>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>>> Split number 1 Fields 3 >>>>>>>>>> KSP solver for A00 block >>>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>> type: gmres >>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>> divergence=10000 >>>>>>>>>> left preconditioning >>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>> type: ilu >>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>> 0 levels of fill >>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>> matrix ordering: natural >>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>> Factored matrix follows: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=8232, cols=8232 >>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>>>> 
total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>>>> used is 5 >>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=8232, cols=8232 >>>>>>>>>> total: nonzeros=576000, allocated nonzeros=576000 >>>>>>>>>> total number of mallocs used during MatSetValues calls >>>>>>>>>> =0 >>>>>>>>>> using I-node routines: found 2744 nodes, limit used >>>>>>>>>> is 5 >>>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>> type: gmres >>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>> divergence=10000 >>>>>>>>>> left preconditioning >>>>>>>>>> has attached null space >>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>> type: ilu >>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>> 0 levels of fill >>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>> matrix ordering: natural >>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>> Factored matrix follows: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2744, cols=2744 >>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> not using I-node routines >>>>>>>>>> linear system matrix followed by preconditioner matrix: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: schurcomplement >>>>>>>>>> rows=2744, cols=2744 >>>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>>> A11 >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2744, cols=2744 >>>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> not using I-node routines >>>>>>>>>> A10 >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2744, cols=8232 >>>>>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> not using I-node routines >>>>>>>>>> KSP of A00 >>>>>>>>>> KSP Object: >>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>> type: gmres >>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>> divergence=10000 >>>>>>>>>> left preconditioning >>>>>>>>>> using PRECONDITIONED norm type for convergence >>>>>>>>>> test >>>>>>>>>> PC Object: >>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>> type: ilu >>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>> 0 levels of fill >>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>> using diagonal shift on blocks to prevent zero >>>>>>>>>> pivot >>>>>>>>>> matrix ordering: natural >>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>> Factored matrix follows: 
>>>>>>>>>> Matrix Object: 1 MPI >>>>>>>>>> processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=8232, cols=8232 >>>>>>>>>> package used to perform factorization: >>>>>>>>>> petsc >>>>>>>>>> total: nonzeros=576000, allocated >>>>>>>>>> nonzeros=576000 >>>>>>>>>> total number of mallocs used during >>>>>>>>>> MatSetValues calls =0 >>>>>>>>>> using I-node routines: found 2744 >>>>>>>>>> nodes, limit used is 5 >>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=8232, cols=8232 >>>>>>>>>> total: nonzeros=576000, allocated >>>>>>>>>> nonzeros=576000 >>>>>>>>>> total number of mallocs used during >>>>>>>>>> MatSetValues calls =0 >>>>>>>>>> using I-node routines: found 2744 nodes, >>>>>>>>>> limit used is 5 >>>>>>>>>> A01 >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=8232, cols=2744 >>>>>>>>>> total: nonzeros=192000, allocated nonzeros=192000 >>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>> calls =0 >>>>>>>>>> using I-node routines: found 2744 nodes, limit >>>>>>>>>> used is 5 >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=2744, cols=2744 >>>>>>>>>> total: nonzeros=64000, allocated nonzeros=64000 >>>>>>>>>> total number of mallocs used during MatSetValues calls >>>>>>>>>> =0 >>>>>>>>>> not using I-node routines >>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>> type: seqaij >>>>>>>>>> rows=10976, cols=10976, bs=4 >>>>>>>>>> total: nonzeros=1024000, allocated nonzeros=1024000 >>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>> using I-node routines: found 2744 nodes, limit used is 5 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> Matt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> I am testing a small problem with CONSTANT viscosity for grid >>>>>>>>>>>> size of 14^3 with the run time option: >>>>>>>>>>>> -ksp_type gcr -pc_type fieldsplit -pc_fieldsplit_type schur >>>>>>>>>>>> -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view >>>>>>>>>>>> -fieldsplit_0_ksp_type gcr -fieldsplit_0_pc_type gamg >>>>>>>>>>>> -fieldsplit_0_ksp_monitor_true_residual -fieldsplit_0_ksp_converged_reason >>>>>>>>>>>> -fieldsplit_1_ksp_monitor_true_residual >>>>>>>>>>>> >>>>>>>>>>>> Here is my relevant code of the solve function: >>>>>>>>>>>> PetscErrorCode ierr; >>>>>>>>>>>> PetscFunctionBeginUser; >>>>>>>>>>>> ierr = >>>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); //mDa >>>>>>>>>>>> with dof = 4, vx,vy,vz and p. >>>>>>>>>>>> ierr = >>>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpace);CHKERRQ(ierr);//nullSpace for the main >>>>>>>>>>>> system >>>>>>>>>>>> ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>> //register the fieldsplits obtained from options. >>>>>>>>>>>> >>>>>>>>>>>> //Setting up user PC for Schur Complement >>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPc);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPc,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> KSP *subKsp; >>>>>>>>>>>> PetscInt subKspPos = 0; >>>>>>>>>>>> //Set up nearNullspace for A00 block. 
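>>>>>>>>>>>> //(MatNullSpaceCreateRigidBody() below builds the six 3D rigid body
>>>>>>>>>>>> //modes, 3 translations + 3 rotations, from the coordinate Vec, and
>>>>>>>>>>>> //MatSetNearNullSpace() hands them to the A00 matrix for GAMG to use.)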
>>>>>>>>>>>> ierr = >>>>>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>>>>> MatNullSpace rigidBodyModes; >>>>>>>>>>>> Vec coords; >>>>>>>>>>>> ierr = DMGetCoordinates(mDa,&coords);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> MatNullSpaceCreateRigidBody(coords,&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>>> Mat matA00; >>>>>>>>>>>> ierr = >>>>>>>>>>>> KSPGetOperators(subKsp[0],&matA00,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> MatSetNearNullSpace(matA00,rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>>> ierr = MatNullSpaceDestroy(&rigidBodyModes);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> //Position 1 => Ksp corresponding to Schur complement S on >>>>>>>>>>>> pressure space >>>>>>>>>>>> subKspPos = 1; >>>>>>>>>>>> ierr = >>>>>>>>>>>> PCFieldSplitGetSubKSP(mPc,&subKspPos,&subKsp);CHKERRQ(ierr); >>>>>>>>>>>> //Set up the null space of constant pressure. >>>>>>>>>>>> ierr = KSPSetNullSpace(subKsp[1],mNullSpaceP);CHKERRQ(ierr); >>>>>>>>>>>> PetscBool isNull; >>>>>>>>>>>> Mat matSc; >>>>>>>>>>>> ierr = >>>>>>>>>>>> KSPGetOperators(subKsp[1],&matSc,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>> ierr = MatNullSpaceTest(mNullSpaceP,matSc,&isNull); >>>>>>>>>>>> if(!isNull) >>>>>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>>>>> pressure null space \n"); >>>>>>>>>>>> ierr = KSPGetOperators(mKsp,&mA,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>> ierr = >>>>>>>>>>>> MatNullSpaceTest(mNullSpace,mA,&isNull);CHKERRQ(ierr); >>>>>>>>>>>> if(!isNull) >>>>>>>>>>>> SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_PLIB,"not a valid >>>>>>>>>>>> system null space \n"); >>>>>>>>>>>> >>>>>>>>>>>> ierr = PetscFree(subKsp);CHKERRQ(ierr); >>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>> ierr = KSPGetSolution(mKsp,&mX);CHKERRQ(ierr); >>>>>>>>>>>> ierr = KSPGetRhs(mKsp,&mB);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> PetscFunctionReturn(0); >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Wed, Aug 7, 2013 at 2:15 PM, Matthew Knepley < >>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>> >>>>>>>>>>>>> On Wed, Aug 7, 2013 at 7:07 AM, Bishesh Khanal < >>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:34 PM, Matthew Knepley < >>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 AM, Bishesh Khanal < >>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 4:40 PM, Matthew Knepley < >>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 8:06 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 4:14 PM, Matthew Knepley < >>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 8:48 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 3:17 PM, Matthew Knepley < >>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> On Mon, Aug 5, 2013 at 7:54 AM, Bishesh Khanal < >>>>>>>>>>>>>>>>>>>>> bisheshkh at gmail.com> wrote: >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>>>>>>>>> On Wed, Jul 17, 2013 at 9:48 PM, Jed Brown < >>>>>>>>>>>>>>>>>>>>>> jedbrown at mcs.anl.gov> wrote: >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Bishesh Khanal writes: >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> > Now, I implemented two different approaches, each >>>>>>>>>>>>>>>>>>>>>>> for both 2D and 3D, in >>>>>>>>>>>>>>>>>>>>>>> > MATLAB. It works for the smaller sizes but I have >>>>>>>>>>>>>>>>>>>>>>> problems solving it for >>>>>>>>>>>>>>>>>>>>>>> > the problem size I need (250^3 grid size). >>>>>>>>>>>>>>>>>>>>>>> > I use staggered grid with p on cell centers, and >>>>>>>>>>>>>>>>>>>>>>> components of v on cell >>>>>>>>>>>>>>>>>>>>>>> > faces. Similar split up of K to cell center and >>>>>>>>>>>>>>>>>>>>>>> faces to account for the >>>>>>>>>>>>>>>>>>>>>>> > variable viscosity case) >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>> Okay, you're using a staggered-grid finite >>>>>>>>>>>>>>>>>>>>>>> difference discretization of >>>>>>>>>>>>>>>>>>>>>>> variable-viscosity Stokes. This is a common problem >>>>>>>>>>>>>>>>>>>>>>> and I recommend >>>>>>>>>>>>>>>>>>>>>>> starting with PCFieldSplit with Schur complement >>>>>>>>>>>>>>>>>>>>>>> reduction (make that >>>>>>>>>>>>>>>>>>>>>>> work first, then switch to block preconditioner). >>>>>>>>>>>>>>>>>>>>>>> You can use PCLSC or >>>>>>>>>>>>>>>>>>>>>>> (probably better for you), assemble a >>>>>>>>>>>>>>>>>>>>>>> preconditioning matrix containing >>>>>>>>>>>>>>>>>>>>>>> the inverse viscosity in the pressure-pressure >>>>>>>>>>>>>>>>>>>>>>> block. This diagonal >>>>>>>>>>>>>>>>>>>>>>> matrix is a spectrally equivalent (or nearly so, >>>>>>>>>>>>>>>>>>>>>>> depending on >>>>>>>>>>>>>>>>>>>>>>> discretization) approximation of the Schur >>>>>>>>>>>>>>>>>>>>>>> complement. The velocity >>>>>>>>>>>>>>>>>>>>>>> block can be solved with algebraic multigrid. Read >>>>>>>>>>>>>>>>>>>>>>> the PCFieldSplit >>>>>>>>>>>>>>>>>>>>>>> docs (follow papers as appropriate) and let us know >>>>>>>>>>>>>>>>>>>>>>> if you get stuck. >>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>> I was trying to assemble the inverse viscosity >>>>>>>>>>>>>>>>>>>>>> diagonal matrix to use as the preconditioner for the Schur complement solve >>>>>>>>>>>>>>>>>>>>>> step as you suggested. I've few questions about the ways to implement this >>>>>>>>>>>>>>>>>>>>>> in Petsc: >>>>>>>>>>>>>>>>>>>>>> A naive approach that I can think of would be to >>>>>>>>>>>>>>>>>>>>>> create a vector with its components as reciprocal viscosities of the cell >>>>>>>>>>>>>>>>>>>>>> centers corresponding to the pressure variables, and then create a diagonal >>>>>>>>>>>>>>>>>>>>>> matrix from this vector. However I'm not sure about: >>>>>>>>>>>>>>>>>>>>>> How can I make this matrix, (say S_p) compatible to >>>>>>>>>>>>>>>>>>>>>> the Petsc distribution of the different rows of the main system matrix over >>>>>>>>>>>>>>>>>>>>>> different processors ? The main matrix was created using the DMDA structure >>>>>>>>>>>>>>>>>>>>>> with 4 dof as explained before. >>>>>>>>>>>>>>>>>>>>>> The main matrix correspond to the DMDA with 4 dofs >>>>>>>>>>>>>>>>>>>>>> but for the S_p matrix would correspond to only pressure space. Should the >>>>>>>>>>>>>>>>>>>>>> distribution of the rows of S_p among different processor not correspond to >>>>>>>>>>>>>>>>>>>>>> the distribution of the rhs vector, say h' if it is solving for p with Sp = >>>>>>>>>>>>>>>>>>>>>> h' where S = A11 inv(A00) A01 ? 
>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> PETSc distributed vertices, not dofs, so it never >>>>>>>>>>>>>>>>>>>>> breaks blocks. The P distribution is the same as the entire problem divided >>>>>>>>>>>>>>>>>>>>> by 4. >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks Matt. So if I create a new DMDA with same grid >>>>>>>>>>>>>>>>>>>> size but with dof=1 instead of 4, the vertices for this new DMDA will be >>>>>>>>>>>>>>>>>>>> identically distributed as for the original DMDA ? Or should I inform PETSc >>>>>>>>>>>>>>>>>>>> by calling a particular function to make these two DMDA have identical >>>>>>>>>>>>>>>>>>>> distribution of the vertices ? >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Yes. >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Even then I think there might be a problem due to the >>>>>>>>>>>>>>>>>>>> presence of "fictitious pressure vertices". The system matrix (A) contains >>>>>>>>>>>>>>>>>>>> an identity corresponding to these fictitious pressure nodes, thus when >>>>>>>>>>>>>>>>>>>> using a -pc_fieldsplit_detect_saddle_point, will detect a A11 zero block of >>>>>>>>>>>>>>>>>>>> size that correspond to only non-fictitious P-nodes. So the preconditioner >>>>>>>>>>>>>>>>>>>> S_p for the Schur complement outer solve with Sp = h' will also need to >>>>>>>>>>>>>>>>>>>> correspond to only the non-fictitious P-nodes. This means its size does not >>>>>>>>>>>>>>>>>>>> directly correspond to the DMDA grid defined for the original problem. >>>>>>>>>>>>>>>>>>>> Could you please suggest an efficient way of assembling this S_p matrix ? >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Don't use detect_saddle, but split it by fields >>>>>>>>>>>>>>>>>>> -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 4 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> How can I set this split in the code itself without >>>>>>>>>>>>>>>>>> giving it as a command line option when the system matrix is assembled from >>>>>>>>>>>>>>>>>> the DMDA for the whole system with 4 dofs. (i.e. *without >>>>>>>>>>>>>>>>>> * using the DMComposite or *without* using the nested >>>>>>>>>>>>>>>>>> block matrices to assemble different blocks separately and then combine >>>>>>>>>>>>>>>>>> them together). >>>>>>>>>>>>>>>>>> I need the split to get access to the fieldsplit_1_ksp in >>>>>>>>>>>>>>>>>> my code, because not using detect_saddle_point means I cannot use >>>>>>>>>>>>>>>>>> -fieldsplit_1_ksp_constant_null_space due to the presence of identity for >>>>>>>>>>>>>>>>>> the fictitious pressure nodes present in the fieldsplit_1_ block. I need to >>>>>>>>>>>>>>>>>> use PCFieldSplitGetSubKsp() so that I can set proper null-space basis. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> This is currently a real problem with the DMDA. 
In the >>>>>>>>>>>>>>>>> unstructured case, where we always need specialized spaces, you can >>>>>>>>>>>>>>>>> use something like >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> PetscObject pressure; >>>>>>>>>>>>>>>>> MatNullSpace nullSpacePres; >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ierr = DMGetField(dm, 1, &pressure);CHKERRQ(ierr); >>>>>>>>>>>>>>>>> ierr = MatNullSpaceCreate(PetscObjectComm(pressure), >>>>>>>>>>>>>>>>> PETSC_TRUE, 0, NULL, &nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>>> ierr = PetscObjectCompose(pressure, "nullspace", >>>>>>>>>>>>>>>>> (PetscObject) nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>>> MatNullSpaceDestroy(&nullSpacePres);CHKERRQ(ierr); >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> and then DMGetSubDM() uses this information to attach the >>>>>>>>>>>>>>>>> null space to the IS that is created using the information in the >>>>>>>>>>>>>>>>> PetscSection. >>>>>>>>>>>>>>>>> If you use a PetscSection to set the data layout over the >>>>>>>>>>>>>>>>> DMDA, I think this works correctly, but this has not been tested at all and >>>>>>>>>>>>>>>>> is very >>>>>>>>>>>>>>>>> new code. Eventually, I think we want all DMs to use this >>>>>>>>>>>>>>>>> mechanism, but we are still working it out. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Currently I do not use PetscSection. If this makes a >>>>>>>>>>>>>>>> cleaner approach, I'd try it too but may a bit later (right now I'd like >>>>>>>>>>>>>>>> test my model with a quickfix even if it means a little dirty code!) >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Bottom line: For custom null spaces using the default >>>>>>>>>>>>>>>>> layout in DMDA, you need to take apart the PCFIELDSPLIT after it has been >>>>>>>>>>>>>>>>> setup, >>>>>>>>>>>>>>>>> which is somewhat subtle. You need to call KSPSetUp() and >>>>>>>>>>>>>>>>> then reach in and get the PC, and the subKSPs. I don't like this at all, >>>>>>>>>>>>>>>>> but we >>>>>>>>>>>>>>>>> have not reorganized that code (which could be very simple >>>>>>>>>>>>>>>>> and inflexible since its very structured). >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> So I tried to get this approach working but I could not >>>>>>>>>>>>>>>> succeed and encountered some errors. Here is a code snippet: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> //mDa is the DMDA that describes the whole grid with all 4 >>>>>>>>>>>>>>>> dofs (3 velocity components and 1 pressure comp.) >>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> DMKSPSetComputeRHS(mDa,computeRHSTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> DMKSPSetComputeOperators(mDa,computeMatrixTaras3D,this);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = KSPSetDM(mKsp,mDa);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //I've the >>>>>>>>>>>>>>>> mNullSpaceSystem based on mDa, that contains a null space basis for the >>>>>>>>>>>>>>>> complete system. 
>>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>>>> //This I expect would register these options I give:-pc_type fieldsplit >>>>>>>>>>>>>>>> -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>>>> //-pc_fieldsplit_1_fields 3 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter); //Now get the PC >>>>>>>>>>>>>>>> that was obtained from the options (fieldsplit) >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>>>>>> //I have created the matrix mPcForSc using a DMDA with identical //size to >>>>>>>>>>>>>>>> mDa but with dof=1 corresponding to the pressure nodes (say mDaPressure). >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ierr = PCSetUp(mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>>>>>> //The null space is the one that correspond to only pressure nodes, created >>>>>>>>>>>>>>>> using the mDaPressure. >>>>>>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Sorry, you need to return to the old DMDA behavior, so you >>>>>>>>>>>>>>> want >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> -pc_fieldsplit_dm_splits 0 >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, with this it seems I can attach the null space >>>>>>>>>>>>>> properly, but I have a question regarding whether the Schur complement ksp >>>>>>>>>>>>>> solver is actually using the preconditioner matrix I provide. >>>>>>>>>>>>>> When using -ksp_view, the outer level pc object of type >>>>>>>>>>>>>> fieldsplit does report that: "Preconditioner for the Schur complement >>>>>>>>>>>>>> formed from user provided matrix", but in the KSP solver for Schur >>>>>>>>>>>>>> complement S, the pc object (fieldsplit_1_) is of type ilu and doesn't say >>>>>>>>>>>>>> that it is using the matrix I provide. Am I missing something here ? >>>>>>>>>>>>>> Below are the relevant commented code snippet and the output >>>>>>>>>>>>>> of the -ksp_view >>>>>>>>>>>>>> (The options I used: -pc_type fieldsplit -pc_fieldsplit_type >>>>>>>>>>>>>> schur -pc_fieldsplit_dm_splits 0 -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>> -pc_fieldsplit_1_fields 3 -ksp_converged_reason -ksp_view ) >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> If ILU does not error, it means it is using your matrix, >>>>>>>>>>>>> because the Schur complement matrix cannot be factored, and FS says it is >>>>>>>>>>>>> using your matrix. >>>>>>>>>>>>> >>>>>>>>>>>>> Matt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> Code snippet: >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> KSPSetNullSpace(mKsp,mNullSpaceSystem);CHKERRQ(ierr); //The nullspace for >>>>>>>>>>>>>> the whole system >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> KSPSetFromOptions(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = KSPSetUp(mKsp);CHKERRQ(ierr); >>>>>>>>>>>>>> //Set up mKsp with the options provided with fieldsplit and the fields >>>>>>>>>>>>>> associated with the two splits. 
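>>>>>>>>>>>>>> //(Note: the KSPSetUp() above has to come before the
>>>>>>>>>>>>>> //PCFieldSplitGetSubKSP() below, since the fieldsplit sub-KSPs
>>>>>>>>>>>>>> //exist only after the preconditioner has been set up.)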
>>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = KSPGetPC(mKsp,&mPcOuter);CHKERRQ(ierr); >>>>>>>>>>>>>> //Get the fieldsplit pc set up from the options >>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> PCFieldSplitSchurPrecondition(mPcOuter,PC_FIELDSPLIT_SCHUR_PRE_USER,mPcForSc);CHKERRQ(ierr); >>>>>>>>>>>>>> //Use mPcForSc as the preconditioner for Schur Complement >>>>>>>>>>>>>> >>>>>>>>>>>>>> KSP *kspSchur; >>>>>>>>>>>>>> PetscInt kspSchurPos = 1; >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> PCFieldSplitGetSubKSP(mPcOuter,&kspSchurPos,&kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = >>>>>>>>>>>>>> KSPSetNullSpace(kspSchur[1],mNullSpacePressure);CHKERRQ(ierr); >>>>>>>>>>>>>> //Attach the null-space for the Schur complement ksp solver. >>>>>>>>>>>>>> ierr = PetscFree(kspSchur);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>>> ierr = KSPSolve(mKsp,NULL,NULL);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> the output of the -ksp_view >>>>>>>>>>>>>> KSP Object: 1 MPI processes >>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> has attached null space >>>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>>> PC Object: 1 MPI processes >>>>>>>>>>>>>> type: fieldsplit >>>>>>>>>>>>>> FieldSplit with Schur preconditioner, blocksize = 4, >>>>>>>>>>>>>> factorization FULL >>>>>>>>>>>>>> Preconditioner for the Schur complement formed from user >>>>>>>>>>>>>> provided matrix >>>>>>>>>>>>>> Split info: >>>>>>>>>>>>>> Split number 0 Fields 0, 1, 2 >>>>>>>>>>>>>> Split number 1 Fields 3 >>>>>>>>>>>>>> KSP solver for A00 block >>>>>>>>>>>>>> KSP Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>>> PC Object: (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>>> type: ilu >>>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>>> total: nonzeros=140625, allocated nonzeros=140625 >>>>>>>>>>>>>> 
total number of mallocs used during MatSetValues >>>>>>>>>>>>>> calls =0 >>>>>>>>>>>>>> using I-node routines: found 729 nodes, limit >>>>>>>>>>>>>> used is 5 >>>>>>>>>>>>>> KSP solver for S = A11 - A10 inv(A00) A01 >>>>>>>>>>>>>> KSP Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>> GMRES: restart=30, using Classical (unmodified) >>>>>>>>>>>>>> Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> has attached null space >>>>>>>>>>>>>> using PRECONDITIONED norm type for convergence test >>>>>>>>>>>>>> PC Object: (fieldsplit_1_) 1 MPI processes >>>>>>>>>>>>>> type: ilu >>>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>> using diagonal shift on blocks to prevent zero pivot >>>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>>> Factored matrix follows: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>>> package used to perform factorization: petsc >>>>>>>>>>>>>> total: nonzeros=15625, allocated >>>>>>>>>>>>>> nonzeros=15625 >>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> linear system matrix followed by preconditioner >>>>>>>>>>>>>> matrix: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: schurcomplement >>>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>>> Schur complement A11 - A10 inv(A00) A01 >>>>>>>>>>>>>> A11 >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>>> total: nonzeros=15625, allocated >>>>>>>>>>>>>> nonzeros=15625 >>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> A10 >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=729, cols=2187 >>>>>>>>>>>>>> total: nonzeros=46875, allocated >>>>>>>>>>>>>> nonzeros=46875 >>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> KSP of A00 >>>>>>>>>>>>>> KSP Object: >>>>>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>>> type: gmres >>>>>>>>>>>>>> GMRES: restart=30, using Classical >>>>>>>>>>>>>> (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >>>>>>>>>>>>>> GMRES: happy breakdown tolerance 1e-30 >>>>>>>>>>>>>> maximum iterations=10000, initial guess is >>>>>>>>>>>>>> zero >>>>>>>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, >>>>>>>>>>>>>> divergence=10000 >>>>>>>>>>>>>> left preconditioning >>>>>>>>>>>>>> using PRECONDITIONED norm type for >>>>>>>>>>>>>> convergence test >>>>>>>>>>>>>> PC Object: >>>>>>>>>>>>>> (fieldsplit_0_) 1 MPI processes >>>>>>>>>>>>>> type: ilu >>>>>>>>>>>>>> ILU: out-of-place factorization >>>>>>>>>>>>>> 0 levels of fill >>>>>>>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>>>>>>> using diagonal shift on blocks to prevent >>>>>>>>>>>>>> zero pivot >>>>>>>>>>>>>> matrix ordering: natural >>>>>>>>>>>>>> factor fill ratio given 1, needed 1 >>>>>>>>>>>>>> Factored matrix follows: 
>>>>>>>>>>>>>> Matrix Object: 1 >>>>>>>>>>>>>> MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>>> package used to perform >>>>>>>>>>>>>> factorization: petsc >>>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>> using I-node routines: found 729 >>>>>>>>>>>>>> nodes, limit used is 5 >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=2187, cols=2187 >>>>>>>>>>>>>> total: nonzeros=140625, allocated >>>>>>>>>>>>>> nonzeros=140625 >>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>>> A01 >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=2187, cols=729 >>>>>>>>>>>>>> total: nonzeros=46875, allocated >>>>>>>>>>>>>> nonzeros=46875 >>>>>>>>>>>>>> total number of mallocs used during >>>>>>>>>>>>>> MatSetValues calls =0 >>>>>>>>>>>>>> using I-node routines: found 729 nodes, >>>>>>>>>>>>>> limit used is 5 >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=729, cols=729 >>>>>>>>>>>>>> total: nonzeros=15625, allocated nonzeros=15625 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues >>>>>>>>>>>>>> calls =0 >>>>>>>>>>>>>> not using I-node routines >>>>>>>>>>>>>> linear system matrix = precond matrix: >>>>>>>>>>>>>> Matrix Object: 1 MPI processes >>>>>>>>>>>>>> type: seqaij >>>>>>>>>>>>>> rows=2916, cols=2916, bs=4 >>>>>>>>>>>>>> total: nonzeros=250000, allocated nonzeros=250000 >>>>>>>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>>>>>>> using I-node routines: found 729 nodes, limit used is 5 >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> or >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> PCFieldSplitSetDMSplits(pc, PETSC_FALSE) >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> The errors I get when running with options: -pc_type >>>>>>>>>>>>>>>> fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 >>>>>>>>>>>>>>>> -pc_fieldsplit_1_fields 3 >>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>>>>>>> ------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation for this >>>>>>>>>>>>>>>> object type! >>>>>>>>>>>>>>>> [0]PETSC ERROR: Support only implemented for 2d! >>>>>>>>>>>>>>>> [0]PETSC ERROR: >>>>>>>>>>>>>>>> ------------------------------------------------------------------------ >>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent >>>>>>>>>>>>>>>> updates. >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble >>>>>>>>>>>>>>>> shooting. >>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>>>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>>>>>>>> [0]PETSC ERROR: src/AdLemMain on a arch-linux2-cxx-debug named edwards by bkhanal Tue Aug  6 17:35:30 2013
>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from /home/bkhanal/Documents/softwares/petsc-3.4.2/arch-linux2-cxx-debug/lib
>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Fri Jul 19 14:25:01 2013
>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=g77 --with-cxx=g++ --download-f-blas-lapack=1 --download-mpich=1 -with-clanguage=cxx --download-hypre=1
>>>>>>>>>>>>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM_DA() line 188 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/impls/da/dacreate.c
>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMCreateSubDM() line 1267 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/dm/interface/dm.c
>>>>>>>>>>>>>>>> [0]PETSC ERROR: PCFieldSplitSetDefaults() line 337 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>>>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp_FieldSplit() line 458 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>>>>>>>>>>>>>>> [0]PETSC ERROR: PCSetUp() line 890 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/pc/interface/precon.c
>>>>>>>>>>>>>>>> [0]PETSC ERROR: KSPSetUp() line 278 in /home/bkhanal/Documents/softwares/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c
>>>>>>>>>>>>>>>> [0]PETSC ERROR: solveModel() line 181 in "unknowndirectory/"/user/bkhanal/home/works/AdLemModel/src/PetscAdLemTaras3D.cxx
>>>>>>>>>>>>>>>> WARNING! There are options you set that were not used!
>>>>>>>>>>>>>>>> WARNING! could be spelling mistake, etc!
>>>>>>>>>>>>>>>> Option left: name:-pc_fieldsplit_1_fields value: 3
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
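For reference, the pressure null space used above is just the constant functions, so it can be created without any explicit basis vectors. A minimal sketch against the PETSc 3.4-era API; mDaPressure is the pressure-only DMDA mentioned in the thread and is assumed to exist already:

    MatNullSpace mNullSpacePressure;

    /* PETSC_TRUE says the null space contains the constant vector; that is the
       whole null space here, so no additional basis vectors are passed (0, NULL). */
    ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject) mDaPressure),
                              PETSC_TRUE, 0, NULL, &mNullSpacePressure);CHKERRQ(ierr);

    /* ... attach it to the Schur complement KSP with KSPSetNullSpace(), then
       drop our reference; the KSP keeps its own. */
    ierr = MatNullSpaceDestroy(&mNullSpacePressure);CHKERRQ(ierr);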
-------------- next part --------------
KSP Object:(fieldsplit_0_) 1 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=4, initial guess is zero
  tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object:(fieldsplit_0_) 1 MPI processes
  type: gamg
    MG: type is MULTIPLICATIVE, levels=3 cycles=v
      Cycles per PCApply=1
      Using Galerkin computed coarse grid matrices
  Coarse grid solver -- level -------------------------------
    KSP Object:  (fieldsplit_0_mg_coarse_)  1 MPI processes
      type: preonly
      maximum iterations=1, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using NONE norm type for convergence test
    PC Object:  (fieldsplit_0_mg_coarse_)  1 MPI processes
      type: bjacobi
        block Jacobi: number of blocks = 1
        Local solve info for each block is in the following KSP and PC objects:
      [0] number of local blocks = 1, first local block number = 0
        [0] local block number 0
        KSP Object:  (fieldsplit_0_mg_coarse_sub_)  1 MPI processes
          type: preonly
          maximum iterations=1, initial guess is zero
          tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
          left preconditioning
          using NONE norm type for convergence test
        PC Object:  (fieldsplit_0_mg_coarse_sub_)  1 MPI processes
          type: lu
            LU: out-of-place factorization
            tolerance for zero pivot 2.22045e-14
            using diagonal shift on blocks to prevent zero pivot
            matrix ordering: nd
            factor fill ratio given 5, needed 1
              Factored matrix follows:
                Matrix Object:  1 MPI processes
                  type: seqaij
                  rows=96, cols=96, bs=6
                  package used to perform factorization: petsc
                  total: nonzeros=9216, allocated nonzeros=9216
                  total number of mallocs used during MatSetValues calls =0
                    using I-node routines: found 20 nodes, limit used is 5
          linear system matrix = precond matrix:
          Matrix Object:  1 MPI processes
            type: seqaij
            rows=96, cols=96, bs=6
            total: nonzeros=9216, allocated nonzeros=9216
            total number of mallocs used during MatSetValues calls =0
              using I-node routines: found 20 nodes, limit used is 5
        - - - - - - - - - - - - - - - - - -
      linear system matrix = precond matrix:
      Matrix Object:  1 MPI processes
        type: seqaij
        rows=96, cols=96, bs=6
        total: nonzeros=9216, allocated nonzeros=9216
        total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 20 nodes, limit used is 5
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object:  (fieldsplit_0_mg_levels_1_)  1 MPI processes
      type: chebyshev
        Chebyshev: eigenvalue estimates:  min = 3.45879, max = 72.6346
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object:  (fieldsplit_0_mg_levels_1_)  1 MPI processes
      type: jacobi
      linear system matrix = precond matrix:
      Matrix Object:  1 MPI processes
        type: seqaij
        rows=4074, cols=4074, bs=6
        total: nonzeros=2248308, allocated nonzeros=2248308
        total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 1348 nodes, limit used is 5
    Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object:  (fieldsplit_0_mg_levels_2_)  1 MPI processes
      type: chebyshev
        Chebyshev: eigenvalue estimates:  min = 0.143329, max = 3.0099
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object:  (fieldsplit_0_mg_levels_2_)  1 MPI processes
      type: jacobi
      linear system matrix = precond matrix:
      Matrix Object:  1 MPI processes
        type: seqaij
        rows=8232, cols=8232
        total: nonzeros=576000, allocated nonzeros=576000
        total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 2744 nodes, limit used is 5
    Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Matrix Object:  1 MPI processes
    type: seqaij
    rows=8232, cols=8232
    total: nonzeros=576000, allocated nonzeros=576000
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 2744 nodes, limit used is 5
[The same (fieldsplit_0_) KSP/PC view is printed nine more times in this attachment; the verbatim duplicates are omitted here.]
Linear solve did not converge due to DIVERGED_ITS iterations 1
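For readers skimming the archive, the sequence that finally worked in this thread can be collected in one place. This is only a sketch against the PETSc 3.4-era API, run with -pc_fieldsplit_dm_splits 0 -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3; mDa, mKsp, mPcForSc, mNullSpaceSystem and mNullSpacePressure are the poster's objects and are assumed to be created elsewhere:

    KSP      *kspSchur;
    PetscInt  nSplits;
    PC        pcOuter;

    ierr = KSPSetDM(mKsp, mDa);CHKERRQ(ierr);
    ierr = KSPSetNullSpace(mKsp, mNullSpaceSystem);CHKERRQ(ierr);  /* whole-system null space */
    ierr = KSPSetFromOptions(mKsp);CHKERRQ(ierr);
    ierr = KSPSetUp(mKsp);CHKERRQ(ierr);           /* must run before reaching into the PC */
    ierr = KSPGetPC(mKsp, &pcOuter);CHKERRQ(ierr);
    ierr = PCFieldSplitSchurPrecondition(pcOuter, PC_FIELDSPLIT_SCHUR_PRE_USER, mPcForSc);CHKERRQ(ierr);
    ierr = PCSetUp(pcOuter);CHKERRQ(ierr);
    ierr = PCFieldSplitGetSubKSP(pcOuter, &nSplits, &kspSchur);CHKERRQ(ierr); /* nSplits is an output */
    ierr = KSPSetNullSpace(kspSchur[1], mNullSpacePressure);CHKERRQ(ierr);    /* pressure-only null space on the Schur solve */
    ierr = PetscFree(kspSchur);CHKERRQ(ierr);      /* frees the array, not the sub-KSPs */
    ierr = KSPSolve(mKsp, NULL, NULL);CHKERRQ(ierr);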
From suifengls at gmail.com  Fri Aug 23 12:14:24 2013
From: suifengls at gmail.com (Longxiang Chen)
Date: Fri, 23 Aug 2013 10:14:24 -0700
Subject: [petsc-users] Copy X in ksp solver
Message-ID: 

Dear all,

Running with ./ex23 -ksp_type cg, everything works well.
If running with MPI: mpiexec -n 4 ./ex23 -ksp_type cg
In /src/ksp/ksp/impls/cg/cg.c, when I try to make a copy of the solution vector X by VecCopy(X, myX), the following error comes out.
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Object is in wrong state!
[1]PETSC ERROR: Not for unassembled vector!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: ./ex23 on a ksp-intel named head.cluster by lchen Fri Aug 23 11:57:49 2013
[1]PETSC ERROR: Libraries linked from /home/lchen/petsc-3.4.2/ftksp-intel/lib
[1]PETSC ERROR: Configure run at Mon Aug 19 18:32:17 2013
[1]PETSC ERROR: Configure options PETSC_ARCH=ksp-intel --with-shared-libraries=0 --with-blas-lib=/opt/acml5.3.0/gfortran64/lib/libacml.a --with-lapack-lib=/opt/acml5.3.0/gfortran64/lib/libacml.a
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: VecCopy() line 1686 in /home/lchen/petsc-3.4.2/src/vec/vec/interface/vector.c

But VecCopy(myX, X) works.
Any problem with my copy?
Are there any methods to copy X to myX?
Thanks,

Longxiang
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov  Fri Aug 23 12:59:26 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 23 Aug 2013 12:59:26 -0500
Subject: [petsc-users] Copy X in ksp solver
In-Reply-To: 
References: 
Message-ID: <093E7F46-FDD3-44F4-9C19-921EAF53CC98@mcs.anl.gov>

   This error happens because of

   if (x->stash.insertmode != NOT_SET_VALUES) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_ARG_WRONGSTATE,"Not for unassembled vector");

   It can only happen if

   1) memory corruption due to previously writing outside array regions or

   2) VecSetValues() was called on the input vector but VecAssemblyBegin/End() was never called.

   So verify you are not calling VecSetValues() on that X without the assembly, and if it looks good then run with valgrind: http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind

On Aug 23, 2013, at 12:14 PM, Longxiang Chen wrote:

> Dear all,
>
> Running with ./ex23 -ksp_type cg, everything works well.
> If running with MPI: mpiexec -n 4 ./ex23 -ksp_type cg
> In /src/ksp/ksp/impls/cg/cg.c, when I try to make a copy of the solution vector X by VecCopy(X, myX), the following error comes out.
>
> [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> [1]PETSC ERROR: Object is in wrong state!
> [1]PETSC ERROR: Not for unassembled vector!
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [1]PETSC ERROR: See docs/index.html for manual pages.
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: ./ex23 on a ksp-intel named head.cluster by lchen Fri Aug 23 11:57:49 2013
> [1]PETSC ERROR: Libraries linked from /home/lchen/petsc-3.4.2/ftksp-intel/lib
> [1]PETSC ERROR: Configure run at Mon Aug 19 18:32:17 2013
> [1]PETSC ERROR: Configure options PETSC_ARCH=ksp-intel --with-shared-libraries=0 --with-blas-lib=/opt/acml5.3.0/gfortran64/lib/libacml.a --with-lapack-lib=/opt/acml5.3.0/gfortran64/lib/libacml.a
> [1]PETSC ERROR: ------------------------------------------------------------------------
> [1]PETSC ERROR: VecCopy() line 1686 in /home/lchen/petsc-3.4.2/src/vec/vec/interface/vector.c
>
> But VecCopy(myX, X) works.
> Any problem with my copy?
> Are there any methods to copy X to myX?
> Thanks,
>
> Longxiang
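To make Barry's point 2) concrete, here is a minimal sketch of the assembly rule; X is the vector in question, and the index, value, and myX names are hypothetical:

    PetscInt    row = 0;
    PetscScalar val = 1.0;
    Vec         myX;

    ierr = VecSetValues(X, 1, &row, &val, INSERT_VALUES);CHKERRQ(ierr);
    /* X is now in an unassembled state; VecCopy(X, myX) would raise
       "Not for unassembled vector" until the assembly pair runs. */
    ierr = VecAssemblyBegin(X);CHKERRQ(ierr);
    ierr = VecAssemblyEnd(X);CHKERRQ(ierr);

    ierr = VecDuplicate(X, &myX);CHKERRQ(ierr); /* myX gets X's parallel layout */
    ierr = VecCopy(X, myX);CHKERRQ(ierr);       /* legal now: X is assembled */

VecCopy(myX, X) worked in the report above because the assembly check is made on the source vector being read, not on the destination.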
On Aug 23, 2013, at 12:14 PM, Longxiang Chen wrote:

> Dear all,
> 
> Running with ./ex23 -ksp_type cg, everything works well.
> If running with mpi: mpiexec -n 4 ./ex23 -ksp_type cg
> In /src/ksp/ksp/impls/cg/cg.c, when I try to make a copy of the solution vector X by VecCopy(X, myX), the following error comes out.
> 
> [quoted error trace trimmed; it is identical to the one at the top of this message]
> 
> But VecCopy(myX, X) works.
> Is there any problem with my copy?
> Are there any methods to copy X to myX?
> Thanks,
> 
> Longxiang

From mpovolot at purdue.edu  Fri Aug 23 16:28:32 2013
From: mpovolot at purdue.edu (Michael Povolotskyi)
Date: Fri, 23 Aug 2013 17:28:32 -0400
Subject: [petsc-users] Mat Destroy
Message-ID: <5217D400.5090100@purdue.edu>

Dear Petsc developers,
I have a question about the MatDestroy function.
Does it free the memory immediately?
Or do you keep the memory for further usage?
Thank you,
Michael.

-- 
Michael Povolotskyi, PhD
Research Assistant Professor
Network for Computational Nanotechnology
207 S Martin Jischke Drive
Purdue University, DLR, room 441-10
West Lafayette, Indiana 47907

phone: +1-765-494-9396
fax: +1-765-496-6026

From bsmith at mcs.anl.gov  Fri Aug 23 16:39:41 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 23 Aug 2013 16:39:41 -0500
Subject: [petsc-users] Mat Destroy
In-Reply-To: <5217D400.5090100@purdue.edu>
References: <5217D400.5090100@purdue.edu>
Message-ID: 

On Aug 23, 2013, at 4:28 PM, Michael Povolotskyi wrote:

> Dear Petsc developers,
> I have a question about the MatDestroy function.
> Does it free the memory immediately?
> Or do you keep the memory for further usage?

   We call free() at that point. But note that in Unix this does not mean the memory is returned to the operating system, so you will not see the process memory go down. If you then allocate new objects they will reuse this memory.

   Barry

> 
> Thank you,
> Michael.
> 
> -- 
> Michael Povolotskyi, PhD
> Research Assistant Professor
> Network for Computational Nanotechnology
> 207 S Martin Jischke Drive
> Purdue University, DLR, room 441-10
> West Lafayette, Indiana 47907
> 
> phone: +1-765-494-9396
> fax: +1-765-496-6026
> 

From jedbrown at mcs.anl.gov  Fri Aug 23 18:37:25 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Fri, 23 Aug 2013 18:37:25 -0500
Subject: [petsc-users] Mat Destroy
In-Reply-To: 
References: <5217D400.5090100@purdue.edu> 
Message-ID: <87fvu0htca.fsf@mcs.anl.gov>

Barry Smith writes:

> We call free() at that point. But note that in Unix this does not
> mean the memory is returned to the operating system, so you will not
> see the process memory go down. If you then allocate new objects
> they will reuse this memory.

Also note that MatDestroy only releases a reference, so if another object
still holds a reference to your matrix, nothing will be freed.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 
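[Editor's note: a minimal sketch of the reference counting Jed describes above, assuming PETSc 3.4; the matrix, its values, and the KSP setup are illustrative and not part of the original thread.]

#include <petscksp.h>
int main(int argc, char **argv)
{
  Mat A;
  KSP ksp;
  PetscInitialize(&argc, &argv, NULL, NULL);
  MatCreateSeqAIJ(PETSC_COMM_SELF, 4, 4, 1, NULL, &A);
  MatSetValue(A, 0, 0, 1.0, INSERT_VALUES);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  KSPCreate(PETSC_COMM_SELF, &ksp);
  KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN); /* the KSP takes its own reference to A  */
  MatDestroy(&A);   /* drops only the caller's reference; the matrix storage survives        */
  KSPDestroy(&ksp); /* the last reference is released here, and free() is actually called    */
  PetscFinalize();
  return 0;
}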
From zyzhang at nuaa.edu.cn  Sat Aug 24 00:07:41 2013
From: zyzhang at nuaa.edu.cn (Zhang)
Date: Sat, 24 Aug 2013 13:07:41 +0800 (GMT+08:00)
Subject: [petsc-users] which is the best PC for GMRES Poisson solver?
Message-ID: <111453d.38c8.140aeb87e7e.Coremail.zyzhang@nuaa.edu.cn>

Hi,

Recently I wrote a code of the projection method for solving the incompressible flow. It includes a Poisson solve for the pressure.
I ported the code to PETSc.

However, in the case of the 3D lid-driven flow, the speed of the PETSc version is not advantageous yet.

I tried different combinations of preconditioners with the GMRES solver. Among them GMRES+PCILU and GMRES+SOR were the fastest.
For an 80x80x80 grid, the serial GMRES+SOR version used 185.816307 secs. However, for the 120x120x120 case it diverged, and so did GMRES+PCILU.

Then I tried a parallel comparison, as follows,

#############################################################################################
# with Petsc-3.4.2, time comparison (sec)
# size (80,80,80), 200 steps, dt=0.002
#debug version 177.695939 sec
#opt version 106.694733 sec
#mpirun -np 8 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2
#debug version 514.718544 sec
#opt version 331.114555 sec
#mpirun -np 12 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2
#debug version 796.765428 sec
#opt version 686.151788 sec
#mpirun -np 16 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2

I do know that sometimes a speed problem is due not to PETSc but to my own code, so any suggestion is welcome
about which combination is best for such a computation. I know solving Poisson and Helmholtz equations is very common in numerical work.
Thank you first.

BTW, I also tried to use superLU_dist as the PC:
#mpirun -np 16 ./nsproj ../input/sample_20.zzy.dat -ksp_type gmres -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_rtol 1.e-2

But with 16 nodes, except for the 20x20x20 grid, all larger grids run extremely slowly.
Since I have never used a direct PC before, is it true that good usage of a direct LU preconditioner
requires that the number of processes be much larger, so that the direct-solver work on each node is small enough to be usable?

Cheers,

Zhenyu
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ztdepyahoo at 163.com  Sat Aug 24 03:23:38 2013
From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=)
Date: Sat, 24 Aug 2013 16:23:38 +0800 (CST)
Subject: [petsc-users] how to use the Euclid preconditioner from Hypre in the petsc
Message-ID: <1431bf84.16a4e.140af6be40a.Coremail.ztdepyahoo@163.com>

I configured PETSc without the hypre download. I want to download and install hypre separately. Could you please tell me how to use the Euclid preconditioner from hypre in PETSc.
Regards
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From zyzhang at nuaa.edu.cn  Sat Aug 24 02:54:59 2013
From: zyzhang at nuaa.edu.cn (Zhang)
Date: Sat, 24 Aug 2013 15:54:59 +0800 (GMT+08:00)
Subject: [petsc-users] A problem with sizeof grid
Message-ID: <7d8f01.392c.140af51aa5f.Coremail.zyzhang@nuaa.edu.cn>

Hi,

I ran my 3D structured CFD code for a cavity flow with different grid sizes from 20x20x20 to 80x80x80. Whether fast or not, they all converged and gave me the expected result.

But when I tried a 120x120x120 grid, the errors below appeared. Since I believe my physical memory is large enough for these tests, I just wonder what really happened with the GMRES solver + hypre PC. I hope you may leave me a hint. Thanks.
Zhenyu mpirun -np 8 ./nsproj ../input/sample_120.zzy.dat -px 2 -py 2 -pz 2 -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2 ....GRID POINT DIMENSIONS W/ GHOST CELLS : 122 X 122 X 122 px= 2, py= 2, pz= 2 ....BEGINNING SIMULATION time: 0, SolvePressure iter= 0, residnorm= 0.000000e+00 time: 20, SolvePressure iter= 2, residnorm= 1.329163e+21 [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Error in external library! [0]PETSC ERROR: Error in HYPRE solver, error code 1! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 [0]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib [0]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 [1]PETSC ERROR: --------------------- Error Message ------------------------------------ [1]PETSC ERROR: Error in external library! [1]PETSC ERROR: Error in HYPRE solver, error code 1! [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [1]PETSC ERROR: See docs/changes/index.html for recent updates. [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [1]PETSC ERROR: See docs/index.html for manual pages. [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 [1]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib [1]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 [1]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message ------------------------------------ [2]PETSC ERROR: Error in external library! [2]PETSC ERROR: Error in HYPRE solver, error code 1! [2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [2]PETSC ERROR: See docs/changes/index.html for recent updates. [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [2]PETSC ERROR: See docs/index.html for manual pages. 
[2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 [2]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib [2]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 [2]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist [2]PETSC ERROR: ------------------------------------------------------------------------ [2]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c [2]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c [2]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h [2]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c [3]PETSC ERROR: --------------------- Error Message ------------------------------------ [3]PETSC ERROR: Error in external library! [3]PETSC ERROR: Error in HYPRE solver, error code 1! [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [3]PETSC ERROR: See docs/changes/index.html for recent updates. [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [3]PETSC ERROR: See docs/index.html for manual pages. [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 [3]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib [3]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 [3]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist [3]PETSC ERROR: ------------------------------------------------------------------------ [3]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c [3]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c [3]PETSC ERROR: [4]PETSC ERROR: --------------------- Error Message ------------------------------------ [4]PETSC ERROR: Error in external library! [4]PETSC ERROR: Error in HYPRE solver, error code 1! [4]PETSC ERROR: ------------------------------------------------------------------------ [4]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [4]PETSC ERROR: See docs/changes/index.html for recent updates. [4]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [4]PETSC ERROR: See docs/index.html for manual pages. 
[4]PETSC ERROR: ------------------------------------------------------------------------ [4]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 [4]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib [4]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 [4]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist [4]PETSC ERROR: ------------------------------------------------------------------------ [4]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c [4]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c [4]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h [4]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c [4]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c [4]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c [4]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc [4]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc [5]PETSC ERROR: --------------------- Error Message ------------------------------------ [5]PETSC ERROR: Error in external library! [5]PETSC ERROR: Error in HYPRE solver, error code 1! [5]PETSC ERROR: ------------------------------------------------------------------------ [5]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [5]PETSC ERROR: See docs/changes/index.html for recent updates. [5]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [5]PETSC ERROR: See docs/index.html for manual pages. 
[5]PETSC ERROR: ------------------------------------------------------------------------ [5]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 [5]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib [5]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 [5]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist [5]PETSC ERROR: ------------------------------------------------------------------------ [5]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c [5]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c [5]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h [5]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c [5]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c [5]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c [5]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc [5]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc [6]PETSC ERROR: --------------------- Error Message ------------------------------------ [6]PETSC ERROR: Error in external library! [6]PETSC ERROR: Error in HYPRE solver, error code 1! [6]PETSC ERROR: ------------------------------------------------------------------------ [6]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [6]PETSC ERROR: See docs/changes/index.html for recent updates. [6]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [6]PETSC ERROR: See docs/index.html for manual pages. 
[6]PETSC ERROR: ------------------------------------------------------------------------ [6]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 [6]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib [6]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 [6]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist [6]PETSC ERROR: ------------------------------------------------------------------------ [6]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c [6]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c [6]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h [6]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c [6]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c [6]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c [7]PETSC ERROR: --------------------- Error Message ------------------------------------ [7]PETSC ERROR: Error in external library! [7]PETSC ERROR: Error in HYPRE solver, error code 1! [7]PETSC ERROR: ------------------------------------------------------------------------ [7]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [7]PETSC ERROR: See docs/changes/index.html for recent updates. [7]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [7]PETSC ERROR: See docs/index.html for manual pages. 
[7]PETSC ERROR: ------------------------------------------------------------------------ [7]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 [7]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib [7]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 [7]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist [7]PETSC ERROR: ------------------------------------------------------------------------ [7]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c [7]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c [7]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h [7]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c [7]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c [7]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c [7]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc [7]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc [0]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c [0]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h [0]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c [0]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c [0]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc [0]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c [1]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h [1]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c [1]PETSC ERROR: KSPSolve_GMRES() line 239 in 
/home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c
[1]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc
[1]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc
[2]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c
[2]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c
[2]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc
[2]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc
KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h
[3]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c
[3]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c
[3]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c
[3]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc
[3]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc
[6]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc
[6]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc
--------------------------------------------------------------------------
mpirun noticed that the job aborted, but has no info as to the process
that caused that situation.
--------------------------------------------------------------------------
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ztdepyahoo at 163.com  Sat Aug 24 04:57:00 2013
From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=)
Date: Sat, 24 Aug 2013 17:57:00 +0800 (CST)
Subject: [petsc-users] use kspsolve repeately
Message-ID: <24845df7.17887.140afc15fbf.Coremail.ztdepyahoo@163.com>

In my code, I need to use KSPSolve in the following way:

KSP ksp;

KSPCreate(PETSC_COMM_WORLD,&ksp);
KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);
KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER);
KSPSetInitialGuessNonzero(ksp,PETSC_TRUE);
KSPSetType(ksp,KSPBCGS);
KSPSetFromOptions(ksp);

set the matrix A values and the right-hand sides bu and bv.
kspsolve(A,bu);
kspsolve(A,bv);
change the values of matrix A and bu, bv,
kspsolve(A,bu);
kspsolve(A,bv);

The first and second calls to KSPSolve use the same preconditioner,
but which preconditioner do the third and fourth calls use, since the
values of the matrix A have changed?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Sat Aug 24 05:17:07 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 24 Aug 2013 05:17:07 -0500
Subject: [petsc-users] how to use the Euclid preconditioner from Hypre in the petsc
In-Reply-To: <1431bf84.16a4e.140af6be40a.Coremail.ztdepyahoo@163.com>
References: <1431bf84.16a4e.140af6be40a.Coremail.ztdepyahoo@163.com>
Message-ID: 

On Sat, Aug 24, 2013 at 3:23 AM, ??? wrote:

> I configured PETSc without the hypre download. I want to download and
> install hypre separately. Could you please tell me how to use the
> Euclid preconditioner from hypre in PETSc.
>

You can always use -help, but I believe it's -pc_hypre_type euclid

   Matt

> Regards
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
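[Editor's note: an illustrative command line for the answer above; the executable name is a placeholder, and the PETSc build must have been configured with hypre support. -pc_hypre_type only takes effect once -pc_type hypre has been selected.]

mpiexec -n 4 ./app -ksp_type gmres -pc_type hypre -pc_hypre_type euclid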
From knepley at gmail.com  Sat Aug 24 05:18:17 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 24 Aug 2013 05:18:17 -0500
Subject: [petsc-users] use kspsolve repeately
In-Reply-To: <24845df7.17887.140afc15fbf.Coremail.ztdepyahoo@163.com>
References: <24845df7.17887.140afc15fbf.Coremail.ztdepyahoo@163.com>
Message-ID: 

On Sat, Aug 24, 2013 at 4:57 AM, ??? wrote:

> In my code, I need to use KSPSolve in the following way:
>
> KSP ksp;
>
> KSPCreate(PETSC_COMM_WORLD,&ksp);
> KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);
> KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER);
> KSPSetInitialGuessNonzero(ksp,PETSC_TRUE);
> KSPSetType(ksp,KSPBCGS);
> KSPSetFromOptions(ksp);
>
> set the matrix A values and the right-hand sides bu and bv.
> kspsolve(A,bu);
> kspsolve(A,bv);
> change the values of matrix A and bu, bv,
> kspsolve(A,bu);
> kspsolve(A,bv);
>
> The first and second calls to KSPSolve use the same preconditioner,
> but which preconditioner do the third and fourth calls use, since the
> values of the matrix A have changed?

The first call is redundant, and the subsequent solves use the same
preconditioner, as you asked.

   Matt

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Sat Aug 24 05:19:07 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 24 Aug 2013 05:19:07 -0500
Subject: [petsc-users] A problem with sizeof grid
In-Reply-To: <7d8f01.392c.140af51aa5f.Coremail.zyzhang@nuaa.edu.cn>
References: <7d8f01.392c.140af51aa5f.Coremail.zyzhang@nuaa.edu.cn>
Message-ID: 

On Sat, Aug 24, 2013 at 2:54 AM, Zhang wrote:

> Hi,
>
> I ran my 3D structured CFD code for a cavity flow with different grid
> sizes from 20x20x20 to 80x80x80. Whether fast or not, they all converged
> and gave me the expected result.
>
> But when I tried a 120x120x120 grid, the errors below appeared. Since I
> believe my physical memory is large enough for these tests, I just wonder
> what really happened with the GMRES solver + hypre PC. I hope you may
> leave me a hint. Thanks.

This is a question for the Hypre list. It looks like out of memory errors.

   Matt

> Zhenyu
> mpirun -np 8 ./nsproj ../input/sample_120.zzy.dat -px 2 -py 2 -pz 2
> -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2
>
> [quoted HYPRE error trace trimmed; it is identical to the output in the
> original message above]
>
> --------------------------------------------------------------------------
> mpirun noticed that the job aborted, but has no info as to the process
> that caused that situation.
> --------------------------------------------------------------------------

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Sat Aug 24 05:20:00 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 24 Aug 2013 05:20:00 -0500
Subject: [petsc-users] which is the best PC for GMRES Poisson solver?
In-Reply-To: <111453d.38c8.140aeb87e7e.Coremail.zyzhang@nuaa.edu.cn>
References: <111453d.38c8.140aeb87e7e.Coremail.zyzhang@nuaa.edu.cn>
Message-ID: 

On Sat, Aug 24, 2013 at 12:07 AM, Zhang wrote:

> Hi,
>
> Recently I wrote a code of the projection method for solving the
> incompressible flow. It includes a Poisson solve for the pressure.
> I ported the code to PETSc.
>

Use -pc_type gamg

   Matt
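[Editor's note: an illustrative invocation combining the suggestion above with the user's own command line from earlier in the thread; the executable, input file, and tolerance are the user's, and -pc_type gamg is simply substituted for hypre.]

mpirun -np 8 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -pc_type gamg -ksp_rtol 1.e-2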
> > Then I tried a parallel comparison, as follows, > > > > ############################################################################################# > # with Petsc-3.4.2, time comparison (sec) > # size (80,80,80), 200 steps, dt=0.002 > #debug version 177.695939 sec > #opt version 106.694733 sec > #mpirun -np 8 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -p > c_type hypre -ksp_rtol 1.e-2 > #debug version 514.718544 sec > #opt version 331.114555 sec > #mpirun -np 12 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres > -pc_type hypre -ksp_rtol 1.e-2 > #debug version 796.765428 sec > #opt version 686.151788 sec > #mpirun -np 16 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres > -pc_type hypre -ksp_rtol 1.e-2 > > I do know sometimes problem with speed is not due to petsc, but my own > code, so please you are welcome for any suggestion > about which combination is the best for such a computation. I know solving > Poiison and Helmholtz is so common to see in numerical work. > Thank you first. > > BTW, I also tried to use superLU_dist as PC. > #mpirun -np 16 ./nsproj ../input/sample_20.zzy.dat -ksp_type gmres > -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_rtol 1.e-2 > > But with 16 nodes, except case of 20x20x20 grids, all larger grids run e > xtremly slow. > Since I never use the direct PC before, is it true that a good usage of > direct LU as Preconditioner > requires that the amount of procedures be much larger so that for each > node the calculation of direct solver is smalled enough to use it? > > Cheers, > > Zhenyu > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From ztdepyahoo at 163.com Sat Aug 24 06:48:14 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Sat, 24 Aug 2013 19:48:14 +0800 (CST) Subject: [petsc-users] use kspsolve repeately In-Reply-To: References: <24845df7.17887.140afc15fbf.Coremail.ztdepyahoo@163.com> Message-ID: <245db206.1829d.140b027348d.Coremail.ztdepyahoo@163.com> thank you . but the matrix element has been changed in the third and fourth call. how to re calculate the preconditioner. ? 2013-08-24 18:18:17?"Matthew Knepley" ??? On Sat, Aug 24, 2013 at 4:57 AM, ??? wrote: in my code, i need to use kspsolve in the following way KSP ksp; KSPCreate(PETSC_COMM_WORLD,&ksp); KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN); KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER); KSPSetInitialGuessNonzero(ksp,PETSC_TRUE) KSPSetType(ksp,KSPBCGS); KSPSetFromOptions(ksp); set the matrix A value and right hand side bu and bv. kspsolve(A,bu); kspsolve(A,bv); change the value of matrix A and bu bv, kspsolve(A,bu); kspsolve(A,bv); the first and second call to the kspsolve use the same preconditioner. but which preconditoner does the third and fourth call to the kspsolve since the value of the matrix A has changed. The first call is redundant, and the subsequent solves use the same preconditioner, as you asked. Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From knepley at gmail.com  Sat Aug 24 07:00:40 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sat, 24 Aug 2013 07:00:40 -0500
Subject: [petsc-users] use kspsolve repeately
In-Reply-To: <245db206.1829d.140b027348d.Coremail.ztdepyahoo@163.com>
References: <24845df7.17887.140afc15fbf.Coremail.ztdepyahoo@163.com> <245db206.1829d.140b027348d.Coremail.ztdepyahoo@163.com>
Message-ID: 

On Sat, Aug 24, 2013 at 6:48 AM, ??? wrote:

> Thank you. But the matrix elements have been changed before the third and
> fourth calls; how do I get the preconditioner recalculated?

Don't give SAME_PRECONDITIONER, and read the KSP section of the manual.

   Matt

> [remainder of quoted exchange trimmed; see the messages above]

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
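[Editor's note: a minimal sketch of the pattern Matt points at, assuming PETSc 3.4. ksp, A, bu, and bv are the names from the original question; xu and xv are illustrative solution vectors. Passing SAME_NONZERO_PATTERN, rather than SAME_PRECONDITIONER, after the values of A change tells the KSP to rebuild the preconditioner on the next solve. This is a fragment meant to follow the setup code in the question, not a complete program.]

/* ... after changing the values in A, reassemble it ... */
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
/* tell the KSP the operator changed but kept its nonzero pattern,
   so the preconditioner is recomputed on the next KSPSolve() */
KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
KSPSolve(ksp, bu, xu);   /* third solve: builds a new preconditioner */
KSPSolve(ksp, bv, xv);   /* fourth solve: reuses that new preconditioner */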
save the file run the same configure command you ran before but this time WITHOUT --with-debugging=0 then run make to build the PETSc libraries then recompile your program run your program as before. When it hits the hypre error hypre should print some messages about exactly where it hit the problem. Send them to us. You can also run the program under valgrind and see if it reports any memory corruption errors. http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind Barry Thanks for the report. Because of it I will add the hypre ability to print the error messages to PETSc's debug build to make it easier to track down problems like this. > > Matt > > Zhenyu > mpirun -np 8 ./nsproj ../input/sample_120.zzy.dat -px 2 -py 2 -pz 2 -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2 > > > > ....GRID POINT DIMENSIONS W/ GHOST CELLS : 122 X 122 X 122 > px= 2, py= 2, pz= 2 > ....BEGINNING SIMULATION > time: 0, SolvePressure iter= 0, residnorm= 0.000000e+00 > time: 20, SolvePressure iter= 2, residnorm= 1.329163e+21 > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Error in external library! > [0]PETSC ERROR : Error in HYPRE solver, error code 1! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 > [0]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib > [0]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 > [1]PETSC ERROR: --------------------- Error Message ------------------------------------ > [1]PETSC ERROR: Error in external library! > [1]PETSC ERROR: Error in HYPRE solver, error code 1! > [1]PETSC ERROR: ------------------------------------------------- ----------------------- > [1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [1]PETSC ERROR: See docs/changes/index.html for recent updates. > [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [1]PETSC ERROR: See docs/index.html for manual pages. 
> [1]PETSC ERROR: ------------------------------------------------------------------------ > [1]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 > [1]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib > [1]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 > [1]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download -chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist > [1]PETSC ERROR: ------------------------------------------------------------------------ > [1]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c > [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message ------------------------------------ > [2]PETSC ERROR: Error in external library! > [2]PETSC ERROR: Error in HYPRE solver, error code 1! > [2]PETSC ERROR: ------------------------------------------------------------------------ > [2]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [2]PETSC ERROR: See docs/changes/index.html for recent updates. > [2]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [2]PETSC ERROR: See docs/index.html for manual pages. > [2]PETSC ERROR: ------------------------------------------------- ----------------------- > [2]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 > [2]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib > [2]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 > [2]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist > [2]PETSC ERROR: ------------------------------------------------------------------------ > [2]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hyp re/hypre.c > [2]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c > [2]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h > [2]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c > [3]PETSC ERROR: --------------------- Error Message ------------------------------------ > [3]PETSC ERROR: Error in external library! > [3]PETSC ERROR: Error in HYPRE solver, error code 1! > [3]PETSC ERROR: ------------------------------------------------------------------------ > [3]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [3]PETSC ERROR: See docs/changes/index.html for recent updates. > [3]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [3]PETSC ERROR: See docs/index.html for manual pages. 
> [3]PETSC ERROR: ------------------------------------------------------------------------ > [3]PETSC ERROR: ./nsproj on a arch- linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 > [3]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib > [3]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 > [3]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist > [3]PETSC ERROR: ------------------------------------------------------------------------ > [3]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c > [3]PETSC ERROR: PCApply() line 442 in /home/zhen yu/petsc-3.4.2/src/ksp/pc/interface/precon.c > [3]PETSC ERROR: [4]PETSC ERROR: --------------------- Error Message ------------------------------------ > [4]PETSC ERROR: Error in external library! > [4]PETSC ERROR: Error in HYPRE solver, error code 1! > [4]PETSC ERROR: ------------------------------------------------------------------------ > [4]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [4]PETSC ERROR: See docs/changes/index.html for recent updates. > [4]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [4]PETSC ERROR: See docs/index.html for manual pages. > [4]PETSC ERROR: ------------------------------------------------------------------------ > [4]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 > [4]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib > [4]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 > [4]PETSC ERROR: Configure options --with-de bugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist > [4]PETSC ERROR: ------------------------------------------------------------------------ > [4]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c > [4]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c > [4]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h > [4]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c[4]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c > [4]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [4]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc > [4]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc > [5]PETSC ERROR: --------------------- Error 
Message ------------------------------------ > [5]PETSC ERROR: Error in external library! > [5]PETSC ERROR: Error in HYPRE solver, error code 1! > [5]PETSC ERROR: ------------------------------------------------------------------------ > [5]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [5]PETSC ERROR: See docs/changes/index.html for recent updates. > [5]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [5]PETSC ERROR: See docs/index.html for manual pages. > [5]P ETSC ERROR: ------------------------------------------------------------------------ > [5]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 > [5]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib > [5]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 > [5]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist > [5]PETSC ERROR: ------------------------------------------------------------------------ > [5]PETSC ERROR: PCApply_HYPR E() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c > [5]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c > [5]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h > [5]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c > [5]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c > [5]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [5]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc > [5]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc > [6]PETSC ERROR: --------------------- Error Message ------------------------------------ > [6]PETSC ERROR: Error in external library! > [6]PETSC ERROR: Error in HYPRE so lver, error code 1! > [6]PETSC ERROR: ------------------------------------------------------------------------ > [6]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [6]PETSC ERROR: See docs/changes/index.html for recent updates. > [6]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [6]PETSC ERROR: See docs/index.html for manual pages. 
> [6]PETSC ERROR: ------------------------------------------------------------------------ > [6]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 > [6]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib > [6]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 > [6]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve =1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist > [6]PETSC ERROR: ------------------------------------------------------------------------ > [6]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c > [6]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c > [6]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h > [6]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c > [6]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c > [6]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [7]PETSC ERROR: -------------- ------- Error Message ------------------------------------ > [7]PETSC ERROR: Error in external library! > [7]PETSC ERROR: Error in HYPRE solver, error code 1! > [7]PETSC ERROR: ------------------------------------------------------------------------ > [7]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 > [7]PETSC ERROR: See docs/changes/index.html for recent updates. > [7]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [7]PETSC ERROR: See docs/index.html for manual pages. 
> [7]PETSC ERROR: ------------------------------------------------------------------------ > [7]PETSC ERROR: ./nsproj on a arch-linux2-c-opt named amax by zhenyu Sat Aug 24 16:05:05 2013 > [7]PETSC ERROR: Libraries linked from /home/zhenyu/petsc-3.4.2/arch-linux2-c-opt/lib > [7]PETSC ERROR: Configure run at Sat Aug 24 12:43:12 2013 > [7]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir =/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist > [7]PETSC ERROR: ------------------------------------------------------------------------ > [7]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c > [7]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c > [7]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h > [7]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c > [7]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmre s/gmres.c > [7]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [7]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc > [7]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc > [0]PETSC ERROR: Configure options --with-debugging=0 --with-shared-libraries=1 --with-dynamic-loading=1 --with-x=1 --with-blas-lapack-dir=/usr/lib/lapack --with-valgrind=1 --with-cc=gcc --with-fc=gfortran --with-clanguage=C++ --with-c++-support=1 --with-sieve=1 --with-opt-sieve=1 --download-generator --download-triangle --download-ctetgen --with-ctetgen --download-chaco --download-boost=1 --download-hypre=1 --download-metis --download-parmetis --download-hdf5 --download-openmpi --download-f-blas-lapack --download-superlu_dist > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: PCApply_HYPRE() line 170 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/impls/hypre/hypre.c > [0]PETSC ERROR: PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h > [0]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c > [0]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c > [0]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc > [0]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc > PCApply() line 442 in /home/zhenyu/petsc-3.4.2/src/ksp/pc/interface/precon.c > [1]PETSC ERROR: KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/pet sc-private/kspimpl.h > [1]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c > 
[1]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c > [1]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc > [1]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc > [2]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c > [2]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [2]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc > [2]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/ nsproj_petsc.cc > KSP_PCApply() line 227 in /home/zhenyu/petsc-3.4.2/include/petsc-private/kspimpl.h > [3]PETSC ERROR: KSPInitialResidual() line 64 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itres.c > [3]PETSC ERROR: KSPSolve_GMRES() line 239 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/impls/gmres/gmres.c > [3]PETSC ERROR: KSPSolve() line 441 in /home/zhenyu/petsc-3.4.2/src/ksp/ksp/interface/itfunc.c > [3]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc > [3]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc > [6]PETSC ERROR: SolvePressure() line 898 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nssolver_petsc.cc > [6]PETSC ERROR: start_sim() line 190 in "unknowndirectory/"/home/zhenyu/work/nsproj_0.3_zzy/src/nsproj_petsc.cc > -------------------------------------------------------------------------- > mpirun noticed that the j ob aborted, but has no info as to the process > that caused that situation. > -------------------------------------------------------------------------- > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener From jedbrown at mcs.anl.gov Sat Aug 24 12:17:56 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sat, 24 Aug 2013 12:17:56 -0500 Subject: [petsc-users] how to use the Euclid preconditioner from Hypre in the petsc In-Reply-To: <1431bf84.16a4e.140af6be40a.Coremail.ztdepyahoo@163.com> References: <1431bf84.16a4e.140af6be40a.Coremail.ztdepyahoo@163.com> Message-ID: <8738pzhut7.fsf@mcs.anl.gov> ??? writes: > I configured the petsc without the hypre download. i want to download > and install the hypre seperately. could you please tell me how to use > the euclid precontioner from hypre in petsc. -pc_type hypre -pc_hypre_type euclid As Matt says, use -help to see more options. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From mpovolot at purdue.edu Sun Aug 25 07:38:39 2013 From: mpovolot at purdue.edu (Michael Povolotskyi) Date: Sun, 25 Aug 2013 08:38:39 -0400 Subject: [petsc-users] Mat Destroy In-Reply-To: <87fvu0htca.fsf@mcs.anl.gov> References: <5217D400.5090100@purdue.edu> <87fvu0htca.fsf@mcs.anl.gov> Message-ID: <5219FACF.7020203@purdue.edu> On 8/23/2013 7:37 PM, Jed Brown wrote: > Barry Smith writes: >> We call free() at that point. 
>> But note that in Unix this does not
>> mean the memory is returned to the operating system, so you will not
>> see the process memory go down. If you then allocate new objects
>> they will reuse this memory.
> Also note that MatDestroy only releases a reference, so if another
> object still holds a reference to your matrix, nothing will be freed.
Thank you!
Just to clarify: if the code reads like this:

Mat A;
MatCreate(MPI_COMM_WORLD, &A);

Mat B = A;
....
MatDestroy(&B);

will the free() function be called for the memory that contains the matrix data in this case?
Thank you,
Michael.

From knepley at gmail.com Sun Aug 25 07:41:48 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sun, 25 Aug 2013 07:41:48 -0500
Subject: [petsc-users] Mat Destroy
In-Reply-To: <5219FACF.7020203@purdue.edu>
References: <5217D400.5090100@purdue.edu> <87fvu0htca.fsf@mcs.anl.gov> <5219FACF.7020203@purdue.edu>
Message-ID:

On Sun, Aug 25, 2013 at 7:38 AM, Michael Povolotskyi wrote:

> On 8/23/2013 7:37 PM, Jed Brown wrote:
>> Barry Smith writes:
>>> We call free() at that point. But note that in Unix this does not
>>> mean the memory is returned to the operating system, so you will not
>>> see the process memory go down. If you then allocate new objects
>>> they will reuse this memory.
>> Also note that MatDestroy only releases a reference, so if another
>> object still holds a reference to your matrix, nothing will be freed.
> Thank you!
> Just to clarify: if the code reads like this:
>
> Mat A;
> MatCreate(MPI_COMM_WORLD, &A);
>
> Mat B = A;
> ....
> MatDestroy(&B);
>
> will the free() function be called for the memory that contains the matrix
> data in this case?

Yes, it will. You might want

Mat B = A;
PetscObjectReference((PetscObject) B);

so that you also need

MatDestroy(&B);

Matt

> Thank you,
> Michael.

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
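Spelled out, the reference-counting pattern Matt describes would look something like the sketch below (error checking omitted; the comments describe standard PETSc reference semantics):

Mat A, B;
MatCreate(PETSC_COMM_WORLD, &A);        /* the Mat starts with a reference count of 1 */
B = A;                                  /* a plain pointer copy; the count is still 1 */
PetscObjectReference((PetscObject) B);  /* count becomes 2: B now co-owns the object */
MatDestroy(&B);                         /* count drops back to 1; nothing is freed yet */
MatDestroy(&A);                         /* count reaches 0, and free() is finally called */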
From fangxingjun0319 at gmail.com Sun Aug 25 15:30:16 2013
From: fangxingjun0319 at gmail.com (Frank)
Date: Sun, 25 Aug 2013 15:30:16 -0500
Subject: [petsc-users] Weird memory leakage
Message-ID: <521A6958.2040103@gmail.com>

Hi,
I have very weird problem here.
I am using FORTRAN to call PETSc to solve Poisson equation.
When I run my code with 8 cores, it works fine, and the consumed memory does not increase. However, when it is run with 64 cores, first of all it gives lots of error like this:

[n310:18951] [[62652,0],2] -> [[62652,0],10] (node: n219) oob-tcp:
Number of attempts to create TCP connection has been exceeded. Can not
communicate with peer
[n310:18951] [[62652,0],2] -> [[62652,0],18] (node: n128) oob-tcp:
Number of attempts to create TCP connection has been exceeded. Can not
communicate with peer
[n310:18951] [[62652,0],2] -> [[62652,0],34] (node: n089) oob-tcp:
Number of attempts to create TCP connection has been exceeded.
Can not communicate with peer [n310:18951] [[62652,0],2] ORTED_CMD_PROCESSOR: STUCK IN INFINITE LOOP - ABORTING [n310:18951] *** Process received signal *** [n310:18951] Signal: Aborted (6) [n310:18951] Signal code: (-6) [n310:18951] [ 0] /lib64/libpthread.so.0() [0x35b120f500] [n310:18951] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x35b0e328a5] [n310:18951] [ 2] /lib64/libc.so.6(abort+0x175) [0x35b0e34085] [n310:18951] [ 3] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_daemon_cmd_processor+0x243) [0x2ae5e02f0813] [n310:18951] [ 4] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_base_loop+0x31a) [0x2ae5e032f56a] [n310:18951] [ 5] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_loop+0x12) [0x2ae5e032f242] [n310:18951] [ 6] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_progress+0x5c) [0x2ae5e031845c] [n310:18951] [ 7] /global/software/openmpi-1.6.1-intel1/lib/openmpi/mca_grpcomm_bad.so(+0x1bd7) [0x2ae5e28debd7] [n310:18951] [ 8] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_ess_base_orted_finalize+0x1e) [0x2ae5e02f431e] [n310:18951] [ 9] /global/software/openmpi-1.6.1-intel1/lib/openmpi/mca_ess_tm.so(+0x1294) [0x2ae5e1ab1294] [n310:18951] [10] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_finalize+0x4e) [0x2ae5e02d0fbe] [n310:18951] [11] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(+0x4840b) [0x2ae5e02f040b] [n310:18951] [12] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_base_loop+0x31a) [0x2ae5e032f56a] [n310:18951] [13] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_loop+0x12) [0x2ae5e032f242] [n310:18951] [14] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_progress+0x5c) [0x2ae5e031845c] [n310:18951] [15] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_trigger_event+0x50) [0x2ae5e02dc930] [n310:18951] [16] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(+0x4916f) [0x2ae5e02f116f] [n310:18951] [17] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_daemon_cmd_processor+0x149) [0x2ae5e02f0719] [n310:18951] [18] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_base_loop+0x31a) [0x2ae5e032f56a] [n310:18951] [19] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_loop+0x12) [0x2ae5e032f242] [n310:18951] [20] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_dispatch+0x8) [0x2ae5e032f228] [n310:18951] [21] /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_daemon+0x9f0) [0x2ae5e02ef8a0] [n310:18951] [22] orted(main+0x88) [0x4024d8] [n310:18951] [23] /lib64/libc.so.6(__libc_start_main+0xfd) [0x35b0e1ecdd] [n310:18951] [24] orted() [0x402389] [n310:18951] *** End of error message *** but the program still gives the right result for a short period. After that, it suddenly stopped because memory exceeds some limit. I don't understand this. If there is memory leakage in my code, how come it can work with 8 cores? Please help me.Thank you so much! Sincerely Xingjun From mfadams at lbl.gov Sun Aug 25 16:46:47 2013 From: mfadams at lbl.gov (Mark F. Adams) Date: Sun, 25 Aug 2013 17:46:47 -0400 Subject: [petsc-users] Weird memory leakage In-Reply-To: <521A6958.2040103@gmail.com> References: <521A6958.2040103@gmail.com> Message-ID: <0CF173D5-4D42-4E0F-8569-4497E8151D72@lbl.gov> On Aug 25, 2013, at 4:30 PM, Frank wrote: > Hi, > I have very weird problem here. 
> I am using FORTRAN to call PETSc to solve Poisson equation. > When I run my code with 8 cores, it works fine, and the consumed memory does not increase. However, when it is run with 64 cores, first of all it gives lots of error like this: > > [n310:18951] [[62652,0],2] -> [[62652,0],10] (node: n219) oob-tcp: > Number of attempts to create TCP connection has been exceeded. Can not > communicate with peer > [n310:18951] [[62652,0],2] -> [[62652,0],18] (node: n128) oob-tcp: > Number of attempts to create TCP connection has been exceeded. Can not > communicate with peer > [n310:18951] [[62652,0],2] -> [[62652,0],34] (node: n089) oob-tcp: > Number of attempts to create TCP connection has been exceeded. Can not > communicate with peer > [n310:18951] [[62652,0],2] ORTED_CMD_PROCESSOR: STUCK IN INFINITE LOOP - > ABORTING I don't know where you are getting "memory" errors but this looks like a pretty fatal error. Unless someone recognizes something else I'd look at this in a debugger and see where this is happening. See if its deterministic or not. And if it is see what code is killing it. Mark > [n310:18951] *** Process received signal *** > [n310:18951] Signal: Aborted (6) > [n310:18951] Signal code: (-6) > [n310:18951] [ 0] /lib64/libpthread.so.0() [0x35b120f500] > [n310:18951] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x35b0e328a5] > [n310:18951] [ 2] /lib64/libc.so.6(abort+0x175) [0x35b0e34085] > [n310:18951] [ 3] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_daemon_cmd_processor+0x243) > [0x2ae5e02f0813] > [n310:18951] [ 4] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_base_loop+0x31a) > [0x2ae5e032f56a] > [n310:18951] [ 5] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_loop+0x12) > [0x2ae5e032f242] > [n310:18951] [ 6] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_progress+0x5c) > [0x2ae5e031845c] > [n310:18951] [ 7] > /global/software/openmpi-1.6.1-intel1/lib/openmpi/mca_grpcomm_bad.so(+0x1bd7) > [0x2ae5e28debd7] > [n310:18951] [ 8] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_ess_base_orted_finalize+0x1e) > [0x2ae5e02f431e] > [n310:18951] [ 9] > /global/software/openmpi-1.6.1-intel1/lib/openmpi/mca_ess_tm.so(+0x1294) > [0x2ae5e1ab1294] > [n310:18951] [10] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_finalize+0x4e) > [0x2ae5e02d0fbe] > [n310:18951] [11] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(+0x4840b) > [0x2ae5e02f040b] > [n310:18951] [12] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_base_loop+0x31a) > [0x2ae5e032f56a] > [n310:18951] [13] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_loop+0x12) > [0x2ae5e032f242] > [n310:18951] [14] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_progress+0x5c) > [0x2ae5e031845c] > [n310:18951] [15] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_trigger_event+0x50) > [0x2ae5e02dc930] > [n310:18951] [16] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(+0x4916f) > [0x2ae5e02f116f] > [n310:18951] [17] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_daemon_cmd_processor+0x149) > [0x2ae5e02f0719] > [n310:18951] [18] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_base_loop+0x31a) > [0x2ae5e032f56a] > [n310:18951] [19] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_loop+0x12) > [0x2ae5e032f242] > [n310:18951] [20] > 
/global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(opal_event_dispatch+0x8) > [0x2ae5e032f228] > [n310:18951] [21] > /global/software/openmpi-1.6.1-intel1/lib/libopen-rte.so.4(orte_daemon+0x9f0) > [0x2ae5e02ef8a0] > [n310:18951] [22] orted(main+0x88) [0x4024d8] > [n310:18951] [23] /lib64/libc.so.6(__libc_start_main+0xfd) [0x35b0e1ecdd] > [n310:18951] [24] orted() [0x402389] > [n310:18951] *** End of error message *** > > but the program still gives the right result for a short period. After that, it suddenly stopped because memory exceeds some limit. I don't understand this. If there is memory leakage in my code, how come it can work with 8 cores? Please help me.Thank you so much! > > Sincerely > Xingjun > > From knepley at gmail.com Sun Aug 25 17:48:40 2013 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 25 Aug 2013 17:48:40 -0500 Subject: [petsc-users] Weird memory leakage In-Reply-To: <521A6958.2040103@gmail.com> References: <521A6958.2040103@gmail.com> Message-ID: On Sun, Aug 25, 2013 at 3:30 PM, Frank wrote: > Hi, > I have very weird problem here. > I am using FORTRAN to call PETSc to solve Poisson equation. > When I run my code with 8 cores, it works fine, and the consumed memory > does not increase. However, when it is run with 64 cores, first of all it > gives lots of error like this: > > [n310:18951] [[62652,0],2] -> [[62652,0],10] (node: n219) oob-tcp: > Number of attempts to create TCP connection has been exceeded. Can not > communicate with peer > [n310:18951] [[62652,0],2] -> [[62652,0],18] (node: n128) oob-tcp: > Number of attempts to create TCP connection has been exceeded. Can not > communicate with peer > [n310:18951] [[62652,0],2] -> [[62652,0],34] (node: n089) oob-tcp: > Number of attempts to create TCP connection has been exceeded. 
Can not > communicate with peer > [n310:18951] [[62652,0],2] ORTED_CMD_PROCESSOR: STUCK IN INFINITE LOOP - > ABORTING > [n310:18951] *** Process received signal *** > [n310:18951] Signal: Aborted (6) > [n310:18951] Signal code: (-6) > [n310:18951] [ 0] /lib64/libpthread.so.0() [0x35b120f500] > [n310:18951] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x35b0e328a5] > [n310:18951] [ 2] /lib64/libc.so.6(abort+0x175) [0x35b0e34085] > [n310:18951] [ 3] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > orte_daemon_cmd_processor+**0x243) > [0x2ae5e02f0813] > [n310:18951] [ 4] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_event_base_loop+0x31a) > [0x2ae5e032f56a] > [n310:18951] [ 5] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_event_loop+0x12) > [0x2ae5e032f242] > [n310:18951] [ 6] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_progress+0x5c) > [0x2ae5e031845c] > [n310:18951] [ 7] > /global/software/openmpi-1.6.**1-intel1/lib/openmpi/mca_** > grpcomm_bad.so(+0x1bd7) > [0x2ae5e28debd7] > [n310:18951] [ 8] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > orte_ess_base_orted_finalize+**0x1e) > [0x2ae5e02f431e] > [n310:18951] [ 9] > /global/software/openmpi-1.6.**1-intel1/lib/openmpi/mca_ess_** > tm.so(+0x1294) > [0x2ae5e1ab1294] > [n310:18951] [10] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > orte_finalize+0x4e) > [0x2ae5e02d0fbe] > [n310:18951] [11] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(**+0x4840b) > [0x2ae5e02f040b] > [n310:18951] [12] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_event_base_loop+0x31a) > [0x2ae5e032f56a] > [n310:18951] [13] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_event_loop+0x12) > [0x2ae5e032f242] > [n310:18951] [14] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_progress+0x5c) > [0x2ae5e031845c] > [n310:18951] [15] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > orte_trigger_event+0x50) > [0x2ae5e02dc930] > [n310:18951] [16] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(**+0x4916f) > [0x2ae5e02f116f] > [n310:18951] [17] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > orte_daemon_cmd_processor+**0x149) > [0x2ae5e02f0719] > [n310:18951] [18] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_event_base_loop+0x31a) > [0x2ae5e032f56a] > [n310:18951] [19] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_event_loop+0x12) > [0x2ae5e032f242] > [n310:18951] [20] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > opal_event_dispatch+0x8) > [0x2ae5e032f228] > [n310:18951] [21] > /global/software/openmpi-1.6.**1-intel1/lib/libopen-rte.so.4(** > orte_daemon+0x9f0) > [0x2ae5e02ef8a0] > [n310:18951] [22] orted(main+0x88) [0x4024d8] > [n310:18951] [23] /lib64/libc.so.6(__libc_start_**main+0xfd) > [0x35b0e1ecdd] > [n310:18951] [24] orted() [0x402389] > [n310:18951] *** End of error message *** > > but the program still gives the right result for a short period. After > that, it suddenly stopped because memory exceeds some limit. I don't > understand this. If there is memory leakage in my code, how come it can > work with 8 cores? Please help me.Thank you so much! > All of the errors are OpenMPI errors. The first thing to do is track down why they are happening. 
I think your only option here is to get the system administrator on your machine to help. Since you have MPI errors, any number of weird things could be happening, like your job launching on many fewer than 64 nodes (as the error says some could not be contacted), accounting for memory running out. Thanks, Matt > Sincerely > Xingjun > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From zyzhang at nuaa.edu.cn Mon Aug 26 02:29:57 2013 From: zyzhang at nuaa.edu.cn (Zhang) Date: Mon, 26 Aug 2013 15:29:57 +0800 (GMT+08:00) Subject: [petsc-users] which is the best PC for GMRES Poisson solver? In-Reply-To: References: <111453d.38c8.140aeb87e7e.Coremail.zyzhang@nuaa.edu.cn> Message-ID: Dear Matt, I tried with gamg, sor and hypre. Here is the result # grid 120x120x120, 200 physical time steps gamg 428.727438 sec hypre 477.186299 sec sor 1296.237177 sec Petsc-3.4.2 opt version Cheers, Zhenyu -----????----- ???: "Matthew Knepley" ????: 2013-08-24 18:20:00 (???) ???: Zhang ??: petsc-users at mcs.anl.gov ??: Re: [petsc-users] which is the best PC for GMRES Poisson solver? On Sat, Aug 24, 2013 at 12:07 AM, Zhang wrote: Hi, Recently I wrote a code of projection method for solving the incompressile flow. It includes a poisson solution of pressure. I transfered the code to be based on Petsc . Use -pc_type gamg Matt However, in the case of 3D lid-driven flow case, the speed of petsc version is not advantageous yet. I tried different combination of preconditioner with GMRES solver. Among them GMRES+PCILU or GMRES+SOR are both the fastest. For a grid 80x80x80, GMRES+SOR serial version used 185.816307 secs. However, for case 120x120x120, it diverged. So is GMRES+PCILU. Then I tried a parallel comparison, as follows, ############################################################################################# # with Petsc-3.4.2, time comparison (sec) # size (80,80,80), 200 steps, dt=0.002 #debug version 177.695939 sec #opt version 106.694733 sec #mpirun -np 8 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -p c_type hypre -ksp_rtol 1.e-2 #debug version 514.718544 sec #opt version 331.114555 sec #mpirun -np 12 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2 #debug version 796.765428 sec #opt version 686.151788 sec #mpirun -np 16 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2 I do know sometimes problem with speed is not due to petsc, but my own code, so please you are welcome for any suggestion about which combination is the best for such a computation. I know solving Poiison and Helmholtz is so common to see in numerical work. Thank you first. BTW, I also tried to use superLU_dist as PC. #mpirun -np 16 ./nsproj ../input/sample_20.zzy.dat -ksp_type gmres -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_rtol 1.e-2 But with 16 nodes, except case of 20x20x20 grids, all larger grids run e xtremly slow. Since I never use the direct PC before, is it true that a good usage of direct LU as Preconditioner requires that the amount of procedures be much larger so that for each node the calculation of direct solver is smalled enough to use it? Cheers, Zhenyu -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From zyzhang at nuaa.edu.cn Mon Aug 26 02:33:22 2013 From: zyzhang at nuaa.edu.cn (Zhang) Date: Mon, 26 Aug 2013 15:33:22 +0800 (GMT+08:00) Subject: [petsc-users] which is the best PC for GMRES Poisson solver? In-Reply-To: References: <111453d.38c8.140aeb87e7e.Coremail.zyzhang@nuaa.edu.cn> Message-ID: <7d89d3.3df4.140b98a98a4.Coremail.zyzhang@nuaa.edu.cn> Hi, Matt BTW, Could you leave me any suggestion of those direct solver PC, such as superLU_dist? Thanks Zhenyu -----????----- ???: "Matthew Knepley" ????: 2013-08-24 18:20:00 (???) ???: Zhang ??: petsc-users at mcs.anl.gov ??: Re: [petsc-users] which is the best PC for GMRES Poisson solver? On Sat, Aug 24, 2013 at 12:07 AM, Zhang wrote: Hi, Recently I wrote a code of projection method for solving the incompressile flow. It includes a poisson solution of pressure. I transfered the code to be based on Petsc . Use -pc_type gamg Matt However, in the case of 3D lid-driven flow case, the speed of petsc version is not advantageous yet. I tried different combination of preconditioner with GMRES solver. Among them GMRES+PCILU or GMRES+SOR are both the fastest. For a grid 80x80x80, GMRES+SOR serial version used 185.816307 secs. However, for case 120x120x120, it diverged. So is GMRES+PCILU. Then I tried a parallel comparison, as follows, ############################################################################################# # with Petsc-3.4.2, time comparison (sec) # size (80,80,80), 200 steps, dt=0.002 #debug version 177.695939 sec #opt version 106.694733 sec #mpirun -np 8 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -p c_type hypre -ksp_rtol 1.e-2 #debug version 514.718544 sec #opt version 331.114555 sec #mpirun -np 12 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2 #debug version 796.765428 sec #opt version 686.151788 sec #mpirun -np 16 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -pc_type hypre -ksp_rtol 1.e-2 I do know sometimes problem with speed is not due to petsc, but my own code, so please you are welcome for any suggestion about which combination is the best for such a computation. I know solving Poiison and Helmholtz is so common to see in numerical work. Thank you first. BTW, I also tried to use superLU_dist as PC. #mpirun -np 16 ./nsproj ../input/sample_20.zzy.dat -ksp_type gmres -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_rtol 1.e-2 But with 16 nodes, except case of 20x20x20 grids, all larger grids run e xtremly slow. Since I never use the direct PC before, is it true that a good usage of direct LU as Preconditioner requires that the amount of procedures be much larger so that for each node the calculation of direct solver is smalled enough to use it? Cheers, Zhenyu -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From ztdepyahoo at 163.com Mon Aug 26 04:31:32 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Mon, 26 Aug 2013 17:31:32 +0800 (CST) Subject: [petsc-users] how to know the local size of the grid after partition Message-ID: <2c10a448.11639.140b9f6c7b1.Coremail.ztdepyahoo@163.com> I partitioned the FVM Mesh into 4 parts. the isg and is contain the new goloba number and new process number of each local nodes. 
but could you please tell me how to know the local size of the grid and local range of the grid. -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Aug 26 06:00:41 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 26 Aug 2013 06:00:41 -0500 Subject: [petsc-users] which is the best PC for GMRES Poisson solver? In-Reply-To: <7d89d3.3df4.140b98a98a4.Coremail.zyzhang@nuaa.edu.cn> References: <111453d.38c8.140aeb87e7e.Coremail.zyzhang@nuaa.edu.cn> <7d89d3.3df4.140b98a98a4.Coremail.zyzhang@nuaa.edu.cn> Message-ID: On Mon, Aug 26, 2013 at 2:33 AM, Zhang wrote: > Hi, Matt > BTW, > > Could you leave me any suggestion of those direct solver PC, such as > superLU_dist? Thanks For simple Poisson, I do not expect direct methods to be competitive with MG, however you can easily try by configuring with --download-superlu and -pc_type lu -pc_factor_mat_solver_package superlu Matt > Zhenyu > > > -----????----- > *???:* "Matthew Knepley" > *????:* 2013-08-24 18:20:00 (???) > *???:* Zhang > *??:* petsc-users at mcs.anl.gov > *??:* Re: [petsc-users] which is the best PC for GMRES Poisson solver? > > On Sat, Aug 24, 2013 at 12:07 AM, Zhang wrote: > >> Hi, >> >> Recently I wrote a code of projection method for solving the >> incompressile flow. It includes a poisson solution of pressure. >> I transfered the code to be based on Petsc . >> > > Use -pc_type gamg > > Matt > > >> However, in the case of 3D lid-driven flow case, the speed of petsc >> version is not advantageous yet. >> >> I tried different combination of preconditioner with GMRES solver. Among >> them GMRES+PCILU or GMRES+SOR are both the fastest. >> For a grid 80x80x80, GMRES+SOR serial version used 185.816307 secs. >> However, for case 120x120x120, it diverged. So is GMRES+PCILU. >> >> Then I tried a parallel comparison, as follows, >> >> >> >> ############################################################################################# >> # with Petsc-3.4.2, time comparison (sec) >> # size (80,80,80), 200 steps, dt=0.002 >> #debug version 177.695939 sec >> #opt version 106.694733 sec >> #mpirun -np 8 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres -p >> c_type hypre -ksp_rtol 1.e-2 >> #debug version 514.718544 sec >> #opt version 331.114555 sec >> #mpirun -np 12 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres >> -pc_type hypre -ksp_rtol 1.e-2 >> #debug version 796.765428 sec >> #opt version 686.151788 sec >> #mpirun -np 16 ./nsproj ../input/sample_80.zzy.dat -ksp_type gmres >> -pc_type hypre -ksp_rtol 1.e-2 >> >> I do know sometimes problem with speed is not due to petsc, but my own >> code, so please you are welcome for any suggestion >> about which combination is the best for such a computation. I know >> solving Poiison and Helmholtz is so common to see in numerical work. >> Thank you first. >> >> BTW, I also tried to use superLU_dist as PC. >> #mpirun -np 16 ./nsproj ../input/sample_20.zzy.dat -ksp_type gmres >> -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_rtol 1.e-2 >> >> But with 16 nodes, except case of 20x20x20 grids, all larger grids run e >> xtremly slow. >> Since I never use the direct PC before, is it true that a good usage of >> direct LU as Preconditioner >> requires that the amount of procedures be much larger so that for each >> node the calculation of direct solver is smalled enough to use it? 
>> >> Cheers, >> >> Zhenyu >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Mon Aug 26 09:11:36 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 26 Aug 2013 09:11:36 -0500 Subject: [petsc-users] which is the best PC for GMRES Poisson solver? In-Reply-To: References: <111453d.38c8.140aeb87e7e.Coremail.zyzhang@nuaa.edu.cn> <7d89d3.3df4.140b98a98a4.Coremail.zyzhang@nuaa.edu.cn> Message-ID: <87r4dgczjb.fsf@mcs.anl.gov> Matthew Knepley writes: > On Mon, Aug 26, 2013 at 2:33 AM, Zhang wrote: > >> Hi, Matt >> BTW, >> >> Could you leave me any suggestion of those direct solver PC, such as >> superLU_dist? Thanks > > > For simple Poisson, I do not expect direct methods to be competitive with > MG, however you can > easily try by configuring with --download-superlu and -pc_type lu > -pc_factor_mat_solver_package superlu And by this, Matt means configuring with --download-superlu_dist (or --download-mumps) and running with -pc_type lu -pc_factor_mat_solver_package superlu_dist (or mumps). -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From jedbrown at mcs.anl.gov Mon Aug 26 13:58:05 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 26 Aug 2013 13:58:05 -0500 Subject: [petsc-users] how to know the local size of the grid after partition In-Reply-To: <2c10a448.11639.140b9f6c7b1.Coremail.ztdepyahoo@163.com> References: <2c10a448.11639.140b9f6c7b1.Coremail.ztdepyahoo@163.com> Message-ID: <87zjs4b7pe.fsf@mcs.anl.gov> ??? writes: > I partitioned the FVM Mesh into 4 parts. the isg and is contain the > new goloba number and new process number of each local nodes. but > could you please tell me how to know the local size of the grid and > local range of the grid. You haven't given enough context to know what question you are asking. Are you using DMPlex or rolling your own grid, for example? -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From ztdepyahoo at 163.com Mon Aug 26 18:55:54 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Tue, 27 Aug 2013 07:55:54 +0800 (CST) Subject: [petsc-users] how to know the local size of the grid after partition In-Reply-To: <87zjs4b7pe.fsf@mcs.anl.gov> References: <2c10a448.11639.140b9f6c7b1.Coremail.ztdepyahoo@163.com> <87zjs4b7pe.fsf@mcs.anl.gov> Message-ID: <5b470886.7c0.140bd0e20f2.Coremail.ztdepyahoo@163.com> i read in my own unstructured grid system, i have vertices coodinate, and connection table. ? 2013-08-27 02:58:05?"Jed Brown" ??? >??? writes: > >> I partitioned the FVM Mesh into 4 parts. the isg and is contain the >> new goloba number and new process number of each local nodes. but >> could you please tell me how to know the local size of the grid and >> local range of the grid. > >You haven't given enough context to know what question you are asking. >Are you using DMPlex or rolling your own grid, for example? 
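If the is above is the index set returned by MatPartitioningApply() (the target rank of each local node), and isg comes from ISPartitioningToNumbering(), then the new local sizes and ranges can be recovered with ISPartitioningCount(). A sketch, assuming exactly that usage:

PetscMPIInt size, rank;
PetscInt    *counts, rstart = 0, i;
MPI_Comm_size(PETSC_COMM_WORLD, &size);
MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
PetscMalloc(size*sizeof(PetscInt), &counts);
ISPartitioningCount(is, size, counts);   /* counts[r] = number of nodes assigned to rank r */
for (i = 0; i < rank; i++) rstart += counts[i];
/* this process will own counts[rank] nodes; in the contiguous numbering
   produced by ISPartitioningToNumbering() they are rstart..rstart+counts[rank]-1 */
PetscFree(counts);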
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From knepley at gmail.com Mon Aug 26 19:05:31 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 26 Aug 2013 19:05:31 -0500
Subject: [petsc-users] how to know the local size of the grid after partition
In-Reply-To: <5b470886.7c0.140bd0e20f2.Coremail.ztdepyahoo@163.com>
References: <2c10a448.11639.140b9f6c7b1.Coremail.ztdepyahoo@163.com> <87zjs4b7pe.fsf@mcs.anl.gov> <5b470886.7c0.140bd0e20f2.Coremail.ztdepyahoo@163.com>
Message-ID:

On Mon, Aug 26, 2013 at 6:55 PM, ??? wrote:

> I read in my own unstructured grid system; I have the vertex coordinates and
> a connection table.

We still do not understand what you are using in PETSc.

Matt

> At 2013-08-27 02:58:05, "Jed Brown" wrote:
> >??? writes:
> >> I partitioned the FVM Mesh into 4 parts. the isg and is contain the
> >> new global number and new process number of each local node. but
> >> could you please tell me how to know the local size of the grid and
> >> the local range of the grid.
> >You haven't given enough context to know what question you are asking.
> >Are you using DMPlex or rolling your own grid, for example?

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jtravs at gmail.com Tue Aug 27 08:46:51 2013
From: jtravs at gmail.com (John Travers)
Date: Tue, 27 Aug 2013 15:46:51 +0200
Subject: [petsc-users] error with slepc shift invert: New nonzero at (0, 0) caused a malloc!
Message-ID: <5F7A00DE-F4FF-4D36-A06D-0FFBA00F118B@gmail.com>

Hi all,

I'm trying to use shift and invert to find interior eigenvalues using SLEPc. I am using example 4 in slepc 3.4.1, which reads the matrix from a binary file ('A' saved from Matlab).
If I run just: ./ex4 -file A -eps_nev 1 the program runs fine, and correctly finds the largest eigenvalue (although much slower than Matlab's eigs). However, if I run: ./ex4 -file A -eps_nev 1 -st_type sinvert -st_shift 0.0,0.5 to try and find an eigenvalue near 0.0+0.5i, it outputs the error message below. Any help would be much appreciated! Best regards, John ================================================================== Eigenproblem stored in file. Reading COMPLEX matrix from a binary file... [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Argument out of range! [0]PETSC ERROR: New nonzero at (0,0) caused a malloc! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: /Users/john/Documents/code/downloaded/slepc-3.4.1/src/eps/examples/tutorials/ex4 on a arch-darwin-cxx-debug named mpl002075.mpl.mpg.de by john Tue Aug 27 15:43:45 2013 [0]PETSC ERROR: Libraries linked from /Users/john/Documents/code/downloaded/petsc-3.4.2/arch-darwin-cxx-debug/lib [0]PETSC ERROR: Configure run at Fri Aug 23 08:07:38 2013 [0]PETSC ERROR: Configure options --with-cc=gcc-mp-4.8 --with-fc=gfortran-mp-4.8 --with-cxx=g++ --download-f-blas-lapack --with-mpi=0 --with-scalar-type=complex --with-clanguage=cxx [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: MatSetValues_SeqAIJ() line 353 in /Users/john/Documents/code/downloaded/petsc-3.4.2/src/mat/impls/aij/seq/aij.c [0]PETSC ERROR: MatSetValues() line 1106 in /Users/john/Documents/code/downloaded/petsc-3.4.2/src/mat/interface/matrix.c [0]PETSC ERROR: MatShift() line 166 in /Users/john/Documents/code/downloaded/petsc-3.4.2/src/mat/utils/axpy.c [0]PETSC ERROR: STMatGAXPY_Private() line 376 in /Users/john/Documents/code/downloaded/slepc-3.4.1/src/st/interface/stsolve.c [0]PETSC ERROR: STSetUp_Sinvert() line 138 in /Users/john/Documents/code/downloaded/slepc-3.4.1/src/st/impls/sinvert/sinvert.c [0]PETSC ERROR: STSetUp() line 290 in /Users/john/Documents/code/downloaded/slepc-3.4.1/src/st/interface/stsolve.c [0]PETSC ERROR: EPSSetUp() line 215 in /Users/john/Documents/code/downloaded/slepc-3.4.1/src/eps/interface/setup.c [0]PETSC ERROR: EPSSolve() line 90 in /Users/john/Documents/code/downloaded/slepc-3.4.1/src/eps/interface/solve.c [0]PETSC ERROR: main() line 88 in src/eps/examples/tutorials/ex4.c Abort trap: 6 From jedbrown at mcs.anl.gov Tue Aug 27 09:18:51 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 27 Aug 2013 09:18:51 -0500 Subject: [petsc-users] error with slepc shift invert: New nonzero at (0, 0) caused a malloc! In-Reply-To: <5F7A00DE-F4FF-4D36-A06D-0FFBA00F118B@gmail.com> References: <5F7A00DE-F4FF-4D36-A06D-0FFBA00F118B@gmail.com> Message-ID: <87d2oz9pys.fsf@mcs.anl.gov> John Travers writes: > Hi all, > > I'm trying to use shift and invert to find interior eigenvalues using slepc. > I am using example 4 in slepc 3.41 which reads the matrix from a binary file ('A' saved from Matlab). > > If I run just: > ./ex4 -file A -eps_nev 1 > the program runs fine, and correctly finds the largest eigenvalue (although much slower than Matlab's eigs). 
You can't compare performance when running in debug mode. If the
algorithm converges slower, we can figure out why.

> However, if I run:
> ./ex4 -file A -eps_nev 1 -st_type sinvert -st_shift 0.0,0.5
> to try to find an eigenvalue near 0.0+0.5i, it outputs the error message below.

Please just insert a value 0.0 along the diagonal of your matrix.
MatShift would be more expensive and could hide memory performance bugs
if it silently reallocated to add diagonal entries.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 
From jroman at dsic.upv.es  Tue Aug 27 09:25:38 2013
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Tue, 27 Aug 2013 16:25:38 +0200
Subject: [petsc-users] error with slepc shift invert: New nonzero at (0, 0) caused a malloc!
In-Reply-To: <87d2oz9pys.fsf@mcs.anl.gov>
References: <5F7A00DE-F4FF-4D36-A06D-0FFBA00F118B@gmail.com> <87d2oz9pys.fsf@mcs.anl.gov>
Message-ID: 

On 27/08/2013, at 16:18, Jed Brown wrote:

> John Travers writes:
>
>> Hi all,
>>
>> I'm trying to use shift-and-invert to find interior eigenvalues with SLEPc.
>> I am using example 4 from SLEPc 3.4.1, which reads the matrix from a binary file ('A' saved from Matlab).
>>
>> If I run just:
>> ./ex4 -file A -eps_nev 1
>> the program runs fine, and correctly finds the largest eigenvalue (although much slower than Matlab's eigs).
>
> You can't compare performance when running in debug mode. If the
> algorithm converges slower, we can figure out why.
>
>> However, if I run:
>> ./ex4 -file A -eps_nev 1 -st_type sinvert -st_shift 0.0,0.5
>> to try to find an eigenvalue near 0.0+0.5i, it outputs the error message below.
>
> Please just insert a value 0.0 along the diagonal of your matrix.
> MatShift would be more expensive and could hide memory performance bugs
> if it silently reallocated to add diagonal entries.

Note that you should use -eps_target 0.0,0.5 instead of -st_shift.

Jose

From jtravs at gmail.com  Tue Aug 27 09:59:51 2013
From: jtravs at gmail.com (John Travers)
Date: Tue, 27 Aug 2013 16:59:51 +0200
Subject: [petsc-users] error with slepc shift invert: New nonzero at (0, 0) caused a malloc!
In-Reply-To: <87d2oz9pys.fsf@mcs.anl.gov>
References: <5F7A00DE-F4FF-4D36-A06D-0FFBA00F118B@gmail.com> <87d2oz9pys.fsf@mcs.anl.gov>
Message-ID: <9457FA93-74CE-44F5-A8B7-1DFA411ADE68@gmail.com>

On 27 Aug 2013, at 16:18, Jed Brown wrote:

> John Travers writes:
>
>> Hi all,
>>
>> I'm trying to use shift-and-invert to find interior eigenvalues with SLEPc.
>> I am using example 4 from SLEPc 3.4.1, which reads the matrix from a binary file ('A' saved from Matlab).
>>
>> If I run just:
>> ./ex4 -file A -eps_nev 1
>> the program runs fine, and correctly finds the largest eigenvalue (although much slower than Matlab's eigs).
>
> You can't compare performance when running in debug mode. If the
> algorithm converges slower, we can figure out why.

Thanks for this. It wasn't obvious to me. I recompiled and get much better performance.

>
>> However, if I run:
>> ./ex4 -file A -eps_nev 1 -st_type sinvert -st_shift 0.0,0.5
>> to try to find an eigenvalue near 0.0+0.5i, it outputs the error message below.
>
> Please just insert a value 0.0 along the diagonal of your matrix.
> MatShift would be more expensive and could hide memory performance bugs
> if it silently reallocated to add diagonal entries.

OK, this solved my problem.
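(For the archives: as I understand it, the suggested fix amounts to setting an explicit zero on each diagonal entry during assembly, roughly like the sketch below. This is my paraphrase, not code from my actual program, and it assumes an AIJ matrix whose preallocation leaves room for the diagonal entry in every row:)

    PetscInt rstart, rend, i;
    ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);
    for (i = rstart; i < rend; i++) {
      /* a value of 0.0 still allocates the (i,i) slot unless
         MAT_IGNORE_ZERO_ENTRIES has been set; the insert mode must match
         the rest of the assembly, since INSERT_VALUES and ADD_VALUES
         cannot be mixed without an intervening assembly */
      ierr = MatSetValue(A,i,i,0.0,ADD_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);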
I couldn't actually add literal zeros on the diagonal (the software I use to write the sparse matrices optimises them out), so I put a diagonal of 1e-300. Do you think this will cause inaccuracies in the calculations? (So far it seems OK.)

Thanks,
John

From jtravs at gmail.com  Tue Aug 27 10:00:45 2013
From: jtravs at gmail.com (John Travers)
Date: Tue, 27 Aug 2013 17:00:45 +0200
Subject: [petsc-users] error with slepc shift invert: New nonzero at (0, 0) caused a malloc!
In-Reply-To: 
References: <5F7A00DE-F4FF-4D36-A06D-0FFBA00F118B@gmail.com> <87d2oz9pys.fsf@mcs.anl.gov>
Message-ID: 

On 27 Aug 2013, at 16:25, Jose E. Roman wrote:

> On 27/08/2013, at 16:18, Jed Brown wrote:
>
>> John Travers writes:
>>
>>> Hi all,
>>>
>>> I'm trying to use shift-and-invert to find interior eigenvalues with SLEPc.
>>> I am using example 4 from SLEPc 3.4.1, which reads the matrix from a binary file ('A' saved from Matlab).
>>>
>>> If I run just:
>>> ./ex4 -file A -eps_nev 1
>>> the program runs fine, and correctly finds the largest eigenvalue (although much slower than Matlab's eigs).
>>
>> You can't compare performance when running in debug mode. If the
>> algorithm converges slower, we can figure out why.
>>
>>> However, if I run:
>>> ./ex4 -file A -eps_nev 1 -st_type sinvert -st_shift 0.0,0.5
>>> to try to find an eigenvalue near 0.0+0.5i, it outputs the error message below.
>>
>> Please just insert a value 0.0 along the diagonal of your matrix.
>> MatShift would be more expensive and could hide memory performance bugs
>> if it silently reallocated to add diagonal entries.
>
> Note that you should use -eps_target 0.0,0.5 instead of -st_shift.

OK, thanks for the heads up.

From jedbrown at mcs.anl.gov  Tue Aug 27 10:11:26 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 27 Aug 2013 10:11:26 -0500
Subject: [petsc-users] error with slepc shift invert: New nonzero at (0, 0) caused a malloc!
In-Reply-To: <9457FA93-74CE-44F5-A8B7-1DFA411ADE68@gmail.com>
References: <5F7A00DE-F4FF-4D36-A06D-0FFBA00F118B@gmail.com> <87d2oz9pys.fsf@mcs.anl.gov> <9457FA93-74CE-44F5-A8B7-1DFA411ADE68@gmail.com>
Message-ID: <8761ur9nj5.fsf@mcs.anl.gov>

John Travers writes:

> Thanks for this. It wasn't obvious to me. I recompiled and get much
> better performance.

Great.

> OK, this solved my problem. I couldn't actually add literal zeros on the
> diagonal (the software I use to write the sparse matrices optimises
> them out), so I put a diagonal of 1e-300. Do you think this will cause
> inaccuracies in the calculations? (So far it seems OK.)

Anything at least 1e-16 times smaller than the other entries is essentially 0.0 as far as floating point arithmetic is concerned. One possible concern with 1e-300 is creating floating point exceptions due to denormals. For example, if you get a NaN and then turn on trapping FP exceptions (-fp_trap in PETSc) and try to debug, you might get a false positive when the 1e-300 creates a denormal.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 
From stali at geology.wisc.edu  Tue Aug 27 14:45:42 2013
From: stali at geology.wisc.edu (Tabrez Ali)
Date: Tue, 27 Aug 2013 14:45:42 -0500
Subject: [petsc-users] GAMG and linear elasticity
Message-ID: <521D01E6.7070003@geology.wisc.edu>

Hello

What is the proper way to use GAMG on a vanilla 3D linear elasticity problem?
Should I use -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 or -pc_type fieldsplit -pc_fieldsplit_block_size 3 -fieldsplit_pc_type gamg -fieldsplit_pc_gamg_type agg -fieldsplit_pc_gamg_agg_nsmooths 1 Do these options even make sense? With the second set of options the % increase in number of iterations with increasing problem size is lower than the first but not optimal. Also, ksp/ksp/examples/ex56 performs much better in that the number of iterations remain more or less constant unlike what I see with my own problem. What am I doing wrong? The output of -ksp_view for the two set of options used is attached. Thanks in advance. Tabrez -------------- next part -------------- Reading input ... Partitioning mesh ... Reading mesh data ... Forming [K] ... Forming RHS ... Setting up solver ... Solving ... 0 KSP Residual norm 5.201733187820e-02 1 KSP Residual norm 1.099026395850e-02 2 KSP Residual norm 5.042531960219e-03 3 KSP Residual norm 2.900154719433e-03 4 KSP Residual norm 1.981423195364e-03 5 KSP Residual norm 1.427135427398e-03 6 KSP Residual norm 1.098375830345e-03 7 KSP Residual norm 8.171142731182e-04 8 KSP Residual norm 6.241353708263e-04 9 KSP Residual norm 4.594842173716e-04 10 KSP Residual norm 3.422820541875e-04 11 KSP Residual norm 2.288676731103e-04 12 KSP Residual norm 1.403795429712e-04 13 KSP Residual norm 8.497517268662e-05 14 KSP Residual norm 4.612536416341e-05 15 KSP Residual norm 2.617765913915e-05 16 KSP Residual norm 1.510196277776e-05 17 KSP Residual norm 9.019114875021e-06 18 KSP Residual norm 6.009327953180e-06 19 KSP Residual norm 4.355601035228e-06 20 KSP Residual norm 2.944914091024e-06 21 KSP Residual norm 1.695461437589e-06 22 KSP Residual norm 1.062228336911e-06 23 KSP Residual norm 6.663147669163e-07 24 KSP Residual norm 4.312489682055e-07 25 KSP Residual norm 2.893524615337e-07 26 KSP Residual norm 1.914089929812e-07 27 KSP Residual norm 1.238817489532e-07 28 KSP Residual norm 7.683272381931e-08 29 KSP Residual norm 4.169276110310e-08 30 KSP Residual norm 2.148781941016e-08 31 KSP Residual norm 1.403054516655e-08 32 KSP Residual norm 8.805306038787e-09 33 KSP Residual norm 5.401864440509e-09 34 KSP Residual norm 3.223812851026e-09 35 KSP Residual norm 2.014263357765e-09 36 KSP Residual norm 1.360071786265e-09 37 KSP Residual norm 8.977977623075e-10 38 KSP Residual norm 5.671948481098e-10 39 KSP Residual norm 3.671046658729e-10 40 KSP Residual norm 2.210643616019e-10 41 KSP Residual norm 1.495535545659e-10 42 KSP Residual norm 1.008918360828e-10 43 KSP Residual norm 6.783838063885e-11 44 KSP Residual norm 4.352151663612e-11 KSP Object: 4 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-09, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: gamg MG: type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 4 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 4 MPI processes type: bjacobi block Jacobi: number of blocks = 4 Local solve info for each block is in the following 
KSP and PC objects: [0] number of local blocks = 1, first local block number = 0 [0] local block number 0 KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test KSP Object: (mg_coarse_sub_) 1 MPI processes KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly PC Object: (mg_coarse_sub_) 1 MPI processes type: lu type: preonly maximum iterations=1, initial guess is zero maximum iterations=1, initial guess is zero maximum iterations=1, initial guess is zero LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test using diagonal shift on blocks to prevent zero pivot matrix ordering: nd using NONE norm type for convergence test PC Object: (mg_coarse_sub_) PC Object: (mg_coarse_sub_) 1 MPI processes PC Object: (mg_coarse_sub_) 1 MPI processes factor fill ratio given 5, needed 1.06642 Factored matrix follows: 1 MPI processes type: lu LU: out-of-place factorization type: lu LU: out-of-place factorization type: lu LU: out-of-place factorization Matrix Object: 1 MPI processes tolerance for zero pivot 2.22045e-14 tolerance for zero pivot 2.22045e-14 type: seqaij using diagonal shift on blocks to prevent zero pivot matrix ordering: nd using diagonal shift on blocks to prevent zero pivot matrix ordering: nd tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot rows=26, cols=26 factor fill ratio given 5, needed 0 Factored matrix follows: factor fill ratio given 5, needed 0 Factored matrix follows: matrix ordering: nd factor fill ratio given 5, needed 0 package used to perform factorization: petsc Matrix Object: Matrix Object: Factored matrix follows: Matrix Object: total: nonzeros=578, allocated nonzeros=578 total number of mallocs used during MatSetValues calls =0 1 MPI processes type: seqaij 1 MPI processes type: seqaij 1 MPI processes rows=0, cols=0 type: seqaij using I-node routines: found 14 nodes, limit used is 5 linear system matrix = precond matrix: Matrix Object: package used to perform factorization: petsc rows=0, cols=0 package used to perform factorization: petsc total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 not using I-node routines 1 MPI processes type: seqaij rows=26, cols=26 total: nonzeros=542, allocated nonzeros=542 total number of mallocs used during MatSetValues calls =0 not using I-node routines - - - - - - - - - - - - - - - - - - total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 linear system matrix = precond matrix: Matrix Object: rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 1 MPI processes type: seqaij package used to perform factorization: petsc total number of mallocs used during MatSetValues calls =0 not using I-node routines rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 total: nonzeros=1, allocated nonzeros=1 total number of mallocs 
used during MatSetValues calls =0 total number of mallocs used during MatSetValues calls =0 not using I-node routines not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 not using I-node routines [1] number of local blocks = 1, first local block number = 1 [1] local block number 0 - - - - - - - - - - - - - - - - - - [2] number of local blocks = 1, first local block number = 2 [2] local block number 0 - - - - - - - - - - - - - - - - - - [3] number of local blocks = 1, first local block number = 3 [3] local block number 0 - - - - - - - - - - - - - - - - - - linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=26, cols=26 total: nonzeros=542, allocated nonzeros=542 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.0661807, max = 1.38979 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_1_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=1386, cols=1386 total: nonzeros=49460, allocated nonzeros=49460 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.1332, max = 2.7972 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_2_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=202878, cols=202878 total: nonzeros=15595884, allocated nonzeros=63297936 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 16907 nodes, limit used is 5 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=202878, cols=202878 total: nonzeros=15595884, allocated nonzeros=63297936 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 16907 nodes, limit used is 5 Recovering stress ... Cleaning up ... Finished -------------- next part -------------- Reading input ... Partitioning mesh ... Reading mesh data ... Forming [K] ... Forming RHS ... Setting up solver ... Solving ... 
0 KSP Residual norm 4.062139693015e-02 1 KSP Residual norm 1.466113648411e-02 2 KSP Residual norm 6.479863911139e-03 3 KSP Residual norm 4.569049461591e-03 4 KSP Residual norm 3.130857994128e-03 5 KSP Residual norm 1.983543889095e-03 6 KSP Residual norm 1.156789632219e-03 7 KSP Residual norm 5.899045914732e-04 8 KSP Residual norm 2.837798321640e-04 9 KSP Residual norm 1.359117543889e-04 10 KSP Residual norm 6.385081462171e-05 11 KSP Residual norm 2.935882041357e-05 12 KSP Residual norm 1.493739596377e-05 13 KSP Residual norm 9.201338063289e-06 14 KSP Residual norm 5.884399324670e-06 15 KSP Residual norm 3.613939011973e-06 16 KSP Residual norm 2.382929136315e-06 17 KSP Residual norm 1.560623578712e-06 18 KSP Residual norm 9.197810318628e-07 19 KSP Residual norm 5.339056563737e-07 20 KSP Residual norm 3.060078898263e-07 21 KSP Residual norm 1.707524658269e-07 22 KSP Residual norm 9.973870483901e-08 23 KSP Residual norm 5.939404758593e-08 24 KSP Residual norm 3.323258377859e-08 25 KSP Residual norm 1.830778495567e-08 26 KSP Residual norm 1.141547456761e-08 27 KSP Residual norm 7.355063277008e-09 28 KSP Residual norm 4.857944128572e-09 29 KSP Residual norm 3.285608748712e-09 30 KSP Residual norm 2.021520313423e-09 31 KSP Residual norm 1.433518924534e-09 32 KSP Residual norm 1.022603460571e-09 33 KSP Residual norm 7.063122249368e-10 34 KSP Residual norm 4.470858335207e-10 35 KSP Residual norm 2.775173681825e-10 36 KSP Residual norm 1.703746060374e-10 37 KSP Residual norm 9.782267597341e-11 38 KSP Residual norm 5.315585921715e-11 39 KSP Residual norm 2.846271417839e-11 KSP Object: 4 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-09, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: fieldsplit FieldSplit with MULTIPLICATIVE composition: total splits = 3, blocksize = 3 Solver info for each split is in the following KSP objects: Split number 0 Fields 0 KSP Object: (fieldsplit_0_) 4 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 4 MPI processes type: gamg MG: type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (fieldsplit_0_mg_coarse_) 4 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_mg_coarse_) 4 MPI processes type: bjacobi block Jacobi: number of blocks = 4 Local solve info for each block is in the following KSP and PC objects: [0] number of local blocks = 1, first local block number = 0 [0] local block number 0 KSP Object: (fieldsplit_0_mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test KSP Object: (fieldsplit_0_mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero KSP Object: (fieldsplit_0_mg_coarse_sub_) 1 MPI processes type: preonly maximum 
iterations=1, initial guess is zero KSP Object: (fieldsplit_0_mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero PC Object: (fieldsplit_0_mg_coarse_sub_) 1 MPI processes type: lu tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning LU: out-of-place factorization using NONE norm type for convergence test PC Object: tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot (fieldsplit_0_mg_coarse_sub_) 1 MPI processes PC Object: (fieldsplit_0_mg_coarse_sub_) 1 MPI processes matrix ordering: nd type: lu LU: out-of-place factorization type: lu LU: out-of-place factorization PC Object: (fieldsplit_0_mg_coarse_sub_) factor fill ratio given 5, needed 1.07615 Factored matrix follows: tolerance for zero pivot 2.22045e-14 tolerance for zero pivot 2.22045e-14 1 MPI processes type: lu Matrix Object: using diagonal shift on blocks to prevent zero pivot using diagonal shift on blocks to prevent zero pivot matrix ordering: nd LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 0 1 MPI processes type: seqaij rows=25, cols=25 package used to perform factorization: petsc total: nonzeros=537, allocated nonzeros=537 matrix ordering: nd factor fill ratio given 5, needed 0 Factored matrix follows: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 factor fill ratio given 5, needed 0 Factored matrix follows: Matrix Object: 1 MPI processes Factored matrix follows: Matrix Object: total number of mallocs used during MatSetValues calls =0 package used to perform factorization: petsc type: seqaij 1 MPI processes using I-node routines: found 13 nodes, limit used is 5 linear system matrix = precond matrix: total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 rows=0, cols=0 type: seqaij rows=0, cols=0 Matrix Object: 1 MPI processes type: seqaij not using I-node routines package used to perform factorization: petsc total: nonzeros=1, allocated nonzeros=1 package used to perform factorization: petsc rows=25, cols=25 linear system matrix = precond matrix: Matrix Object: total number of mallocs used during MatSetValues calls =0 total: nonzeros=1, allocated nonzeros=1 total: nonzeros=499, allocated nonzeros=499 total number of mallocs used during MatSetValues calls =0 1 MPI processes type: seqaij not using I-node routines linear system matrix = precond matrix: total number of mallocs used during MatSetValues calls =0 not using I-node routines - - - - - - - - - - - - - - - - - - rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 Matrix Object: 1 MPI processes total number of mallocs used during MatSetValues calls =0 type: seqaij not using I-node routines linear system matrix = precond matrix: not using I-node routines rows=0, cols=0 Matrix Object: 1 MPI processes total: nonzeros=0, allocated nonzeros=0 type: seqaij [1] number of local blocks = 1, first local block number = 1 total number of mallocs used during MatSetValues calls =0 not using I-node routines rows=0, cols=0 [1] local block number 0 - - - - - - - - - - - - - - - - - - total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during 
MatSetValues calls =0 [2] number of local blocks = 1, first local block number = 2 not using I-node routines [2] local block number 0 - - - - - - - - - - - - - - - - - - [3] number of local blocks = 1, first local block number = 3 [3] local block number 0 - - - - - - - - - - - - - - - - - - linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=25, cols=25 total: nonzeros=499, allocated nonzeros=499 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (fieldsplit_0_mg_levels_1_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.0654046, max = 1.3735 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (fieldsplit_0_mg_levels_1_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=1416, cols=1416 total: nonzeros=51260, allocated nonzeros=51260 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (fieldsplit_0_mg_levels_2_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.132851, max = 2.78987 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (fieldsplit_0_mg_levels_2_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=67624, cols=67624 total: nonzeros=1732830, allocated nonzeros=1732830 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=67624, cols=67624 total: nonzeros=1732830, allocated nonzeros=1732830 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Split number 1 Fields 1 KSP Object: (fieldsplit_1_) 4 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_1_) 4 MPI processes type: gamg MG: type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (fieldsplit_1_mg_coarse_) 4 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_1_mg_coarse_) 4 MPI processes type: bjacobi block Jacobi: number of blocks = 4 Local solve info for each block is in the following KSP and PC objects: [0] number of local blocks = 1, first local block number = 0 KSP Object: KSP Object: KSP Object: [0] local block number 0 KSP Object: (fieldsplit_1_mg_coarse_sub_) 1 MPI processes (fieldsplit_1_mg_coarse_sub_) 1 MPI processes (fieldsplit_1_mg_coarse_sub_) 1 MPI processes (fieldsplit_1_mg_coarse_sub_) 1 MPI processes type: preonly maximum 
iterations=1, initial guess is zero type: preonly maximum iterations=1, initial guess is zero type: preonly maximum iterations=1, initial guess is zero type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: using NONE norm type for convergence test PC Object: tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: using NONE norm type for convergence test PC Object: (fieldsplit_1_mg_coarse_sub_) 1 MPI processes type: lu (fieldsplit_1_mg_coarse_sub_) 1 MPI processes type: lu (fieldsplit_1_mg_coarse_sub_) 1 MPI processes (fieldsplit_1_mg_coarse_sub_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 type: lu LU: out-of-place factorization LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd using diagonal shift on blocks to prevent zero pivot matrix ordering: nd tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 0 Factored matrix follows: factor fill ratio given 5, needed 0 Factored matrix follows: matrix ordering: nd factor fill ratio given 5, needed 0 factor fill ratio given 5, needed 1.13901 Factored matrix follows: Matrix Object: 1 MPI processes Matrix Object: 1 MPI processes Factored matrix follows: Matrix Object: type: seqaij type: seqaij rows=0, cols=0 Matrix Object: 1 MPI processes 1 MPI processes type: seqaij rows=0, cols=0 package used to perform factorization: petsc package used to perform factorization: petsc type: seqaij rows=24, cols=24 total: nonzeros=1, allocated nonzeros=1 total: nonzeros=1, allocated nonzeros=1 rows=0, cols=0 package used to perform factorization: petsc total: nonzeros=508, allocated nonzeros=508 total number of mallocs used during MatSetValues calls =0 not using I-node routines total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 package used to perform factorization: petsc total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: total number of mallocs used during MatSetValues calls =0 using I-node routines: found 14 nodes, limit used is 5 linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=24, cols=24 total: nonzeros=446, allocated nonzeros=446 linear system matrix = precond matrix: Matrix Object: 1 MPI processes type: seqaij rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 not using I-node routines total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 not using I-node routines 1 MPI processes type: seqaij rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 not using I-node routines - - - - - - - 
- - - - - - - - - - - total number of mallocs used during MatSetValues calls =0 [1] number of local blocks = 1, first local block number = 1 [1] local block number 0 not using I-node routines - - - - - - - - - - - - - - - - - - [2] number of local blocks = 1, first local block number = 2 [2] local block number 0 - - - - - - - - - - - - - - - - - - [3] number of local blocks = 1, first local block number = 3 [3] local block number 0 - - - - - - - - - - - - - - - - - - linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=24, cols=24 total: nonzeros=446, allocated nonzeros=446 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (fieldsplit_1_mg_levels_1_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.0662612, max = 1.39149 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (fieldsplit_1_mg_levels_1_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=1410, cols=1410 total: nonzeros=50558, allocated nonzeros=50558 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (fieldsplit_1_mg_levels_2_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.132704, max = 2.78678 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (fieldsplit_1_mg_levels_2_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=67624, cols=67624 total: nonzeros=1732770, allocated nonzeros=1732770 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=67624, cols=67624 total: nonzeros=1732770, allocated nonzeros=1732770 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Split number 2 Fields 2 KSP Object: (fieldsplit_2_) 4 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_2_) 4 MPI processes type: gamg MG: type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (fieldsplit_2_mg_coarse_) 4 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_2_mg_coarse_) 4 MPI processes type: bjacobi block Jacobi: number of blocks = 4 Local solve info for each block is in the following KSP and PC objects: [0] number of local blocks = 1, first local block number = 0 KSP Object: KSP Object: [0] local block number 0 KSP Object: (fieldsplit_2_mg_coarse_sub_) 1 MPI processes 
(fieldsplit_2_mg_coarse_sub_) 1 MPI processes (fieldsplit_2_mg_coarse_sub_) 1 MPI processes KSP Object: (fieldsplit_2_mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero type: preonly maximum iterations=1, initial guess is zero type: preonly maximum iterations=1, initial guess is zero type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_2_mg_coarse_sub_) PC Object: (fieldsplit_2_mg_coarse_sub_) using NONE norm type for convergence test PC Object: using NONE norm type for convergence test PC Object: 1 MPI processes type: lu 1 MPI processes type: lu (fieldsplit_2_mg_coarse_sub_) 1 MPI processes (fieldsplit_2_mg_coarse_sub_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 0 Factored matrix follows: Matrix Object: LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 0 Factored matrix follows: Matrix Object: 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot matrix ordering: nd factor fill ratio given 5, needed 0 Factored matrix follows: matrix ordering: nd factor fill ratio given 5, needed 1.13309 Factored matrix follows: Matrix Object: 1 MPI processes 1 MPI processes type: seqaij rows=0, cols=0 type: seqaij rows=0, cols=0 package used to perform factorization: petsc Matrix Object: 1 MPI processes type: seqaij rows=27, cols=27 package used to perform factorization: petsc total: nonzeros=1, allocated nonzeros=1 total: nonzeros=1, allocated nonzeros=1 type: seqaij rows=0, cols=0 package used to perform factorization: petsc total number of mallocs used during MatSetValues calls =0 total number of mallocs used during MatSetValues calls =0 not using I-node routines package used to perform factorization: petsc total: nonzeros=613, allocated nonzeros=613 not using I-node routines linear system matrix = precond matrix: linear system matrix = precond matrix: Matrix Object: total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 14 nodes, limit used is 5 Matrix Object: 1 MPI processes 1 MPI processes type: seqaij total number of mallocs used during MatSetValues calls =0 linear system matrix = precond matrix: Matrix Object: type: seqaij rows=0, cols=0 rows=0, cols=0 total: nonzeros=0, allocated nonzeros=0 1 MPI processes type: seqaij total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Matrix Object: rows=27, cols=27 not using I-node routines not using I-node routines 1 MPI processes total: nonzeros=541, allocated 
nonzeros=541 total number of mallocs used during MatSetValues calls =0 type: seqaij not using I-node routines rows=0, cols=0 - - - - - - - - - - - - - - - - - - total: nonzeros=0, allocated nonzeros=0 [1] number of local blocks = 1, first local block number = 1 total number of mallocs used during MatSetValues calls =0 [1] local block number 0 - - - - - - - - - - - - - - - - - - not using I-node routines [2] number of local blocks = 1, first local block number = 2 [2] local block number 0 - - - - - - - - - - - - - - - - - - [3] number of local blocks = 1, first local block number = 3 [3] local block number 0 - - - - - - - - - - - - - - - - - - linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=27, cols=27 total: nonzeros=541, allocated nonzeros=541 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (fieldsplit_2_mg_levels_1_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.0659669, max = 1.38531 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (fieldsplit_2_mg_levels_1_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=1411, cols=1411 total: nonzeros=50491, allocated nonzeros=50491 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (fieldsplit_2_mg_levels_2_) 4 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.132415, max = 2.78072 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (fieldsplit_2_mg_levels_2_) 4 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=67624, cols=67624 total: nonzeros=1732808, allocated nonzeros=1732808 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=67624, cols=67624 total: nonzeros=1732808, allocated nonzeros=1732808 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines linear system matrix = precond matrix: Matrix Object: 4 MPI processes type: mpiaij rows=202878, cols=202878 total: nonzeros=15595884, allocated nonzeros=63297936 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 16907 nodes, limit used is 5 Recovering stress ... Cleaning up ... Finished From knepley at gmail.com Tue Aug 27 15:10:42 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 27 Aug 2013 15:10:42 -0500 Subject: [petsc-users] GAMG and linear elasticity In-Reply-To: <521D01E6.7070003@geology.wisc.edu> References: <521D01E6.7070003@geology.wisc.edu> Message-ID: On Tue, Aug 27, 2013 at 2:45 PM, Tabrez Ali wrote: > Hello > > What is the proper way to use GAMG on a vanilla 3D linear elasticity > problem. Should I use > > -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 > This is fine. 
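For instance, with a hypothetical executable named ./myapp, that first set would be run as something like

    mpiexec -n 4 ./myapp -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_monitor

with the remaining GAMG options left at their defaults.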
> or
>
> -pc_type fieldsplit -pc_fieldsplit_block_size 3 -fieldsplit_pc_type gamg
> -fieldsplit_pc_gamg_type agg -fieldsplit_pc_gamg_agg_nsmooths 1
>
> Do these options even make sense? With the second set of options the %
> increase in the number of iterations with increasing problem size is lower
> than the first but not optimal.
>
> Also, ksp/ksp/examples/ex56 performs much better in that the number of
> iterations remains more or less constant, unlike what I see with my own
> problem. What am I doing wrong?

You need to give it a good near null space. Usually, we use the 3 translational and 3 rotational modes that are the null space of the elastic operator. MatNullSpaceCreateRigidBody() is an example of making them. PyLith does this by default :)

  Thanks,

    Matt

> The output of -ksp_view for the two sets of options used is attached.
>
> Thanks in advance.
>
> Tabrez

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From jedbrown at mcs.anl.gov  Tue Aug 27 15:15:10 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 27 Aug 2013 15:15:10 -0500
Subject: [petsc-users] GAMG and linear elasticity
In-Reply-To: <521D01E6.7070003@geology.wisc.edu>
References: <521D01E6.7070003@geology.wisc.edu>
Message-ID: <87bo4i99gx.fsf@mcs.anl.gov>

Tabrez Ali writes:

> Hello
>
> What is the proper way to use GAMG on a vanilla 3D linear elasticity
> problem? Should I use
>
> -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1

Yeah, and only the first of these is needed because the others are
default with -pc_type gamg.

> -pc_type fieldsplit -pc_fieldsplit_block_size 3 -fieldsplit_pc_type gamg
> -fieldsplit_pc_gamg_type agg -fieldsplit_pc_gamg_agg_nsmooths 1
>
> Do these options even make sense? With the second set of options the %
> increase in the number of iterations with increasing problem size is lower
> than the first but not optimal.

And it's probably more expensive because it has to do inner solves.
Also, if you have less compressible regions, it will get much worse.

> Also, ksp/ksp/examples/ex56 performs much better in that the number of
> iterations remains more or less constant, unlike what I see with my own
> problem. What am I doing wrong?

You probably forgot to set the near null space. You can use
MatSetNearNullSpace (and maybe MatNullSpaceCreateRigidBody) or the more
hacky (IMO) PCSetCoordinates. It's important to have translational
*and* rotational modes in the near null space that GAMG uses to build a
coarse space.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 835 bytes
Desc: not available
URL: 
From mfadams at lbl.gov  Tue Aug 27 15:59:52 2013
From: mfadams at lbl.gov (Mark F. Adams)
Date: Tue, 27 Aug 2013 16:59:52 -0400
Subject: [petsc-users] GAMG and linear elasticity
In-Reply-To: <87bo4i99gx.fsf@mcs.anl.gov>
References: <521D01E6.7070003@geology.wisc.edu> <87bo4i99gx.fsf@mcs.anl.gov>
Message-ID: 

> You probably forgot to set the near null space. You can use
> MatSetNearNullSpace (and maybe MatNullSpaceCreateRigidBody) or the more
> hacky (IMO) PCSetCoordinates. It's important to have translational

No debate, this is a hack, but people use it and I don't want to force them to compute the RBMs themselves (error prone and tedious).
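For reference, the hack amounts to something like this sketch (the names are placeholders: ksp is your solver, nloc the number of locally owned nodes, and coords the interlaced array [x0,y0,z0,x1,y1,z1,...]; check the PCSetCoordinates man page for the exact calling sequence in your PETSc version):

    PC pc;
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetCoordinates(pc,3,nloc,coords);CHKERRQ(ierr);  /* dim = 3 for 3D elasticity */

GAMG can then build the rigid body modes from the coordinates internally.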
I hadn't heard of MatNullSpaceCreateRigidBody, but it's easy, so I'd recommend using it. I'll fix ex55 & ex56 to use it one day.

> *and* rotational modes in the near null space that GAMG uses to build a
> coarse space.

This can be tricky in testing, because if you just crush a cube (not a bad test problem) then there are no rotational modes in the solution, and so rotational modes don't help. Which, as Jed correctly points out, is not a good general conclusion.

From vijay.m at gmail.com  Tue Aug 27 16:45:32 2013
From: vijay.m at gmail.com (Vijay S. Mahadevan)
Date: Tue, 27 Aug 2013 16:45:32 -0500
Subject: [petsc-users] Setting block size for MPI vectors
Message-ID: 

I am trying to create an MPI vector and then trying to set the block
size after the initial creation. But I receive an error in
PetscLayoutSetBlockSize. Since there is no explicit block size
argument during construction, I call VecSetBlockSize afterwards. The
relevant code and error are below.

Is there an alternate call to set the block size to the vector so that
VecSetValuesBlocked will work correctly? If there is no current way
to do this, that's fine and I can try to get the local array and set
the values accordingly.

Code:
ierr = VecCreateMPI(PETSC_COMM_WORLD, 10, PETSC_DECIDE, &x);CHKERRQ(ierr);
ierr = VecSetBlockSize(x, 2);CHKERRQ(ierr);
ierr = VecSetFromOptions(x);CHKERRQ(ierr);

Relevant errors:

[0]PETSC ERROR: Arguments are incompatible!
[0]PETSC ERROR: Cannot change block size 1 to 2!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development GIT revision: 0799ffdeffb47ef0458c1305293fae28c4d2cd92  GIT Date: 2013-08-01 06:36:16 +0800
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./ex1 on a arch-darwin-cxx-debug named anlextwls002-168.wl.anl-external.org by mahadevan Tue Aug 27 16:37:18 2013
[0]PETSC ERROR: Libraries linked from /opt/petsc-dbg/lib
[0]PETSC ERROR: Configure run at Wed Jul 31 18:03:46 2013
[0]PETSC ERROR: Configure options --prefix=/opt/petsc-dbg PETSC_ARCH=arch-darwin-cxx-debug --download-scalapack=1 --download-blacs=1 --with-mumps-dir=/Users/mahadevan/source/MUMPS_4.10.0-p3 --download-hypre=1 --with-metis=1 --with-parmetis=1 --known-mpi-shared-libraries=1 --with-blas-lapack-lib=/System/Library/Frameworks/vecLib.framework/vecLib --with-c++-support=1 --with-c-support=1 --with-cc=mpicc --with-clanguage=C++ --with-moab-dir=/opt/moab --with-dynamic-loading=1 --with-fc=mpif90 --with-fortran=1 --with-mpi=1 --with-shared-libraries=1 --with-valgrind=1 --with-valgrind-dir=/opt/local --with-cc=mpicc --with-cxx=mpicxx COPTFLAGS="-g -fPIC" CXXOPTFLAGS="-g -fPIC" FOPTFLAGS="-g -fPIC" --with-metis-dir=/usr/local --with-parmetis-dir=/usr/local --with-netcdf-dir=/usr/local --with-zoltan=1 --with-zoltan-lib="-L/usr/local/lib -lptscotch -lscotch -lscotchmetis -lscotcherr -lscotcherrexit -lzoltan" --with-zoltan-include=/usr/local/include --with-hdf5 --with-hdf5-include=/usr/local/include --with-hdf5-lib="-L/usr/local/lib -lhdf5_fortran -lhdf5" --with-netcdf-dir=/usr/local --with-cmake=/opt/local/bin/cmake --with-x-dir=/opt/local
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: PetscLayoutSetBlockSize() line 459 in /Users/mahadevan/source/petsc-dev/src/vec/is/utils/pmap.c
[0]PETSC ERROR: VecSetBlockSize() line 1471 in /Users/mahadevan/source/petsc-dev/src/vec/vec/interface/vector.c
[0]PETSC ERROR: main() line 32 in src/vec/vec/examples/tests/ex1.c

Thanks,
Vijay

From bsmith at mcs.anl.gov  Tue Aug 27 16:49:42 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Tue, 27 Aug 2013 16:49:42 -0500
Subject: [petsc-users] Setting block size for MPI vectors
In-Reply-To: 
References: 
Message-ID: 

You need to use the "long winded" construction of vectors to add these options. For example,

VecCreate()
VecSetBlockSize()
VecSetSizes()
VecSetType()

The reason is that the Vec may be constructed differently based on these options, but VecCreateMPI() is supposed to produce an already constructed Vec.

   Barry

On Aug 27, 2013, at 4:45 PM, "Vijay S. Mahadevan" wrote:

> I am trying to create an MPI vector and then trying to set the block
> size after the initial creation. But I receive an error in
> PetscLayoutSetBlockSize.
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Development GIT revision: > 0799ffdeffb47ef0458c1305293fae28c4d2cd92 GIT Date: 2013-08-01 > 06:36:16 +0800 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./ex1 on a arch-darwin-cxx-debug named > anlextwls002-168.wl.anl-external.org by mahadevan Tue Aug 27 16:37:18 > 2013 > [0]PETSC ERROR: Libraries linked from /opt/petsc-dbg/lib > [0]PETSC ERROR: Configure run at Wed Jul 31 18:03:46 2013 > [0]PETSC ERROR: Configure options --prefix=/opt/petsc-dbg > PETSC_ARCH=arch-darwin-cxx-debug --download-scalapack=1 > --download-blacs=1 > --with-mumps-dir=/Users/mahadevan/source/MUMPS_4.10.0-p3 > --download-hypre=1 --with-metis=1 --with-parmetis=1 > --known-mpi-shared-libraries=1 > --with-blas-lapack-lib=/System/Library/Frameworks/vecLib.framework/vecLib > --with-c++-support=1 --with-c-support=1 --with-cc=mpicc > --with-clanguage=C++ --with-moab-dir=/opt/moab > --with-dynamic-loading=1 --with-fc=mpif90 --with-fortran=1 > --with-mpi=1 --with-shared-libraries=1 --with-valgrind=1 > --with-valgrind-dir=/opt/local --with-cc=mpicc --with-cxx=mpicxx > COPTFLAGS="-g -fPIC" CXXOPTFLAGS="-g -fPIC" FOPTFLAGS="-g -fPIC" > --with-metis-dir=/usr/local --with-parmetis-dir=/usr/local > --with-netcdf-dir=/usr/local --with-zoltan=1 > --with-zoltan-lib="-L/usr/local/lib -lptscotch -lscotch -lscotchmetis > -lscotcherr -lscotcherrexit -lzoltan" > --with-zoltan-include=/usr/local/include --with-hdf5 > --with-hdf5-include=/usr/local/include > --with-hdf5-lib="-L/usr/local/lib -lhdf5_fortran -lhdf5" > --with-netcdf-dir=/usr/local --with-cmake=/opt/local/bin/cmake > --with-x-dir=/opt/local > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: PetscLayoutSetBlockSize() line 459 in > /Users/mahadevan/source/petsc-dev/src/vec/is/utils/pmap.c > [0]PETSC ERROR: VecSetBlockSize() line 1471 in > /Users/mahadevan/source/petsc-dev/src/vec/vec/interface/vector.c > [0]PETSC ERROR: main() line 32 in src/vec/vec/examples/tests/ex1.c > > Thanks, > Vijay From vijay.m at gmail.com Tue Aug 27 16:57:23 2013 From: vijay.m at gmail.com (Vijay S. Mahadevan) Date: Tue, 27 Aug 2013 16:57:23 -0500 Subject: [petsc-users] Setting block size for MPI vectors In-Reply-To: References: Message-ID: Thanks Barry. Yes, I tried that sequence and it works. But curiously if I call VecSetBlockSize after VecSetType, it does fails with the same error. Would it be useful to take block size as a user input in VecCreateMPI to be consistent with VecCreateMPIWithArray ? Just a thought. For now, I will stick with the long winded way. Vijay On Tue, Aug 27, 2013 at 4:49 PM, Barry Smith wrote: > > You need to use the "long winded" construction of vectors to add these options. For example, > > VecCreate() > VecSetBlockSize() > VecSetSizes() > VecSetType() > > the reason is that the Vec may be constructed differently based on these options but the VecCreateMPI() is suppose to produce an already constructed Vec. > > Barry > > > On Aug 27, 2013, at 4:45 PM, "Vijay S. Mahadevan" wrote: > >> I am trying to create an MPI vector and then trying to set the block >> size after the initial creation. But I receive an error in >> PetscLayoutSetBlockSize. 
Since there is no explicit block size >> argument during construction, I call VecSetBlockSize afterwards. The >> relevant code and error are below. >> >> Is there an alternate call to set the block size to the vector so that >> VecSetValuesBlocked will work correctly ? If there is no current way >> to do this, that's fine and I can try to get the local array and set >> the values accordingly. >> >> Code: >> ierr = VecCreateMPI(PETSC_COMM_WORLD, 10, PETSC_DECIDE, &x);CHKERRQ(ierr); >> ierr = VecSetBlockSize(x, 2);CHKERRQ(ierr); >> ierr = VecSetFromOptions(x);CHKERRQ(ierr); >> >> Relevant errors: >> >> 0]PETSC ERROR: Arguments are incompatible! >> [0]PETSC ERROR: Cannot change block size 1 to 2! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Development GIT revision: >> 0799ffdeffb47ef0458c1305293fae28c4d2cd92 GIT Date: 2013-08-01 >> 06:36:16 +0800 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./ex1 on a arch-darwin-cxx-debug named >> anlextwls002-168.wl.anl-external.org by mahadevan Tue Aug 27 16:37:18 >> 2013 >> [0]PETSC ERROR: Libraries linked from /opt/petsc-dbg/lib >> [0]PETSC ERROR: Configure run at Wed Jul 31 18:03:46 2013 >> [0]PETSC ERROR: Configure options --prefix=/opt/petsc-dbg >> PETSC_ARCH=arch-darwin-cxx-debug --download-scalapack=1 >> --download-blacs=1 >> --with-mumps-dir=/Users/mahadevan/source/MUMPS_4.10.0-p3 >> --download-hypre=1 --with-metis=1 --with-parmetis=1 >> --known-mpi-shared-libraries=1 >> --with-blas-lapack-lib=/System/Library/Frameworks/vecLib.framework/vecLib >> --with-c++-support=1 --with-c-support=1 --with-cc=mpicc >> --with-clanguage=C++ --with-moab-dir=/opt/moab >> --with-dynamic-loading=1 --with-fc=mpif90 --with-fortran=1 >> --with-mpi=1 --with-shared-libraries=1 --with-valgrind=1 >> --with-valgrind-dir=/opt/local --with-cc=mpicc --with-cxx=mpicxx >> COPTFLAGS="-g -fPIC" CXXOPTFLAGS="-g -fPIC" FOPTFLAGS="-g -fPIC" >> --with-metis-dir=/usr/local --with-parmetis-dir=/usr/local >> --with-netcdf-dir=/usr/local --with-zoltan=1 >> --with-zoltan-lib="-L/usr/local/lib -lptscotch -lscotch -lscotchmetis >> -lscotcherr -lscotcherrexit -lzoltan" >> --with-zoltan-include=/usr/local/include --with-hdf5 >> --with-hdf5-include=/usr/local/include >> --with-hdf5-lib="-L/usr/local/lib -lhdf5_fortran -lhdf5" >> --with-netcdf-dir=/usr/local --with-cmake=/opt/local/bin/cmake >> --with-x-dir=/opt/local >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: PetscLayoutSetBlockSize() line 459 in >> /Users/mahadevan/source/petsc-dev/src/vec/is/utils/pmap.c >> [0]PETSC ERROR: VecSetBlockSize() line 1471 in >> /Users/mahadevan/source/petsc-dev/src/vec/vec/interface/vector.c >> [0]PETSC ERROR: main() line 32 in src/vec/vec/examples/tests/ex1.c >> >> Thanks, >> Vijay > From stali at geology.wisc.edu Tue Aug 27 17:22:03 2013 From: stali at geology.wisc.edu (Tabrez Ali) Date: Tue, 27 Aug 2013 17:22:03 -0500 Subject: [petsc-users] GAMG and linear elasticity In-Reply-To: <87bo4i99gx.fsf@mcs.anl.gov> References: <521D01E6.7070003@geology.wisc.edu> <87bo4i99gx.fsf@mcs.anl.gov> Message-ID: <0FCE0B45-0DF1-44D8-A12C-C4FA5F9F541D@geology.wisc.edu> I had missed PCSetCoordinates in ex56. 
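Looking at the man pages, the non-hacky route seems to boil down to something like the following (a sketch on my part, assuming a Vec coords that holds the interlaced nodal coordinates with block size 3 and the same parallel layout as the displacement vector):

    Vec          coords;  /* [0.x 0.y 0.z 1.x 1.y 1.z ...] */
    MatNullSpace rbm;
    ierr = MatNullSpaceCreateRigidBody(coords,&rbm);CHKERRQ(ierr);
    ierr = MatSetNearNullSpace(A,rbm);CHKERRQ(ierr);  /* A is the assembled stiffness matrix */
    ierr = MatNullSpaceDestroy(&rbm);CHKERRQ(ierr);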
So is there a correspondence b/w the coordinates [0.x 0.y 0.z 1.x 1.y 1.z ...] passed to PCSetCoordinates/MatNullSpaceCreateRigidBody and the solution vector [0.ux 0.uy 0.uz ...]? What happens when I am solving the poroelasticity problem where there is an additional pressure field associate with each node? Tabrez On Aug 27, 2013, at 3:15 PM, Jed Brown wrote: > Tabrez Ali writes: > >> Hello >> >> What is the proper way to use GAMG on a vanilla 3D linear elasticity >> problem. Should I use >> >> -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1 > > Yeah, and only the first of these is needed because the others are > default with -pc_type gamg. > >> -pc_type fieldsplit -pc_fieldsplit_block_size 3 -fieldsplit_pc_type >> gamg >> -fieldsplit_pc_gamg_type agg -fieldsplit_pc_gamg_agg_nsmooths 1 >> >> Do these options even make sense? With the second set of options >> the % >> increase in number of iterations with increasing problem size is >> lower >> than the first but not optimal. > > And it's probably more expensive because it has to do inner solves. > Also, if you have less compressible regions, it will get much worse. > >> Also, ksp/ksp/examples/ex56 performs much better in that the number >> of >> iterations remain more or less constant unlike what I see with my own >> problem. What am I doing wrong? > > You probably forgot to set the near null space. You can use > MatSetNearNullSpace (and maybe MatNullSpaceCreateRigidBody) or the > more > hacky (IMO) PCSetCoordinates. It's important to have translational > *and* rotational modes in the near null space that GAMG uses to > build a > coarse space. From bsmith at mcs.anl.gov Tue Aug 27 17:23:50 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 27 Aug 2013 17:23:50 -0500 Subject: [petsc-users] Setting block size for MPI vectors In-Reply-To: References: Message-ID: On Aug 27, 2013, at 4:57 PM, Vijay S. Mahadevan wrote: > Thanks Barry. Yes, I tried that sequence and it works. But curiously > if I call VecSetBlockSize after VecSetType, it does fails with the > same error. Yupe. Because the SetType() function is where the construction actually takes place (in C++ the object is instantiated). > > Would it be useful to take block size as a user input in VecCreateMPI > to be consistent with VecCreateMPIWithArray ? Just a thought. For now, > I will stick with the long winded way. We've debated back and forth this. The WithArray() version has to take the block size because it has to instantiate the vector at that point because it needs a place to put the array pointer. Our longer term plan is that most PETSc programs would not directly be creating Vecs and Mats with VecCreate? or MatCreate? rather they would create a DM and then use the DM object to create the appropriately laid out vectors and matrices for the users and solvers with DMCreateGlobalVector() etc. Barry > > Vijay > > On Tue, Aug 27, 2013 at 4:49 PM, Barry Smith wrote: >> >> You need to use the "long winded" construction of vectors to add these options. For example, >> >> VecCreate() >> VecSetBlockSize() >> VecSetSizes() >> VecSetType() >> >> the reason is that the Vec may be constructed differently based on these options but the VecCreateMPI() is suppose to produce an already constructed Vec. >> >> Barry >> >> >> On Aug 27, 2013, at 4:45 PM, "Vijay S. Mahadevan" wrote: >> >>> I am trying to create an MPI vector and then trying to set the block >>> size after the initial creation. But I receive an error in >>> PetscLayoutSetBlockSize. 
Since there is no explicit block size >>> argument during construction, I call VecSetBlockSize afterwards. The >>> relevant code and error are below. >>> >>> Is there an alternate call to set the block size to the vector so that >>> VecSetValuesBlocked will work correctly ? If there is no current way >>> to do this, that's fine and I can try to get the local array and set >>> the values accordingly. >>> >>> Code: >>> ierr = VecCreateMPI(PETSC_COMM_WORLD, 10, PETSC_DECIDE, &x);CHKERRQ(ierr); >>> ierr = VecSetBlockSize(x, 2);CHKERRQ(ierr); >>> ierr = VecSetFromOptions(x);CHKERRQ(ierr); >>> >>> Relevant errors: >>> >>> 0]PETSC ERROR: Arguments are incompatible! >>> [0]PETSC ERROR: Cannot change block size 1 to 2! >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Development GIT revision: >>> 0799ffdeffb47ef0458c1305293fae28c4d2cd92 GIT Date: 2013-08-01 >>> 06:36:16 +0800 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: ./ex1 on a arch-darwin-cxx-debug named >>> anlextwls002-168.wl.anl-external.org by mahadevan Tue Aug 27 16:37:18 >>> 2013 >>> [0]PETSC ERROR: Libraries linked from /opt/petsc-dbg/lib >>> [0]PETSC ERROR: Configure run at Wed Jul 31 18:03:46 2013 >>> [0]PETSC ERROR: Configure options --prefix=/opt/petsc-dbg >>> PETSC_ARCH=arch-darwin-cxx-debug --download-scalapack=1 >>> --download-blacs=1 >>> --with-mumps-dir=/Users/mahadevan/source/MUMPS_4.10.0-p3 >>> --download-hypre=1 --with-metis=1 --with-parmetis=1 >>> --known-mpi-shared-libraries=1 >>> --with-blas-lapack-lib=/System/Library/Frameworks/vecLib.framework/vecLib >>> --with-c++-support=1 --with-c-support=1 --with-cc=mpicc >>> --with-clanguage=C++ --with-moab-dir=/opt/moab >>> --with-dynamic-loading=1 --with-fc=mpif90 --with-fortran=1 >>> --with-mpi=1 --with-shared-libraries=1 --with-valgrind=1 >>> --with-valgrind-dir=/opt/local --with-cc=mpicc --with-cxx=mpicxx >>> COPTFLAGS="-g -fPIC" CXXOPTFLAGS="-g -fPIC" FOPTFLAGS="-g -fPIC" >>> --with-metis-dir=/usr/local --with-parmetis-dir=/usr/local >>> --with-netcdf-dir=/usr/local --with-zoltan=1 >>> --with-zoltan-lib="-L/usr/local/lib -lptscotch -lscotch -lscotchmetis >>> -lscotcherr -lscotcherrexit -lzoltan" >>> --with-zoltan-include=/usr/local/include --with-hdf5 >>> --with-hdf5-include=/usr/local/include >>> --with-hdf5-lib="-L/usr/local/lib -lhdf5_fortran -lhdf5" >>> --with-netcdf-dir=/usr/local --with-cmake=/opt/local/bin/cmake >>> --with-x-dir=/opt/local >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: PetscLayoutSetBlockSize() line 459 in >>> /Users/mahadevan/source/petsc-dev/src/vec/is/utils/pmap.c >>> [0]PETSC ERROR: VecSetBlockSize() line 1471 in >>> /Users/mahadevan/source/petsc-dev/src/vec/vec/interface/vector.c >>> [0]PETSC ERROR: main() line 32 in src/vec/vec/examples/tests/ex1.c >>> >>> Thanks, >>> Vijay >> From jedbrown at mcs.anl.gov Tue Aug 27 17:50:16 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 27 Aug 2013 17:50:16 -0500 Subject: [petsc-users] Setting block size for MPI vectors In-Reply-To: References: Message-ID: <87ioyq7npz.fsf@mcs.anl.gov> Barry Smith writes: > Our longer term plan is that most PETSc programs would not directly > be creating Vecs and 
Mats with VecCreate? or MatCreate? rather they > would create a DM and then use the DM object to create the > appropriately laid out vectors and matrices for the users and > solvers with DMCreateGlobalVector() etc. I don't think that's ever going to be a general solution, so "raw" Vec creation will always matter too. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From jedbrown at mcs.anl.gov Tue Aug 27 17:52:11 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 27 Aug 2013 17:52:11 -0500 Subject: [petsc-users] GAMG and linear elasticity In-Reply-To: <0FCE0B45-0DF1-44D8-A12C-C4FA5F9F541D@geology.wisc.edu> References: <521D01E6.7070003@geology.wisc.edu> <87bo4i99gx.fsf@mcs.anl.gov> <0FCE0B45-0DF1-44D8-A12C-C4FA5F9F541D@geology.wisc.edu> Message-ID: <87fvtu7nms.fsf@mcs.anl.gov> Tabrez Ali writes: > I had missed PCSetCoordinates in ex56. > > So is there a correspondence b/w the coordinates [0.x 0.y 0.z 1.x 1.y > 1.z ...] passed to PCSetCoordinates/MatNullSpaceCreateRigidBody and > the solution vector [0.ux 0.uy 0.uz ...]? MatNullSpaceCreateRigidBody assumes that you are in displacement form with this association. It just uses the coordinates to create the three translations and three rotations (in 3D). > What happens when I am solving the poroelasticity problem where there > is an additional pressure field associate with each node? Create your own near null space: 6 rigid body modes plus 1 constant pressure mode. Look at the code for MatNullSpaceCreateRigidBody for a start. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From vijay.m at gmail.com Tue Aug 27 18:21:48 2013 From: vijay.m at gmail.com (Vijay S. Mahadevan) Date: Tue, 27 Aug 2013 18:21:48 -0500 Subject: [petsc-users] Setting block size for MPI vectors In-Reply-To: <87ioyq7npz.fsf@mcs.anl.gov> References: <87ioyq7npz.fsf@mcs.anl.gov> Message-ID: > We've debated back and forth this. The WithArray() version has to take the block size because it has to instantiate the vector at that point because it needs a place to put the array pointer. Understood. But since VecCreateMPI is creating the vector completely too, should it not know the block size a-priori ? The long winded way is general enough that I like it but the specific API would be incomplete without giving this ability to the user IMO. > I don't think that's ever going to be a general solution, so "raw" Vec > creation will always matter too. And I agree with Jed. Even if DM helps make a user's life easier, there are so many applications that just want to create a matrix, vector and let petsc solve the system without needing to worry about creating a DM Wrapper and hop another indirection. On Tue, Aug 27, 2013 at 5:50 PM, Jed Brown wrote: > Barry Smith writes: >> Our longer term plan is that most PETSc programs would not directly >> be creating Vecs and Mats with VecCreate? or MatCreate? rather they >> would create a DM and then use the DM object to create the >> appropriately laid out vectors and matrices for the users and >> solvers with DMCreateGlobalVector() etc. > > I don't think that's ever going to be a general solution, so "raw" Vec > creation will always matter too. 
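For concreteness, a minimal sketch of that "long winded" sequence (the local
size of 10 and the block size of 2 are only illustrative, mirroring the
snippet earlier in the thread):

   Vec            x;
   PetscErrorCode ierr;

   ierr = VecCreate(PETSC_COMM_WORLD, &x);CHKERRQ(ierr);
   ierr = VecSetBlockSize(x, 2);CHKERRQ(ierr);            /* before VecSetType(), which instantiates the Vec */
   ierr = VecSetSizes(x, 10, PETSC_DECIDE);CHKERRQ(ierr); /* local size must be divisible by the block size */
   ierr = VecSetType(x, VECMPI);CHKERRQ(ierr);            /* or VecSetFromOptions(x) to choose the type at runtime */
   /* VecSetValuesBlocked() now inserts in blocks of 2 */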
From bsmith at mcs.anl.gov Tue Aug 27 18:35:51 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 27 Aug 2013 18:35:51 -0500 Subject: [petsc-users] Setting block size for MPI vectors In-Reply-To: References: <87ioyq7npz.fsf@mcs.anl.gov> Message-ID: <231D70CA-F9C1-453A-97AE-00B7F49575ED@mcs.anl.gov> On Aug 27, 2013, at 6:21 PM, "Vijay S. Mahadevan" wrote: >> We've debated back and forth this. The WithArray() version has to take the block size because it has to instantiate the vector at that point because it needs a place to put the array pointer. > > Understood. But since VecCreateMPI is creating the vector completely > too, should it not know the block size a-priori ? The long winded way > is general enough that I like it but the specific API would be > incomplete without giving this ability to the user IMO. Ok we add one more argument for this feature. But what about another feature, another argument? And another feature another argument? Each incremental addition sounds good, adding a bit more functionality to the create at a small cost. But after adding ten new arguments the cost is suddenly high and you are writing lapack. > >> I don't think that's ever going to be a general solution, so "raw" Vec >> creation will always matter too. > > And I agree with Jed. Even if DM helps make a user's life easier, > there are so many applications that just want to create a matrix, > vector and let petsc solve the system without needing to worry about > creating a DM Wrapper and hop another indirection. > > > On Tue, Aug 27, 2013 at 5:50 PM, Jed Brown wrote: >> Barry Smith writes: >>> Our longer term plan is that most PETSc programs would not directly >>> be creating Vecs and Mats with VecCreate? or MatCreate? rather they >>> would create a DM and then use the DM object to create the >>> appropriately laid out vectors and matrices for the users and >>> solvers with DMCreateGlobalVector() etc. >> >> I don't think that's ever going to be a general solution, so "raw" Vec >> creation will always matter too. From vijay.m at gmail.com Tue Aug 27 20:47:51 2013 From: vijay.m at gmail.com (Vijay S. Mahadevan) Date: Tue, 27 Aug 2013 20:47:51 -0500 Subject: [petsc-users] Setting block size for MPI vectors In-Reply-To: <231D70CA-F9C1-453A-97AE-00B7F49575ED@mcs.anl.gov> References: <87ioyq7npz.fsf@mcs.anl.gov> <231D70CA-F9C1-453A-97AE-00B7F49575ED@mcs.anl.gov> Message-ID: > Ok we add one more argument for this feature. But what about another feature, another argument? And another feature another argument? I am not asking to add more and more arguments. But since this exists in *WithArray routines, it would be nice to have uniformity. As I said, I like the more general route and this was a specific use case that I needed only recently. Hence the reason for the thread and the original confusion. On Tue, Aug 27, 2013 at 6:35 PM, Barry Smith wrote: > > On Aug 27, 2013, at 6:21 PM, "Vijay S. Mahadevan" wrote: > >>> We've debated back and forth this. The WithArray() version has to take the block size because it has to instantiate the vector at that point because it needs a place to put the array pointer. >> >> Understood. But since VecCreateMPI is creating the vector completely >> too, should it not know the block size a-priori ? The long winded way >> is general enough that I like it but the specific API would be >> incomplete without giving this ability to the user IMO. > > Ok we add one more argument for this feature. But what about another feature, another argument? And another feature another argument? 
Each incremental addition sounds good, adding a bit more functionality to
the create at a small cost. But after adding ten new arguments the cost is
suddenly high and you are writing LAPACK.
>
>>> I don't think that's ever going to be a general solution, so "raw" Vec
>>> creation will always matter too.
>>
>> And I agree with Jed. Even if DM helps make a user's life easier, there
>> are so many applications that just want to create a matrix and a vector
>> and let PETSc solve the system without needing to worry about creating a
>> DM Wrapper and hop another indirection.
>>
>> On Tue, Aug 27, 2013 at 5:50 PM, Jed Brown wrote:
>>> Barry Smith writes:
>>>> Our longer term plan is that most PETSc programs would not directly
>>>> be creating Vecs and Mats with VecCreate? or MatCreate? rather they
>>>> would create a DM and then use the DM object to create the
>>>> appropriately laid out vectors and matrices for the users and
>>>> solvers with DMCreateGlobalVector() etc.
>>>
>>> I don't think that's ever going to be a general solution, so "raw" Vec
>>> creation will always matter too.

From olivier.bonnefon at avignon.inra.fr Wed Aug 28 03:28:12 2013
From: olivier.bonnefon at avignon.inra.fr (Olivier Bonnefon)
Date: Wed, 28 Aug 2013 10:28:12 +0200
Subject: [petsc-users] distribute and cells mapping.
In-Reply-To: References: <52177321.4080900@avignon.inra.fr>
Message-ID: <521DB49C.6010101@avignon.inra.fr>

On 08/23/2013 04:42 PM, Matthew Knepley wrote:
> On Fri, Aug 23, 2013 at 9:35 AM, Olivier Bonnefon
> <olivier.bonnefon at avignon.inra.fr> wrote:
>
>> Hello,
>>
>> Thanks for your answers, I'm now able to import and distribute a mesh:
>
> You might simplify this to
>
> if (rank) {obNbCells = 0; obNbVertex = 0;}
> ierr = DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
>
>> if (!rank){
>>   ierr = DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
>>   for (i=0;i<...;i++){
>>     ierr =DMPlexSetLabelValue(*dm, "marker", obBoundary[i]+obNbCells, 1);CHKERRQ(ierr);
>>   }
>> }else {
>>   ierr = DMPlexCreateFromCellList(comm,dim,0,0,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr);
>> }
>>
>> ierr = DMPlexDistribute(*dm, partitioner, 0, &distributedMesh);CHKERRQ(ierr);
>> if (distributedMesh) {
>>   ierr = DMDestroy(dm);CHKERRQ(ierr);
>>   *dm = distributedMesh;
>> }
>>
>> Is it possible to know the resulting partition? That is, what is the
>> mapping between the initial cell number and the local cell (used in
>> DMPlexComputeResidualFEM)?
>> I need this to write an efficient implementation of the FEM struct
>> functions f0 and g0, which are space dependent.
>
> Yes, but I really do not think you want to do things that way. I am
> assuming you want different material models or something in different
> places. The way I envision that is using a DMLabel to mark up parts of
> the domain. All labels are automatically distributed with the mesh. Is
> that what you want?

Hello,

It is exactly what I need: I'm modeling a landscape, and the parameters of
the model depend on the type of crop. Therefore, I have created a label for
each type of crop and I have labeled each triangle with the corresponding
label:

for (i=0;i<obNbCells;i++){
  if (labelCells[i]==1){
    ierr =DMPlexSetLabelValue(*dm, "marker1", i, 1);CHKERRQ(ierr);
  }else{
    ierr =DMPlexSetLabelValue(*dm, "marker2", i, 1);CHKERRQ(ierr);
  }
}

So, I'm able to mark the triangles, but I'm not able to get this label in
the plugins "fem.f0Funcs" and "fem.g0Funcs": These plugins are called by
looping on the triangles in the function "FEMIntegrateResidualBatch", but
the dm is not available, so I can't use the functions DMPlexGetLabel,
DMLabelGetStratumSize and DMLabelGetStratumIS. What is the good way to get
the labels in the user plugins of the fem struct?

Thanks a lot for your help.

Olivier B

> Thanks,
>
> Matt
>
> Regards,
>
> Olivier B
>
> --
> Olivier Bonnefon
> INRA PACA-Avignon, Unité BioSP
> Tel: +33 (0)4 32 72 21 58
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener

--
Olivier Bonnefon
INRA PACA-Avignon, Unité BioSP
Tel: +33 (0)4 32 72 21 58

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jtravs at gmail.com Wed Aug 28 07:42:53 2013
From: jtravs at gmail.com (John Travers)
Date: Wed, 28 Aug 2013 14:42:53 +0200
Subject: [petsc-users] converting scipy sparse CSR matrix to petsc matrix with mpi
Message-ID: <9B30BF4C-13C0-4123-BAAA-11D476FDA35C@gmail.com>

Hi,

I currently generate PETSc matrices from scipy.sparse CSR format matrices
as follows (where A is a scipy sparse CSR matrix):

pA = PETSc.Mat().createAIJ(size=A.shape, csr=(A.indptr, A.indices, A.data))

This works correctly on sequential runs, but if I run under MPI I get an
error which I presume to be caused by the fact that all of my MPI processes
try to simultaneously create this matrix, rather than splitting it? E.g.
for 4 processes I get:

ValueError: size(I) is 32769, expected 8193
 csr=(A.indptr, A.indices, A.data))
 File "Mat.pyx", line 256, in petsc4py.PETSc.Mat.createAIJ (src/petsc4py.PETSc.c:82905)

What is the best/simplest/most efficient way to convert existing data in
CSR format to a parallel sparse PETSc matrix for use by multiple MPI
processes?

Thanks for your help!
John

From jedbrown at mcs.anl.gov Wed Aug 28 07:58:08 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Wed, 28 Aug 2013 07:58:08 -0500
Subject: [petsc-users] converting scipy sparse CSR matrix to petsc matrix with mpi
In-Reply-To: <9B30BF4C-13C0-4123-BAAA-11D476FDA35C@gmail.com>
References: <9B30BF4C-13C0-4123-BAAA-11D476FDA35C@gmail.com>
Message-ID: <87txia2crj.fsf@mcs.anl.gov>

John Travers writes:
> Hi,
>
> I currently generate PETSc matrices from scipy.sparse CSR format matrices
> as follows (where A is a scipy sparse CSR matrix):
>
> pA = PETSc.Mat().createAIJ(size=A.shape, csr=(A.indptr, A.indices, A.data))
>
> This works correctly on sequential runs, but if I run under MPI I get an
> error which I presume to be caused by the fact that all of my MPI
> processes try to simultaneously create this matrix, rather than splitting
> it? E.g. for 4 processes I get:

Yeah, the size of the passed CSR part doesn't match the local size of the
matrix. I think that given a range [rstart,rend), you can pass

 csr=(A.indptr[rstart:rend+1] - A.indptr[rstart],
      A.indices[A.indptr[rstart]:A.indptr[rend]],
      A.data[A.indptr[rstart]:A.indptr[rend]])

(note the rend+1: the local row-pointer array needs one more entry than the
number of local rows). More simply, you can just create the matrix and loop
over rows calling MatSetValues, but you have to do half of the spec above
to set the number of nonzeros per row if you want it to be fast.

Do you have to start with redundantly-computed scipy matrices? (That'll be
a scalability bottleneck. It's usually important to distribute computation
of the matrix entries unless you're only trying to use a few cores.)

> ValueError: size(I) is 32769, expected 8193
> csr=(A.indptr, A.indices, A.data))
> File "Mat.pyx", line 256, in petsc4py.PETSc.Mat.createAIJ (src/petsc4py.PETSc.c:82905)
>
> What is the best/simplest/most efficient way to convert existing data in
> CSR format to a parallel sparse PETSc matrix for use by multiple MPI
> processes?

Do you know what partition you want to use? Once you have a range of rows,
you should

-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From garnet.vaz at gmail.com Wed Aug 28 08:13:15 2013 From: garnet.vaz at gmail.com (Garnet Vaz) Date: Wed, 28 Aug 2013 06:13:15 -0700 Subject: [petsc-users] Optimized run crashes on one machine but not another Message-ID: Hi, I just rebuilt PETSc on both my laptop and my desktop. On both machines the output of >grep GIT configure.log Defined "VERSION_GIT" to ""d8f7425765acda418e23a679c25fd616d9da8153"" Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" My code runs on both machines in the debug build without causing any problems. When I try to run the optimized build, the code crashes with a SEGV fault on my laptop but not on the desktop. I have built PETSc using the same configure options. I have attached the outputs of valgrind for both my laptop/desktop for both the debug/opt builds. How can I figure out what differences are causing the errors in one case and not the other? Thanks. -- Regards, Garnet -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: laptopopt.log Type: application/octet-stream Size: 47712 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: desktopopt.log Type: application/octet-stream Size: 22340 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: desktopdebug.log Type: application/octet-stream Size: 20269 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: laptopdebug.log Type: application/octet-stream Size: 20462 bytes Desc: not available URL: From jedbrown at mcs.anl.gov Wed Aug 28 08:38:11 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 28 Aug 2013 08:38:11 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: Message-ID: <87ob8i2aws.fsf@mcs.anl.gov> Garnet Vaz writes: > Hi, > > I just rebuilt PETSc on both my laptop and my desktop. > On both machines the output of >grep GIT configure.log > Defined "VERSION_GIT" to > ""d8f7425765acda418e23a679c25fd616d9da8153"" > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" Thanks for the report. Matt just merged a bunch of DMPlex-related branches (about 60 commits in total). Can you 'git pull && make' to let us know if the problem is still there? (It may not fix the issue, but at least we'll be debugging current code.) When dealing with debug vs. optimized issues, it's useful to configure --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to include line numbers, but it (usually!) does not affect whether the error occurs. > My code runs on both machines in the debug build without causing > any problems. When I try to run the optimized build, the code crashes > with a SEGV fault on my laptop but not on the desktop. I have built > PETSc using the same configure options. > > I have attached the outputs of valgrind for both my laptop/desktop for > both the debug/opt builds. How can I figure out what differences are > causing the errors in one case and not the other? It looks like an uninitialized variable. Debug mode often ends up initializing local variables where as optimized leaves junk in them. Stack allocation alignment/padding is also often different. 
Unfortunately, valgrind is less powerful for debugging stack corruption, so the uninitialized warning is usually the best you get. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From olivier.bonnefon at avignon.inra.fr Wed Aug 28 10:01:53 2013 From: olivier.bonnefon at avignon.inra.fr (Olivier Bonnefon) Date: Wed, 28 Aug 2013 17:01:53 +0200 Subject: [petsc-users] distribute and cells mapping. In-Reply-To: <521DB49C.6010101@avignon.inra.fr> References: <52177321.4080900@avignon.inra.fr> <521DB49C.6010101@avignon.inra.fr> Message-ID: <521E10E1.50109@avignon.inra.fr> On 08/28/2013 10:28 AM, Olivier Bonnefon wrote: > On 08/23/2013 04:42 PM, Matthew Knepley wrote: >> On Fri, Aug 23, 2013 at 9:35 AM, Olivier Bonnefon >> > > wrote: >> >> Hello, >> >> Thanks for your answers, I'm now able to import and distribute a >> mesh: >> >> >> You might simplify this to >> >> if (rank) {obNbCells = 0; obNbVertex = 0;} >> ierr = >> DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >> >> if (!rank){ >> ierr = >> DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >> for (i=0;i> ierr =DMPlexSetLabelValue(*dm, "marker", >> obBoundary[i]+obNbCells, 1);CHKERRQ(ierr); >> } >> }else { >> ierr = >> DMPlexCreateFromCellList(comm,dim,0,0,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >> } >> >> ierr = DMPlexDistribute(*dm, partitioner, 0, >> &distributedMesh);CHKERRQ(ierr); >> if (distributedMesh) { >> ierr = DMDestroy(dm);CHKERRQ(ierr); >> *dm = distributedMesh; >> } >> >> Is it possible to known the resulting partition ? ie, What is the >> mapping between the initial cell number and the local cell (used >> in DMPlexComputeResidualFEM)? >> I need this to write an efficient implementation of the FEM >> struct functions f0 and g0, space depending. >> >> >> Yes, but I really do not think you want to do things that way. I am >> assuming you want different material models or something >> in different places. The way I envision that is using a DMLabel to >> mark up parts of the domain. All labels are automatically >> distributed with the mesh. Is that what you want? > Hello, > > It is exactly what I need: I'm mobilized a landscape, and the > parameters of the model depend of the type of crop. Therefore, I have > created a label for each type of crop and I have labeled each triangle > with the corresponding label: > > for (i=0;i if (labelCells[i]==1){ > ierr =DMPlexSetLabelValue(*dm, "marker1", i, 1);CHKERRQ(ierr); > }else{ > ierr =DMPlexSetLabelValue(*dm, "marker2", i, 1);CHKERRQ(ierr); > } > } > > So, I'm able to mark the triangles, but I'm not able to get this label > in the plugin "fem.f0Funcs" and "fem.g0Funcs": These plugins are > called by looping on the triangles in the function > "FEMIntegrateResidualBatch", but the dm is not available, so I can't > use the functions DMPlexGetLabel, DMLabelGetStratumSize and > DMLabelGetStratumIS. What is the good way to get the labels in the > user plugins of the fem struct ? > > > Thanks a lot for your help. 
> > Olivier B Hello, This is the solution I implemented to get the label level in the plugins "fem.f0Funcs" and "fem.g0Funcs": I need the DM and the index element, so i do: 1) I add some static variables: static DM * spDM[128]; static int scurElem[128]; 2) I overload the DMPlexComputeJacobianFEM with : PetscErrorCode MY_DMPlexComputeJacobianFEM(DM dm, Vec X, Mat Jac, Mat JacP, MatStructure *str,void *user) { PetscMPIInt rank; PetscErrorCode ierr; PetscFunctionBeginUser; ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr); spDM[rank]=&dm; PetscFunctionReturn(DMPlexComputeJacobianFEM(dm, X, Jac,JacP, str,user)); } 3) overload FEMIntegrateResidualBatch adding code: . . for (e = 0; e < Ne; ++e) { scurElem[rank]=e;//added ligne . . So that, I can get the label level using DMPlexHasLabel and DMLabelGetValue I'm sure this solution is awful, and works only in this version, but i didn't find a better way to get the label in the plugins fem struc. Do you know the correct way to do that ?? Thanks, Olivier B >> >> Thanks, >> >> Matt >> >> Regards, >> >> Olivier B >> >> -- >> Olivier Bonnefon >> INRA PACA-Avignon, Unit? BioSP >> Tel: +33 (0)4 32 72 21 58 >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which >> their experiments lead. >> -- Norbert Wiener > > > -- > Olivier Bonnefon > INRA PACA-Avignon, Unit? BioSP > Tel: +33 (0)4 32 72 21 58 -- Olivier Bonnefon INRA PACA-Avignon, Unit? BioSP Tel: +33 (0)4 32 72 21 58 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Aug 28 11:08:59 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 28 Aug 2013 11:08:59 -0500 Subject: [petsc-users] distribute and cells mapping. In-Reply-To: <521E10E1.50109@avignon.inra.fr> References: <52177321.4080900@avignon.inra.fr> <521DB49C.6010101@avignon.inra.fr> <521E10E1.50109@avignon.inra.fr> Message-ID: On Wed, Aug 28, 2013 at 10:01 AM, Olivier Bonnefon < olivier.bonnefon at avignon.inra.fr> wrote: > On 08/28/2013 10:28 AM, Olivier Bonnefon wrote: > > On 08/23/2013 04:42 PM, Matthew Knepley wrote: > > On Fri, Aug 23, 2013 at 9:35 AM, Olivier Bonnefon < > olivier.bonnefon at avignon.inra.fr> wrote: > >> Hello, >> >> Thanks for your answers, I'm now able to import and distribute a mesh: >> > > You might simplify this to > > if (rank) {obNbCells = 0; obNbVertex = 0;} > ierr = > DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); > > >> if (!rank){ >> ierr = >> DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >> for (i=0;i> ierr =DMPlexSetLabelValue(*dm, "marker", >> obBoundary[i]+obNbCells, 1);CHKERRQ(ierr); >> } >> }else { >> ierr = >> DMPlexCreateFromCellList(comm,dim,0,0,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >> } >> >> ierr = DMPlexDistribute(*dm, partitioner, 0, >> &distributedMesh);CHKERRQ(ierr); >> if (distributedMesh) { >> ierr = DMDestroy(dm);CHKERRQ(ierr); >> *dm = distributedMesh; >> } >> >> Is it possible to known the resulting partition ? ie, What is the mapping >> between the initial cell number and the local cell (used in >> DMPlexComputeResidualFEM)? >> I need this to write an efficient implementation of the FEM struct >> functions f0 and g0, space depending. >> > > Yes, but I really do not think you want to do things that way. I am > assuming you want different material models or something > in different places. 
The way I envision that is using a DMLabel to mark up parts of
> the domain. All labels are automatically distributed with the mesh. Is
> that what you want?
>
> Hello,
>
> It is exactly what I need: I'm modeling a landscape, and the parameters
> of the model depend on the type of crop. Therefore, I have created a
> label for each type of crop and I have labeled each triangle with the
> corresponding label:
>
> for (i=0;i<obNbCells;i++){
>   if (labelCells[i]==1){
>     ierr =DMPlexSetLabelValue(*dm, "marker1", i, 1);CHKERRQ(ierr);
>   }else{
>     ierr =DMPlexSetLabelValue(*dm, "marker2", i, 1);CHKERRQ(ierr);
>   }
> }
>
> So, I'm able to mark the triangles, but I'm not able to get this label
> in the plugins "fem.f0Funcs" and "fem.g0Funcs": These plugins are called
> by looping on the triangles in the function "FEMIntegrateResidualBatch",
> but the dm is not available, so I can't use the functions DMPlexGetLabel,
> DMLabelGetStratumSize and DMLabelGetStratumIS. What is the good way to
> get the labels in the user plugins of the fem struct?

So let's start with the abstract problem, so that I can see exactly what
you want to do. In ex12 (or ex62, etc.) I have a single equation, so I do a
loop over all cells. This loop takes place in DMPlexComputeResidualFEM().
You would instead like to do a few loops over sets of cells with different
material models, using different f0/f1. Is this correct?

> Thanks a lot for your help.
>
> Olivier B
>
> Hello,
>
> This is the solution I implemented to get the label level in the plugins
> "fem.f0Funcs" and "fem.g0Funcs":
>
> I need the DM and the element index, so I do:
> 1) I add some static variables:
> static DM * spDM[128];
> static int scurElem[128];

Notice that the DM is available in DMPlexComputeResidualFEM(). Here is what
the function does:

  a) Batches up elements into groups

  b) Integrates each group using a call to FEMIntegrateResidualBatch()

Notice that in 'next' this has changed to PetscFEIntegrateResidual(), since
we have added a few FEM classes to make things simpler and more flexible.
What you can do, I think, to get what you want is:

  a) Write a new MY_DMPlexComputeResidualFEM() that does a few loops. This
     is supplied to your app using

     ierr = DMSNESSetFunctionLocal(dm, (PetscErrorCode (*)(DM,Vec,Vec,void*)) MY_DMPlexComputeResidualFEM, &user);CHKERRQ(ierr);
     ierr = DMSNESSetJacobianLocal(dm, (PetscErrorCode (*)(DM,Vec,Mat,Mat,MatStructure*,void*)) MY_DMPlexComputeJacobianFEM, &user);CHKERRQ(ierr);

     just as in the examples. You could use a different f0/f1 for each
     loop.

  b) Write a new PetscFEIntegrateResidual() that does what you want. The
     easiest way to do this is to create a new PetscFE subclass, since
     these classes really only do one thing, namely these integrals. I can
     help you.

HOWEVER, if what you really want is to get coefficient information into
f0/f1, instead of using a different physical model, then you can do
something easier that we just put in. You can lay out a coefficient, like
nu in

  \div \nu \grad u = \rho

and provide a DM for \nu. This will be passed all the way down inside until
f0 gets

  f0Func(u, gradU, nu, gradNu, x, f0)

so that the pointwise values of your coefficient and its gradient are
available to your physics. I am sure there will be questions about this,
but the first thing to do is to get entirely clear what you want to do.
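As a very rough sketch of option (a) — the function name is the placeholder
from above, "marker1" is the label you already create, and all of the
batching and integration logic is elided — the custom loop can query the
label value cell by cell:

   PetscErrorCode MY_DMPlexComputeResidualFEM(DM dm, Vec X, Vec F, void *user)
   {
     DMLabel        label;
     PetscInt       cStart, cEnd, c, value;
     PetscErrorCode ierr;

     PetscFunctionBeginUser;
     ierr = DMPlexGetLabel(dm, "marker1", &label);CHKERRQ(ierr);
     ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells have height 0 */
     for (c = cStart; c < cEnd; ++c) {
       ierr = DMLabelGetValue(label, c, &value);CHKERRQ(ierr);
       /* group cells by value here, then integrate each group with its own f0/f1 */
     }
     PetscFunctionReturn(0);
   }

I believe DMLabelGetValue() gives you -1 for points that were never labeled,
so unmarked cells are easy to skip.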
Thanks, Matt 2) I overload the DMPlexComputeJacobianFEM with : > PetscErrorCode MY_DMPlexComputeJacobianFEM(DM dm, Vec X, Mat Jac, Mat > JacP, MatStructure *str,void *user) > { > > PetscMPIInt rank; > PetscErrorCode ierr; > > PetscFunctionBeginUser; > ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr); > spDM[rank]=&dm; > PetscFunctionReturn(DMPlexComputeJacobianFEM(dm, X, Jac,JacP, str,user)); > > } > 3) overload FEMIntegrateResidualBatch adding code: > . > . > for (e = 0; e < Ne; ++e) { > scurElem[rank]=e;//added ligne > . > . > > So that, I can get the label level using DMPlexHasLabel and > DMLabelGetValue > > I'm sure this solution is awful, and works only in this version, but i > didn't find a better way to get the label in the plugins fem struc. Do you > know the correct way to do that ?? > > Thanks, > > Olivier B > > > Thanks, > > Matt > > >> Regards, >> >> Olivier B >> >> -- >> Olivier Bonnefon >> INRA PACA-Avignon, Unit? BioSP >> Tel: +33 (0)4 32 72 21 58 <%2B33%20%280%294%2032%2072%2021%2058> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > -- > Olivier Bonnefon > INRA PACA-Avignon, Unit? BioSP > Tel: +33 (0)4 32 72 21 58 > > > > -- > Olivier Bonnefon > INRA PACA-Avignon, Unit? BioSP > Tel: +33 (0)4 32 72 21 58 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From garnet.vaz at gmail.com Wed Aug 28 12:52:08 2013 From: garnet.vaz at gmail.com (Garnet Vaz) Date: Wed, 28 Aug 2013 10:52:08 -0700 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: <87ob8i2aws.fsf@mcs.anl.gov> References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: Thanks Jed. I did as you told and the code finally crashes on both builds. I installed the 3.4.2 release now. The problem now seems to come from DMPlexDistribute(). I have two versions to load the mesh. One creates a mesh using Triangle from PETSc and the other loads a mesh using DMPlexCreateFromCellList(). Is the following piece of code for creating a mesh using Triangle right? ierr = DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); if (user->dm) { DM refinedMesh = NULL; DM distributedMesh = NULL; ierr = DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); ierr = DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); if (refinedMesh) { ierr = DMDestroy(&user->dm);CHKERRQ(ierr); user->dm = refinedMesh; } ierr = DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); if (distributedMesh) { ierr = DMDestroy(&user->dm);CHKERRQ(ierr); user->dm = distributedMesh; } } Using gdb, the code gives a SEGV during distribution. The backtrace when the fault occurs points to an invalid pointer for ISGetIndices(). Attached is a screenshot of the gdb backtrace. Do I need to set up some index set here? The same error occurs when trying to distribute a mesh using DMPlexCreateFromCellList(). Thanks for the help. - Garnet On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: > Garnet Vaz writes: > > > Hi, > > > > I just rebuilt PETSc on both my laptop and my desktop. 
> > On both machines the output of >grep GIT configure.log > > Defined "VERSION_GIT" to > > ""d8f7425765acda418e23a679c25fd616d9da8153"" > > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" > > Thanks for the report. Matt just merged a bunch of DMPlex-related > branches (about 60 commits in total). Can you 'git pull && make' to let > us know if the problem is still there? (It may not fix the issue, but > at least we'll be debugging current code.) > > When dealing with debug vs. optimized issues, it's useful to configure > --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to include > line numbers, but it (usually!) does not affect whether the error > occurs. > > > My code runs on both machines in the debug build without causing > > any problems. When I try to run the optimized build, the code crashes > > with a SEGV fault on my laptop but not on the desktop. I have built > > PETSc using the same configure options. > > > > I have attached the outputs of valgrind for both my laptop/desktop for > > both the debug/opt builds. How can I figure out what differences are > > causing the errors in one case and not the other? > > It looks like an uninitialized variable. Debug mode often ends up > initializing local variables where as optimized leaves junk in them. > Stack allocation alignment/padding is also often different. > Unfortunately, valgrind is less powerful for debugging stack corruption, > so the uninitialized warning is usually the best you get. > -- Regards, Garnet -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screenshot from 2013-08-28 10:31:25.png Type: image/png Size: 88274 bytes Desc: not available URL: From knepley at gmail.com Wed Aug 28 13:43:56 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 28 Aug 2013 13:43:56 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: On Wed, Aug 28, 2013 at 12:52 PM, Garnet Vaz wrote: > Thanks Jed. I did as you told and the code finally crashes on both > builds. I installed the 3.4.2 release now. > > The problem now seems to come from DMPlexDistribute(). I have two > versions to load the mesh. One creates a mesh using Triangle > from PETSc and the other loads a mesh using DMPlexCreateFromCellList(). > > Is the following piece of code for creating a mesh using Triangle right? > Okay, something is really very wrong here. It is calling EnlargePartition(), but for that path to be taken, you have to trip and earlier exception. It should not be possible to call it. So I think you have memory corruption somewhere. Can you send a sample code we can run? Thanks, Matt > ierr = DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); > if (user->dm) { > DM refinedMesh = NULL; > DM distributedMesh = NULL; > ierr = > DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); > ierr = DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); > if (refinedMesh) { > ierr = DMDestroy(&user->dm);CHKERRQ(ierr); > user->dm = refinedMesh; > } > ierr = > DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); > if (distributedMesh) { > ierr = DMDestroy(&user->dm);CHKERRQ(ierr); > user->dm = distributedMesh; > } > } > > Using gdb, the code gives a SEGV during distribution. The backtrace when > the fault > occurs points to an invalid pointer for ISGetIndices(). 
Attached is a > screenshot of the > gdb backtrace. > Do I need to set up some index set here? > > The same error occurs when trying to distribute a mesh using > DMPlexCreateFromCellList(). > > Thanks for the help. > > > - > Garnet > > > On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: > >> Garnet Vaz writes: >> >> > Hi, >> > >> > I just rebuilt PETSc on both my laptop and my desktop. >> > On both machines the output of >grep GIT configure.log >> > Defined "VERSION_GIT" to >> > ""d8f7425765acda418e23a679c25fd616d9da8153"" >> > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" >> >> Thanks for the report. Matt just merged a bunch of DMPlex-related >> branches (about 60 commits in total). Can you 'git pull && make' to let >> us know if the problem is still there? (It may not fix the issue, but >> at least we'll be debugging current code.) >> >> When dealing with debug vs. optimized issues, it's useful to configure >> --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to include >> line numbers, but it (usually!) does not affect whether the error >> occurs. >> >> > My code runs on both machines in the debug build without causing >> > any problems. When I try to run the optimized build, the code crashes >> > with a SEGV fault on my laptop but not on the desktop. I have built >> > PETSc using the same configure options. >> > >> > I have attached the outputs of valgrind for both my laptop/desktop for >> > both the debug/opt builds. How can I figure out what differences are >> > causing the errors in one case and not the other? >> >> It looks like an uninitialized variable. Debug mode often ends up >> initializing local variables where as optimized leaves junk in them. >> Stack allocation alignment/padding is also often different. >> Unfortunately, valgrind is less powerful for debugging stack corruption, >> so the uninitialized warning is usually the best you get. >> > > > > -- > Regards, > Garnet > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From garnet.vaz at gmail.com Wed Aug 28 13:58:44 2013 From: garnet.vaz at gmail.com (Garnet Vaz) Date: Wed, 28 Aug 2013 11:58:44 -0700 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: Hi Matt, Attached is a folder containing the code and a sample mesh. Thanks for the help. - Garnet On Wed, Aug 28, 2013 at 11:43 AM, Matthew Knepley wrote: > On Wed, Aug 28, 2013 at 12:52 PM, Garnet Vaz wrote: > >> Thanks Jed. I did as you told and the code finally crashes on both >> builds. I installed the 3.4.2 release now. >> >> The problem now seems to come from DMPlexDistribute(). I have two >> versions to load the mesh. One creates a mesh using Triangle >> from PETSc and the other loads a mesh using DMPlexCreateFromCellList(). >> >> Is the following piece of code for creating a mesh using Triangle right? >> > > Okay, something is really very wrong here. It is calling > EnlargePartition(), but for > that path to be taken, you have to trip and earlier exception. It should > not be possible > to call it. So I think you have memory corruption somewhere. > > Can you send a sample code we can run? 
> > Thanks, > > Matt > > >> ierr = DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); >> if (user->dm) { >> DM refinedMesh = NULL; >> DM distributedMesh = NULL; >> ierr = >> DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); >> ierr = DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); >> if (refinedMesh) { >> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >> user->dm = refinedMesh; >> } >> ierr = >> DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); >> if (distributedMesh) { >> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >> user->dm = distributedMesh; >> } >> } >> >> Using gdb, the code gives a SEGV during distribution. The backtrace when >> the fault >> occurs points to an invalid pointer for ISGetIndices(). Attached is a >> screenshot of the >> gdb backtrace. >> Do I need to set up some index set here? >> >> The same error occurs when trying to distribute a mesh using >> DMPlexCreateFromCellList(). >> >> Thanks for the help. >> >> >> - >> Garnet >> >> >> On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: >> >>> Garnet Vaz writes: >>> >>> > Hi, >>> > >>> > I just rebuilt PETSc on both my laptop and my desktop. >>> > On both machines the output of >grep GIT configure.log >>> > Defined "VERSION_GIT" to >>> > ""d8f7425765acda418e23a679c25fd616d9da8153"" >>> > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" >>> >>> Thanks for the report. Matt just merged a bunch of DMPlex-related >>> branches (about 60 commits in total). Can you 'git pull && make' to let >>> us know if the problem is still there? (It may not fix the issue, but >>> at least we'll be debugging current code.) >>> >>> When dealing with debug vs. optimized issues, it's useful to configure >>> --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to include >>> line numbers, but it (usually!) does not affect whether the error >>> occurs. >>> >>> > My code runs on both machines in the debug build without causing >>> > any problems. When I try to run the optimized build, the code crashes >>> > with a SEGV fault on my laptop but not on the desktop. I have built >>> > PETSc using the same configure options. >>> > >>> > I have attached the outputs of valgrind for both my laptop/desktop for >>> > both the debug/opt builds. How can I figure out what differences are >>> > causing the errors in one case and not the other? >>> >>> It looks like an uninitialized variable. Debug mode often ends up >>> initializing local variables where as optimized leaves junk in them. >>> Stack allocation alignment/padding is also often different. >>> Unfortunately, valgrind is less powerful for debugging stack corruption, >>> so the uninitialized warning is usually the best you get. >>> >> >> >> >> -- >> Regards, >> Garnet >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Regards, Garnet -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: sample.tar.gz Type: application/x-gzip Size: 16226 bytes Desc: not available URL: From knepley at gmail.com Wed Aug 28 14:51:42 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 28 Aug 2013 14:51:42 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: On Wed, Aug 28, 2013 at 1:58 PM, Garnet Vaz wrote: > Hi Matt, > > Attached is a folder containing the code and a sample mesh. > I have built and run it here with the 'next' branch from today, and it does not crash. What branch are you using? Matt > Thanks for the help. > > - > Garnet > > > On Wed, Aug 28, 2013 at 11:43 AM, Matthew Knepley wrote: > >> On Wed, Aug 28, 2013 at 12:52 PM, Garnet Vaz wrote: >> >>> Thanks Jed. I did as you told and the code finally crashes on both >>> builds. I installed the 3.4.2 release now. >>> >>> The problem now seems to come from DMPlexDistribute(). I have two >>> versions to load the mesh. One creates a mesh using Triangle >>> from PETSc and the other loads a mesh using DMPlexCreateFromCellList(). >>> >>> Is the following piece of code for creating a mesh using Triangle right? >>> >> >> Okay, something is really very wrong here. It is calling >> EnlargePartition(), but for >> that path to be taken, you have to trip and earlier exception. It should >> not be possible >> to call it. So I think you have memory corruption somewhere. >> >> Can you send a sample code we can run? >> >> Thanks, >> >> Matt >> >> >>> ierr = DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); >>> if (user->dm) { >>> DM refinedMesh = NULL; >>> DM distributedMesh = NULL; >>> ierr = >>> DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); >>> ierr = >>> DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); >>> if (refinedMesh) { >>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>> user->dm = refinedMesh; >>> } >>> ierr = >>> DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); >>> if (distributedMesh) { >>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>> user->dm = distributedMesh; >>> } >>> } >>> >>> Using gdb, the code gives a SEGV during distribution. The backtrace when >>> the fault >>> occurs points to an invalid pointer for ISGetIndices(). Attached is a >>> screenshot of the >>> gdb backtrace. >>> Do I need to set up some index set here? >>> >>> The same error occurs when trying to distribute a mesh using >>> DMPlexCreateFromCellList(). >>> >>> Thanks for the help. >>> >>> >>> - >>> Garnet >>> >>> >>> On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: >>> >>>> Garnet Vaz writes: >>>> >>>> > Hi, >>>> > >>>> > I just rebuilt PETSc on both my laptop and my desktop. >>>> > On both machines the output of >grep GIT configure.log >>>> > Defined "VERSION_GIT" to >>>> > ""d8f7425765acda418e23a679c25fd616d9da8153"" >>>> > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" >>>> >>>> Thanks for the report. Matt just merged a bunch of DMPlex-related >>>> branches (about 60 commits in total). Can you 'git pull && make' to let >>>> us know if the problem is still there? (It may not fix the issue, but >>>> at least we'll be debugging current code.) >>>> >>>> When dealing with debug vs. optimized issues, it's useful to configure >>>> --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to include >>>> line numbers, but it (usually!) does not affect whether the error >>>> occurs. >>>> >>>> > My code runs on both machines in the debug build without causing >>>> > any problems. 
When I try to run the optimized build, the code crashes >>>> > with a SEGV fault on my laptop but not on the desktop. I have built >>>> > PETSc using the same configure options. >>>> > >>>> > I have attached the outputs of valgrind for both my laptop/desktop for >>>> > both the debug/opt builds. How can I figure out what differences are >>>> > causing the errors in one case and not the other? >>>> >>>> It looks like an uninitialized variable. Debug mode often ends up >>>> initializing local variables where as optimized leaves junk in them. >>>> Stack allocation alignment/padding is also often different. >>>> Unfortunately, valgrind is less powerful for debugging stack corruption, >>>> so the uninitialized warning is usually the best you get. >>>> >>> >>> >>> >>> -- >>> Regards, >>> Garnet >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > Regards, > Garnet > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From garnet.vaz at gmail.com Wed Aug 28 15:04:05 2013 From: garnet.vaz at gmail.com (Garnet Vaz) Date: Wed, 28 Aug 2013 13:04:05 -0700 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: Hi Matt, I just built the 3.4.2 release in the hope that it will work. It was working fine for the 'next' branch until a recent update last night. I updated my laptop/desktop with a 1/2 hour gap which caused crashes in one build but not in the other. Hence, I moved to the 3.4.2 release. I will rebuild using the current 'next' and let you know if there are any problems. Thanks. - Garnet On Wed, Aug 28, 2013 at 12:51 PM, Matthew Knepley wrote: > On Wed, Aug 28, 2013 at 1:58 PM, Garnet Vaz wrote: > >> Hi Matt, >> >> Attached is a folder containing the code and a sample mesh. >> > > I have built and run it here with the 'next' branch from today, and it > does not crash. > What branch are you using? > > Matt > > >> Thanks for the help. >> >> - >> Garnet >> >> >> On Wed, Aug 28, 2013 at 11:43 AM, Matthew Knepley wrote: >> >>> On Wed, Aug 28, 2013 at 12:52 PM, Garnet Vaz wrote: >>> >>>> Thanks Jed. I did as you told and the code finally crashes on both >>>> builds. I installed the 3.4.2 release now. >>>> >>>> The problem now seems to come from DMPlexDistribute(). I have two >>>> versions to load the mesh. One creates a mesh using Triangle >>>> from PETSc and the other loads a mesh using DMPlexCreateFromCellList(). >>>> >>>> Is the following piece of code for creating a mesh using Triangle right? >>>> >>> >>> Okay, something is really very wrong here. It is calling >>> EnlargePartition(), but for >>> that path to be taken, you have to trip and earlier exception. It should >>> not be possible >>> to call it. So I think you have memory corruption somewhere. >>> >>> Can you send a sample code we can run? 
>>> >>> Thanks, >>> >>> Matt >>> >>> >>>> ierr = >>>> DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); >>>> if (user->dm) { >>>> DM refinedMesh = NULL; >>>> DM distributedMesh = NULL; >>>> ierr = >>>> DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); >>>> ierr = >>>> DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); >>>> if (refinedMesh) { >>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>> user->dm = refinedMesh; >>>> } >>>> ierr = >>>> DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); >>>> if (distributedMesh) { >>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>> user->dm = distributedMesh; >>>> } >>>> } >>>> >>>> Using gdb, the code gives a SEGV during distribution. The backtrace >>>> when the fault >>>> occurs points to an invalid pointer for ISGetIndices(). Attached is a >>>> screenshot of the >>>> gdb backtrace. >>>> Do I need to set up some index set here? >>>> >>>> The same error occurs when trying to distribute a mesh using >>>> DMPlexCreateFromCellList(). >>>> >>>> Thanks for the help. >>>> >>>> >>>> - >>>> Garnet >>>> >>>> >>>> On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: >>>> >>>>> Garnet Vaz writes: >>>>> >>>>> > Hi, >>>>> > >>>>> > I just rebuilt PETSc on both my laptop and my desktop. >>>>> > On both machines the output of >grep GIT configure.log >>>>> > Defined "VERSION_GIT" to >>>>> > ""d8f7425765acda418e23a679c25fd616d9da8153"" >>>>> > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" >>>>> >>>>> Thanks for the report. Matt just merged a bunch of DMPlex-related >>>>> branches (about 60 commits in total). Can you 'git pull && make' to >>>>> let >>>>> us know if the problem is still there? (It may not fix the issue, but >>>>> at least we'll be debugging current code.) >>>>> >>>>> When dealing with debug vs. optimized issues, it's useful to configure >>>>> --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to include >>>>> line numbers, but it (usually!) does not affect whether the error >>>>> occurs. >>>>> >>>>> > My code runs on both machines in the debug build without causing >>>>> > any problems. When I try to run the optimized build, the code crashes >>>>> > with a SEGV fault on my laptop but not on the desktop. I have built >>>>> > PETSc using the same configure options. >>>>> > >>>>> > I have attached the outputs of valgrind for both my laptop/desktop >>>>> for >>>>> > both the debug/opt builds. How can I figure out what differences are >>>>> > causing the errors in one case and not the other? >>>>> >>>>> It looks like an uninitialized variable. Debug mode often ends up >>>>> initializing local variables where as optimized leaves junk in them. >>>>> Stack allocation alignment/padding is also often different. >>>>> Unfortunately, valgrind is less powerful for debugging stack >>>>> corruption, >>>>> so the uninitialized warning is usually the best you get. >>>>> >>>> >>>> >>>> >>>> -- >>>> Regards, >>>> Garnet >>>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> >> >> -- >> Regards, >> Garnet >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Regards, Garnet -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Wed Aug 28 15:08:44 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 28 Aug 2013 15:08:44 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: On Wed, Aug 28, 2013 at 3:04 PM, Garnet Vaz wrote: > Hi Matt, > > I just built the 3.4.2 release in the hope that it will work. It was > working fine for the 'next' > branch until a recent update last night. I updated my laptop/desktop with > a 1/2 hour > gap which caused crashes in one build but not in the other. Hence, I moved > to the > 3.4.2 release. > > I will rebuild using the current 'next' and let you know if there are any > problems. > Can you send configure.log? I built against OpenMPI and it looks like a get a similar error which is not there with MPICH. Trying to confirm now. Matt > Thanks. > > - > Garnet > > > > On Wed, Aug 28, 2013 at 12:51 PM, Matthew Knepley wrote: > >> On Wed, Aug 28, 2013 at 1:58 PM, Garnet Vaz wrote: >> >>> Hi Matt, >>> >>> Attached is a folder containing the code and a sample mesh. >>> >> >> I have built and run it here with the 'next' branch from today, and it >> does not crash. >> What branch are you using? >> >> Matt >> >> >>> Thanks for the help. >>> >>> - >>> Garnet >>> >>> >>> On Wed, Aug 28, 2013 at 11:43 AM, Matthew Knepley wrote: >>> >>>> On Wed, Aug 28, 2013 at 12:52 PM, Garnet Vaz wrote: >>>> >>>>> Thanks Jed. I did as you told and the code finally crashes on both >>>>> builds. I installed the 3.4.2 release now. >>>>> >>>>> The problem now seems to come from DMPlexDistribute(). I have two >>>>> versions to load the mesh. One creates a mesh using Triangle >>>>> from PETSc and the other loads a mesh using DMPlexCreateFromCellList(). >>>>> >>>>> Is the following piece of code for creating a mesh using Triangle >>>>> right? >>>>> >>>> >>>> Okay, something is really very wrong here. It is calling >>>> EnlargePartition(), but for >>>> that path to be taken, you have to trip and earlier exception. It >>>> should not be possible >>>> to call it. So I think you have memory corruption somewhere. >>>> >>>> Can you send a sample code we can run? >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> ierr = >>>>> DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); >>>>> if (user->dm) { >>>>> DM refinedMesh = NULL; >>>>> DM distributedMesh = NULL; >>>>> ierr = >>>>> DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); >>>>> ierr = >>>>> DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); >>>>> if (refinedMesh) { >>>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>>> user->dm = refinedMesh; >>>>> } >>>>> ierr = >>>>> DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); >>>>> if (distributedMesh) { >>>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>>> user->dm = distributedMesh; >>>>> } >>>>> } >>>>> >>>>> Using gdb, the code gives a SEGV during distribution. The backtrace >>>>> when the fault >>>>> occurs points to an invalid pointer for ISGetIndices(). Attached is a >>>>> screenshot of the >>>>> gdb backtrace. >>>>> Do I need to set up some index set here? >>>>> >>>>> The same error occurs when trying to distribute a mesh using >>>>> DMPlexCreateFromCellList(). >>>>> >>>>> Thanks for the help. >>>>> >>>>> >>>>> - >>>>> Garnet >>>>> >>>>> >>>>> On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: >>>>> >>>>>> Garnet Vaz writes: >>>>>> >>>>>> > Hi, >>>>>> > >>>>>> > I just rebuilt PETSc on both my laptop and my desktop. 
>>>>>> > On both machines the output of >grep GIT configure.log >>>>>> > Defined "VERSION_GIT" to >>>>>> > ""d8f7425765acda418e23a679c25fd616d9da8153"" >>>>>> > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" >>>>>> >>>>>> Thanks for the report. Matt just merged a bunch of DMPlex-related >>>>>> branches (about 60 commits in total). Can you 'git pull && make' to >>>>>> let >>>>>> us know if the problem is still there? (It may not fix the issue, but >>>>>> at least we'll be debugging current code.) >>>>>> >>>>>> When dealing with debug vs. optimized issues, it's useful to configure >>>>>> --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to >>>>>> include >>>>>> line numbers, but it (usually!) does not affect whether the error >>>>>> occurs. >>>>>> >>>>>> > My code runs on both machines in the debug build without causing >>>>>> > any problems. When I try to run the optimized build, the code >>>>>> crashes >>>>>> > with a SEGV fault on my laptop but not on the desktop. I have built >>>>>> > PETSc using the same configure options. >>>>>> > >>>>>> > I have attached the outputs of valgrind for both my laptop/desktop >>>>>> for >>>>>> > both the debug/opt builds. How can I figure out what differences are >>>>>> > causing the errors in one case and not the other? >>>>>> >>>>>> It looks like an uninitialized variable. Debug mode often ends up >>>>>> initializing local variables where as optimized leaves junk in them. >>>>>> Stack allocation alignment/padding is also often different. >>>>>> Unfortunately, valgrind is less powerful for debugging stack >>>>>> corruption, >>>>>> so the uninitialized warning is usually the best you get. >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> Regards, >>>>> Garnet >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> Regards, >>> Garnet >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > Regards, > Garnet > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From garnet.vaz at gmail.com Wed Aug 28 15:32:07 2013 From: garnet.vaz at gmail.com (Garnet Vaz) Date: Wed, 28 Aug 2013 13:32:07 -0700 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: Hi Matt, I just ran git clone https://bitbucket.org/petsc/petsc and built the debug build. The code still crashes now with a slightly different back trace. It looks like a request for a large (wrong) amount of memory which could be from some unitialized value I have lying about. I will look into this some more. Attached is the configure.log file for my current build. - Garnet On Wed, Aug 28, 2013 at 1:08 PM, Matthew Knepley wrote: > On Wed, Aug 28, 2013 at 3:04 PM, Garnet Vaz wrote: > >> Hi Matt, >> >> I just built the 3.4.2 release in the hope that it will work. It was >> working fine for the 'next' >> branch until a recent update last night. I updated my laptop/desktop with >> a 1/2 hour >> gap which caused crashes in one build but not in the other. 
Hence, I >> moved to the >> 3.4.2 release. >> >> I will rebuild using the current 'next' and let you know if there are any >> problems. >> > > Can you send configure.log? I built against OpenMPI and it looks like a > get a similar error > which is not there with MPICH. Trying to confirm now. > > Matt > > >> Thanks. >> >> - >> Garnet >> >> >> >> On Wed, Aug 28, 2013 at 12:51 PM, Matthew Knepley wrote: >> >>> On Wed, Aug 28, 2013 at 1:58 PM, Garnet Vaz wrote: >>> >>>> Hi Matt, >>>> >>>> Attached is a folder containing the code and a sample mesh. >>>> >>> >>> I have built and run it here with the 'next' branch from today, and it >>> does not crash. >>> What branch are you using? >>> >>> Matt >>> >>> >>>> Thanks for the help. >>>> >>>> - >>>> Garnet >>>> >>>> >>>> On Wed, Aug 28, 2013 at 11:43 AM, Matthew Knepley wrote: >>>> >>>>> On Wed, Aug 28, 2013 at 12:52 PM, Garnet Vaz wrote: >>>>> >>>>>> Thanks Jed. I did as you told and the code finally crashes on both >>>>>> builds. I installed the 3.4.2 release now. >>>>>> >>>>>> The problem now seems to come from DMPlexDistribute(). I have two >>>>>> versions to load the mesh. One creates a mesh using Triangle >>>>>> from PETSc and the other loads a mesh using >>>>>> DMPlexCreateFromCellList(). >>>>>> >>>>>> Is the following piece of code for creating a mesh using Triangle >>>>>> right? >>>>>> >>>>> >>>>> Okay, something is really very wrong here. It is calling >>>>> EnlargePartition(), but for >>>>> that path to be taken, you have to trip and earlier exception. It >>>>> should not be possible >>>>> to call it. So I think you have memory corruption somewhere. >>>>> >>>>> Can you send a sample code we can run? >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> ierr = >>>>>> DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); >>>>>> if (user->dm) { >>>>>> DM refinedMesh = NULL; >>>>>> DM distributedMesh = NULL; >>>>>> ierr = >>>>>> DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); >>>>>> ierr = >>>>>> DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); >>>>>> if (refinedMesh) { >>>>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>>>> user->dm = refinedMesh; >>>>>> } >>>>>> ierr = >>>>>> DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); >>>>>> if (distributedMesh) { >>>>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>>>> user->dm = distributedMesh; >>>>>> } >>>>>> } >>>>>> >>>>>> Using gdb, the code gives a SEGV during distribution. The backtrace >>>>>> when the fault >>>>>> occurs points to an invalid pointer for ISGetIndices(). Attached is a >>>>>> screenshot of the >>>>>> gdb backtrace. >>>>>> Do I need to set up some index set here? >>>>>> >>>>>> The same error occurs when trying to distribute a mesh using >>>>>> DMPlexCreateFromCellList(). >>>>>> >>>>>> Thanks for the help. >>>>>> >>>>>> >>>>>> - >>>>>> Garnet >>>>>> >>>>>> >>>>>> On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: >>>>>> >>>>>>> Garnet Vaz writes: >>>>>>> >>>>>>> > Hi, >>>>>>> > >>>>>>> > I just rebuilt PETSc on both my laptop and my desktop. >>>>>>> > On both machines the output of >grep GIT configure.log >>>>>>> > Defined "VERSION_GIT" to >>>>>>> > ""d8f7425765acda418e23a679c25fd616d9da8153"" >>>>>>> > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 -0500"" >>>>>>> >>>>>>> Thanks for the report. Matt just merged a bunch of DMPlex-related >>>>>>> branches (about 60 commits in total). Can you 'git pull && make' to >>>>>>> let >>>>>>> us know if the problem is still there? 
(It may not fix the issue, >>>>>>> but >>>>>>> at least we'll be debugging current code.) >>>>>>> >>>>>>> When dealing with debug vs. optimized issues, it's useful to >>>>>>> configure >>>>>>> --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to >>>>>>> include >>>>>>> line numbers, but it (usually!) does not affect whether the error >>>>>>> occurs. >>>>>>> >>>>>>> > My code runs on both machines in the debug build without causing >>>>>>> > any problems. When I try to run the optimized build, the code >>>>>>> crashes >>>>>>> > with a SEGV fault on my laptop but not on the desktop. I have built >>>>>>> > PETSc using the same configure options. >>>>>>> > >>>>>>> > I have attached the outputs of valgrind for both my laptop/desktop >>>>>>> for >>>>>>> > both the debug/opt builds. How can I figure out what differences >>>>>>> are >>>>>>> > causing the errors in one case and not the other? >>>>>>> >>>>>>> It looks like an uninitialized variable. Debug mode often ends up >>>>>>> initializing local variables where as optimized leaves junk in them. >>>>>>> Stack allocation alignment/padding is also often different. >>>>>>> Unfortunately, valgrind is less powerful for debugging stack >>>>>>> corruption, >>>>>>> so the uninitialized warning is usually the best you get. >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> Regards, >>>>>> Garnet >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>>> >>>> -- >>>> Regards, >>>> Garnet >>>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> >> >> -- >> Regards, >> Garnet >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Regards, Garnet -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: confi.tar.gz Type: application/x-gzip Size: 338725 bytes Desc: not available URL: From knepley at gmail.com Wed Aug 28 16:02:31 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 28 Aug 2013 16:02:31 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: On Wed, Aug 28, 2013 at 3:32 PM, Garnet Vaz wrote: > Hi Matt, > > I just ran git clone https://bitbucket.org/petsc/petsc and built > the debug build. The code still crashes now with a slightly > different back trace. It looks like a request for a large (wrong) > amount of memory which could be from some unitialized value > I have lying about. I will look into this some more. > It would really help if you could track this down in the debugger. I am not getting that here. You would think I would get an unititialized report from the compiler. Thanks, Matt > Attached is the configure.log file for my current build. > > - > Garnet > > > > On Wed, Aug 28, 2013 at 1:08 PM, Matthew Knepley wrote: > >> On Wed, Aug 28, 2013 at 3:04 PM, Garnet Vaz wrote: >> >>> Hi Matt, >>> >>> I just built the 3.4.2 release in the hope that it will work. 
It was >>> working fine for the 'next' >>> branch until a recent update last night. I updated my laptop/desktop >>> with a 1/2 hour >>> gap which caused crashes in one build but not in the other. Hence, I >>> moved to the >>> 3.4.2 release. >>> >>> I will rebuild using the current 'next' and let you know if there are >>> any problems. >>> >> >> Can you send configure.log? I built against OpenMPI and it looks like a >> get a similar error >> which is not there with MPICH. Trying to confirm now. >> >> Matt >> >> >>> Thanks. >>> >>> - >>> Garnet >>> >>> >>> >>> On Wed, Aug 28, 2013 at 12:51 PM, Matthew Knepley wrote: >>> >>>> On Wed, Aug 28, 2013 at 1:58 PM, Garnet Vaz wrote: >>>> >>>>> Hi Matt, >>>>> >>>>> Attached is a folder containing the code and a sample mesh. >>>>> >>>> >>>> I have built and run it here with the 'next' branch from today, and it >>>> does not crash. >>>> What branch are you using? >>>> >>>> Matt >>>> >>>> >>>>> Thanks for the help. >>>>> >>>>> - >>>>> Garnet >>>>> >>>>> >>>>> On Wed, Aug 28, 2013 at 11:43 AM, Matthew Knepley wrote: >>>>> >>>>>> On Wed, Aug 28, 2013 at 12:52 PM, Garnet Vaz wrote: >>>>>> >>>>>>> Thanks Jed. I did as you told and the code finally crashes on both >>>>>>> builds. I installed the 3.4.2 release now. >>>>>>> >>>>>>> The problem now seems to come from DMPlexDistribute(). I have two >>>>>>> versions to load the mesh. One creates a mesh using Triangle >>>>>>> from PETSc and the other loads a mesh using >>>>>>> DMPlexCreateFromCellList(). >>>>>>> >>>>>>> Is the following piece of code for creating a mesh using Triangle >>>>>>> right? >>>>>>> >>>>>> >>>>>> Okay, something is really very wrong here. It is calling >>>>>> EnlargePartition(), but for >>>>>> that path to be taken, you have to trip and earlier exception. It >>>>>> should not be possible >>>>>> to call it. So I think you have memory corruption somewhere. >>>>>> >>>>>> Can you send a sample code we can run? >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> ierr = >>>>>>> DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); >>>>>>> if (user->dm) { >>>>>>> DM refinedMesh = NULL; >>>>>>> DM distributedMesh = NULL; >>>>>>> ierr = >>>>>>> DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); >>>>>>> ierr = >>>>>>> DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); >>>>>>> if (refinedMesh) { >>>>>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>>>>> user->dm = refinedMesh; >>>>>>> } >>>>>>> ierr = >>>>>>> DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); >>>>>>> if (distributedMesh) { >>>>>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>>>>> user->dm = distributedMesh; >>>>>>> } >>>>>>> } >>>>>>> >>>>>>> Using gdb, the code gives a SEGV during distribution. The backtrace >>>>>>> when the fault >>>>>>> occurs points to an invalid pointer for ISGetIndices(). Attached is >>>>>>> a screenshot of the >>>>>>> gdb backtrace. >>>>>>> Do I need to set up some index set here? >>>>>>> >>>>>>> The same error occurs when trying to distribute a mesh using >>>>>>> DMPlexCreateFromCellList(). >>>>>>> >>>>>>> Thanks for the help. >>>>>>> >>>>>>> >>>>>>> - >>>>>>> Garnet >>>>>>> >>>>>>> >>>>>>> On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: >>>>>>> >>>>>>>> Garnet Vaz writes: >>>>>>>> >>>>>>>> > Hi, >>>>>>>> > >>>>>>>> > I just rebuilt PETSc on both my laptop and my desktop. 
>>>>>>>> > On both machines the output of >grep GIT configure.log >>>>>>>> > Defined "VERSION_GIT" to >>>>>>>> > ""d8f7425765acda418e23a679c25fd616d9da8153"" >>>>>>>> > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 >>>>>>>> -0500"" >>>>>>>> >>>>>>>> Thanks for the report. Matt just merged a bunch of DMPlex-related >>>>>>>> branches (about 60 commits in total). Can you 'git pull && make' >>>>>>>> to let >>>>>>>> us know if the problem is still there? (It may not fix the issue, >>>>>>>> but >>>>>>>> at least we'll be debugging current code.) >>>>>>>> >>>>>>>> When dealing with debug vs. optimized issues, it's useful to >>>>>>>> configure >>>>>>>> --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to >>>>>>>> include >>>>>>>> line numbers, but it (usually!) does not affect whether the error >>>>>>>> occurs. >>>>>>>> >>>>>>>> > My code runs on both machines in the debug build without causing >>>>>>>> > any problems. When I try to run the optimized build, the code >>>>>>>> crashes >>>>>>>> > with a SEGV fault on my laptop but not on the desktop. I have >>>>>>>> built >>>>>>>> > PETSc using the same configure options. >>>>>>>> > >>>>>>>> > I have attached the outputs of valgrind for both my >>>>>>>> laptop/desktop for >>>>>>>> > both the debug/opt builds. How can I figure out what differences >>>>>>>> are >>>>>>>> > causing the errors in one case and not the other? >>>>>>>> >>>>>>>> It looks like an uninitialized variable. Debug mode often ends up >>>>>>>> initializing local variables where as optimized leaves junk in them. >>>>>>>> Stack allocation alignment/padding is also often different. >>>>>>>> Unfortunately, valgrind is less powerful for debugging stack >>>>>>>> corruption, >>>>>>>> so the uninitialized warning is usually the best you get. >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> Regards, >>>>>>> Garnet >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> Regards, >>>>> Garnet >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> Regards, >>> Garnet >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > Regards, > Garnet > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From garnet.vaz at gmail.com Wed Aug 28 16:49:49 2013 From: garnet.vaz at gmail.com (Garnet Vaz) Date: Wed, 28 Aug 2013 14:49:49 -0700 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: Hi Matt, Within gdb how can I view an IS? I tried 'call ISView(*partition,0)' following the VecView() syntax but it causes a segmentation fault inside gdb. 
- Garnet On Wed, Aug 28, 2013 at 2:02 PM, Matthew Knepley wrote: > On Wed, Aug 28, 2013 at 3:32 PM, Garnet Vaz wrote: > >> Hi Matt, >> >> I just ran git clone https://bitbucket.org/petsc/petsc and built >> the debug build. The code still crashes now with a slightly >> different back trace. It looks like a request for a large (wrong) >> amount of memory which could be from some unitialized value >> I have lying about. I will look into this some more. >> > > It would really help if you could track this down in the debugger. I am > not getting > that here. You would think I would get an unititialized report from the > compiler. > > Thanks, > > Matt > > >> Attached is the configure.log file for my current build. >> >> - >> Garnet >> >> >> >> On Wed, Aug 28, 2013 at 1:08 PM, Matthew Knepley wrote: >> >>> On Wed, Aug 28, 2013 at 3:04 PM, Garnet Vaz wrote: >>> >>>> Hi Matt, >>>> >>>> I just built the 3.4.2 release in the hope that it will work. It was >>>> working fine for the 'next' >>>> branch until a recent update last night. I updated my laptop/desktop >>>> with a 1/2 hour >>>> gap which caused crashes in one build but not in the other. Hence, I >>>> moved to the >>>> 3.4.2 release. >>>> >>>> I will rebuild using the current 'next' and let you know if there are >>>> any problems. >>>> >>> >>> Can you send configure.log? I built against OpenMPI and it looks like a >>> get a similar error >>> which is not there with MPICH. Trying to confirm now. >>> >>> Matt >>> >>> >>>> Thanks. >>>> >>>> - >>>> Garnet >>>> >>>> >>>> >>>> On Wed, Aug 28, 2013 at 12:51 PM, Matthew Knepley wrote: >>>> >>>>> On Wed, Aug 28, 2013 at 1:58 PM, Garnet Vaz wrote: >>>>> >>>>>> Hi Matt, >>>>>> >>>>>> Attached is a folder containing the code and a sample mesh. >>>>>> >>>>> >>>>> I have built and run it here with the 'next' branch from today, and it >>>>> does not crash. >>>>> What branch are you using? >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thanks for the help. >>>>>> >>>>>> - >>>>>> Garnet >>>>>> >>>>>> >>>>>> On Wed, Aug 28, 2013 at 11:43 AM, Matthew Knepley wrote: >>>>>> >>>>>>> On Wed, Aug 28, 2013 at 12:52 PM, Garnet Vaz wrote: >>>>>>> >>>>>>>> Thanks Jed. I did as you told and the code finally crashes on both >>>>>>>> builds. I installed the 3.4.2 release now. >>>>>>>> >>>>>>>> The problem now seems to come from DMPlexDistribute(). I have two >>>>>>>> versions to load the mesh. One creates a mesh using Triangle >>>>>>>> from PETSc and the other loads a mesh using >>>>>>>> DMPlexCreateFromCellList(). >>>>>>>> >>>>>>>> Is the following piece of code for creating a mesh using Triangle >>>>>>>> right? >>>>>>>> >>>>>>> >>>>>>> Okay, something is really very wrong here. It is calling >>>>>>> EnlargePartition(), but for >>>>>>> that path to be taken, you have to trip and earlier exception. It >>>>>>> should not be possible >>>>>>> to call it. So I think you have memory corruption somewhere. >>>>>>> >>>>>>> Can you send a sample code we can run? 
>>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> ierr = >>>>>>>> DMPlexCreateBoxMesh(comm,2,interpolate,&user->dm);CHKERRQ(ierr); >>>>>>>> if (user->dm) { >>>>>>>> DM refinedMesh = NULL; >>>>>>>> DM distributedMesh = NULL; >>>>>>>> ierr = >>>>>>>> DMPlexSetRefinementLimit(user->dm,refinementLimit);CHKERRQ(ierr); >>>>>>>> ierr = >>>>>>>> DMRefine(user->dm,PETSC_COMM_WORLD,&refinedMesh);CHKERRQ(ierr); >>>>>>>> if (refinedMesh) { >>>>>>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>>>>>> user->dm = refinedMesh; >>>>>>>> } >>>>>>>> ierr = >>>>>>>> DMPlexDistribute(user->dm,"chaco",1,&distributedMesh);CHKERRQ(ierr); >>>>>>>> if (distributedMesh) { >>>>>>>> ierr = DMDestroy(&user->dm);CHKERRQ(ierr); >>>>>>>> user->dm = distributedMesh; >>>>>>>> } >>>>>>>> } >>>>>>>> >>>>>>>> Using gdb, the code gives a SEGV during distribution. The backtrace >>>>>>>> when the fault >>>>>>>> occurs points to an invalid pointer for ISGetIndices(). Attached is >>>>>>>> a screenshot of the >>>>>>>> gdb backtrace. >>>>>>>> Do I need to set up some index set here? >>>>>>>> >>>>>>>> The same error occurs when trying to distribute a mesh using >>>>>>>> DMPlexCreateFromCellList(). >>>>>>>> >>>>>>>> Thanks for the help. >>>>>>>> >>>>>>>> >>>>>>>> - >>>>>>>> Garnet >>>>>>>> >>>>>>>> >>>>>>>> On Wed, Aug 28, 2013 at 6:38 AM, Jed Brown wrote: >>>>>>>> >>>>>>>>> Garnet Vaz writes: >>>>>>>>> >>>>>>>>> > Hi, >>>>>>>>> > >>>>>>>>> > I just rebuilt PETSc on both my laptop and my desktop. >>>>>>>>> > On both machines the output of >grep GIT configure.log >>>>>>>>> > Defined "VERSION_GIT" to >>>>>>>>> > ""d8f7425765acda418e23a679c25fd616d9da8153"" >>>>>>>>> > Defined "VERSION_DATE_GIT" to ""2013-08-27 10:05:35 >>>>>>>>> -0500"" >>>>>>>>> >>>>>>>>> Thanks for the report. Matt just merged a bunch of DMPlex-related >>>>>>>>> branches (about 60 commits in total). Can you 'git pull && make' >>>>>>>>> to let >>>>>>>>> us know if the problem is still there? (It may not fix the issue, >>>>>>>>> but >>>>>>>>> at least we'll be debugging current code.) >>>>>>>>> >>>>>>>>> When dealing with debug vs. optimized issues, it's useful to >>>>>>>>> configure >>>>>>>>> --with-debugging=0 COPTFLAGS='-O2 -g'. This allows valgrind to >>>>>>>>> include >>>>>>>>> line numbers, but it (usually!) does not affect whether the error >>>>>>>>> occurs. >>>>>>>>> >>>>>>>>> > My code runs on both machines in the debug build without causing >>>>>>>>> > any problems. When I try to run the optimized build, the code >>>>>>>>> crashes >>>>>>>>> > with a SEGV fault on my laptop but not on the desktop. I have >>>>>>>>> built >>>>>>>>> > PETSc using the same configure options. >>>>>>>>> > >>>>>>>>> > I have attached the outputs of valgrind for both my >>>>>>>>> laptop/desktop for >>>>>>>>> > both the debug/opt builds. How can I figure out what differences >>>>>>>>> are >>>>>>>>> > causing the errors in one case and not the other? >>>>>>>>> >>>>>>>>> It looks like an uninitialized variable. Debug mode often ends up >>>>>>>>> initializing local variables where as optimized leaves junk in >>>>>>>>> them. >>>>>>>>> Stack allocation alignment/padding is also often different. >>>>>>>>> Unfortunately, valgrind is less powerful for debugging stack >>>>>>>>> corruption, >>>>>>>>> so the uninitialized warning is usually the best you get. 
>>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> Regards, >>>>>>>> Garnet >>>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> Regards, >>>>>> Garnet >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>>> >>>> -- >>>> Regards, >>>> Garnet >>>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> >> >> -- >> Regards, >> Garnet >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Regards, Garnet -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Wed Aug 28 16:53:10 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 28 Aug 2013 16:53:10 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> Message-ID: <8738pt1nzt.fsf@mcs.anl.gov> Garnet Vaz writes: > Hi Matt, > > Within gdb how can I view an IS? I tried 'call ISView(*partition,0)' > following the VecView() syntax but it causes a segmentation fault > inside gdb. That'll work if you're passing an IS and it's not already corrupt. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From potaman at outlook.com Wed Aug 28 17:59:22 2013 From: potaman at outlook.com (subramanya sadasiva) Date: Wed, 28 Aug 2013 18:59:22 -0400 Subject: [petsc-users] Setting up a fieldsplit preconditioner using PETSc DM Message-ID: Hi, Are there any examples for setting up a fieldsplit preconditioner using PETSc DM given a field decomposition? Thanks, Subramanya -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Aug 28 18:20:31 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 28 Aug 2013 18:20:31 -0500 Subject: [petsc-users] Setting up a fieldsplit preconditioner using PETSc DM In-Reply-To: References: Message-ID: On Wed, Aug 28, 2013 at 5:59 PM, subramanya sadasiva wrote: > Hi, > Are there any examples for setting up a fieldsplit preconditioner using > PETSc DM given a field decomposition? > There are a bunch of fieldsplit options in my last tutorial, and in my talk from SIAM CS&E 13. Matt > Thanks, > Subramanya > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
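For an explicit field decomposition, one way to wire up PCFIELDSPLIT is through PCFieldSplitSetIS(). A minimal sketch, assuming the operator A and two index sets isU and isP describing the fields already exist (for instance from DMCreateFieldDecomposition()):

  KSP ksp;
  PC  pc;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
  /* the split names become the option prefixes -fieldsplit_u_* and -fieldsplit_p_* */
  ierr = PCFieldSplitSetIS(pc, "u", isU);CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "p", isP);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

The split solvers can then be chosen at run time, e.g. -pc_fieldsplit_type schur -fieldsplit_u_pc_type lu -fieldsplit_p_ksp_type gmres. If the DM itself knows its fields, attaching it with KSPSetDM(ksp, dm) and running with -pc_type fieldsplit lets the preconditioner pull the decomposition from the DM instead.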
URL: From knepley at gmail.com Wed Aug 28 21:22:54 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 28 Aug 2013 21:22:54 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: <8738pt1nzt.fsf@mcs.anl.gov> References: <87ob8i2aws.fsf@mcs.anl.gov> <8738pt1nzt.fsf@mcs.anl.gov> Message-ID: On Wed, Aug 28, 2013 at 4:53 PM, Jed Brown wrote: > Garnet Vaz writes: > > > Hi Matt, > > > > Within gdb how can I view an IS? I tried 'call ISView(*partition,0)' > > following the VecView() syntax but it causes a segmentation fault > > inside gdb. > > That'll work if you're passing an IS and it's not already corrupt. > Just to recap, I have run in both debug and optimized with complex, and through valgrind, and I get no problems on my machine (using downloaded MPICH). I think we will need you to narrow it down with the debugger there. Thanks, Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From garnet.vaz at gmail.com Thu Aug 29 01:45:53 2013 From: garnet.vaz at gmail.com (Garnet Vaz) Date: Wed, 28 Aug 2013 23:45:53 -0700 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> <8738pt1nzt.fsf@mcs.anl.gov> Message-ID: Hi Matt, I figured out what was causing the problems. I never had an optimized build built on my laptop before. So I did not run the reconfigure script but had to build from scratch. I recently changed my build script to keep only what was required for this project. By mistake, the --download-chaco was taken out. Hence, the code was running on all debug builds + optimized build on the desktop and crashed on my laptop. Once I rebuilt 3.4.2 it crashed for all builds since chaco was not available on any of them now. The IS of the partition returned here was corrupt. plex.c line 2503 if (1) { #if defined(PETSC_HAVE_CHACO) ierr = DMPlexPartition_Chaco(dm, numVertices, start, adjacency, partSection, partition);CHKERRQ(ierr); #endif } else { #if defined(PETSC_HAVE_PARMETIS) ierr = DMPlexPartition_ParMetis(dm, numVertices, start, adjacency, partSection, partition);CHKERRQ(ierr); #endif } I do have parmetis installed but I do not know why it did not work. Since you were able to run the code it helped a lot. The code runs on all builds/machines now without any problems once chaco was downloaded. Thanks a lot for the help. Regards, Garnet On Wed, Aug 28, 2013 at 7:22 PM, Matthew Knepley wrote: > On Wed, Aug 28, 2013 at 4:53 PM, Jed Brown wrote: > >> Garnet Vaz writes: >> >> > Hi Matt, >> > >> > Within gdb how can I view an IS? I tried 'call ISView(*partition,0)' >> > following the VecView() syntax but it causes a segmentation fault >> > inside gdb. >> >> That'll work if you're passing an IS and it's not already corrupt. >> > > Just to recap, I have run in both debug and optimized with complex, and > through valgrind, > and I get no problems on my machine (using downloaded MPICH). > > I think we will need you to narrow it down with the debugger there. > > Thanks, > > Matt > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Regards, Garnet -------------- next part -------------- An HTML attachment was scrubbed... 
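The hardcoded "if (1)" above is the bug: when PETSC_HAVE_CHACO is undefined, the Chaco branch compiles away to nothing, the partition IS is never set, and the caller proceeds with garbage. A sketch of the kind of compile-time guard that would turn the misconfiguration into an explicit error instead (the SETERRQ message here is illustrative, not the actual fix that went into 'next'):

  #if defined(PETSC_HAVE_CHACO)
    ierr = DMPlexPartition_Chaco(dm, numVertices, start, adjacency, partSection, partition);CHKERRQ(ierr);
  #elif defined(PETSC_HAVE_PARMETIS)
    ierr = DMPlexPartition_ParMetis(dm, numVertices, start, adjacency, partSection, partition);CHKERRQ(ierr);
  #else
    SETERRQ(PetscObjectComm((PetscObject)dm), PETSC_ERR_SUP, "Mesh partitioning requires PETSc configured with Chaco or ParMetis");
  #endif

As Jed notes below, the cleaner long-term answer is to make the partitioner a run-time choice rather than a compile-time #if.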
URL: From olivier.bonnefon at avignon.inra.fr Thu Aug 29 04:28:58 2013 From: olivier.bonnefon at avignon.inra.fr (Olivier Bonnefon) Date: Thu, 29 Aug 2013 11:28:58 +0200 Subject: [petsc-users] distribute and cells mapping. In-Reply-To: References: <52177321.4080900@avignon.inra.fr> <521DB49C.6010101@avignon.inra.fr> <521E10E1.50109@avignon.inra.fr> Message-ID: <521F145A.5050902@avignon.inra.fr> On 08/28/2013 06:08 PM, Matthew Knepley wrote: > On Wed, Aug 28, 2013 at 10:01 AM, Olivier Bonnefon > > wrote: > > On 08/28/2013 10:28 AM, Olivier Bonnefon wrote: >> On 08/23/2013 04:42 PM, Matthew Knepley wrote: >>> On Fri, Aug 23, 2013 at 9:35 AM, Olivier Bonnefon >>> >> > wrote: >>> >>> Hello, >>> >>> Thanks for your answers, I'm now able to import and >>> distribute a mesh: >>> >>> >>> You might simplify this to >>> >>> if (rank) {obNbCells = 0; obNbVertex = 0;} >>> ierr = >>> DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >>> >>> if (!rank){ >>> ierr = >>> DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >>> for (i=0;i>> ierr =DMPlexSetLabelValue(*dm, "marker", >>> obBoundary[i]+obNbCells, 1);CHKERRQ(ierr); >>> } >>> }else { >>> ierr = >>> DMPlexCreateFromCellList(comm,dim,0,0,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >>> } >>> >>> ierr = DMPlexDistribute(*dm, partitioner, 0, >>> &distributedMesh);CHKERRQ(ierr); >>> if (distributedMesh) { >>> ierr = DMDestroy(dm);CHKERRQ(ierr); >>> *dm = distributedMesh; >>> } >>> >>> Is it possible to known the resulting partition ? ie, What >>> is the mapping between the initial cell number and the local >>> cell (used in DMPlexComputeResidualFEM)? >>> I need this to write an efficient implementation of the FEM >>> struct functions f0 and g0, space depending. >>> >>> >>> Yes, but I really do not think you want to do things that way. I >>> am assuming you want different material models or something >>> in different places. The way I envision that is using a DMLabel >>> to mark up parts of the domain. All labels are automatically >>> distributed with the mesh. Is that what you want? >> Hello, >> >> It is exactly what I need: I'm mobilized a landscape, and the >> parameters of the model depend of the type of crop. Therefore, I >> have created a label for each type of crop and I have labeled >> each triangle with the corresponding label: >> >> for (i=0;i> if (labelCells[i]==1){ >> ierr =DMPlexSetLabelValue(*dm, "marker1", i, >> 1);CHKERRQ(ierr); >> }else{ >> ierr =DMPlexSetLabelValue(*dm, "marker2", i, >> 1);CHKERRQ(ierr); >> } >> } >> >> So, I'm able to mark the triangles, but I'm not able to get this >> label in the plugin "fem.f0Funcs" and "fem.g0Funcs": These >> plugins are called by looping on the triangles in the function >> "FEMIntegrateResidualBatch", but the dm is not available, so I >> can't use the functions DMPlexGetLabel, DMLabelGetStratumSize and >> DMLabelGetStratumIS. What is the good way to get the labels in >> the user plugins of the fem struct ? > > > So lets start with the abstract problem so that I can see exactly what > you want to do. In ex12 (or ex62, etc.) I have a single > equation, so I do a loop over all cells. This loop takes place in > DMPlexComputeResidualFEM(). You would instead like > to do a few loops over sets of cells with different material models, > using different f0/f1. Is this correct? > >> Thanks a lot for your help. 
>> Olivier B

> Hello,
> This is the solution I implemented to get the label level in the plugins
> "fem.f0Funcs" and "fem.g0Funcs":
>
> I need the DM and the index of the element, so I do:
> 1) I add some static variables:
> static DM * spDM[128];
> static int scurElem[128];
>
> Notice that the DM is available in DMPlexComputeResidualFEM(). Here is what the function does:
>
>   a) Batches up elements into groups
>
>   b) Integrates each group using a call to FEMIntegrateResidualBatch(). Notice that in 'next' this has changed to PetscFEIntegrateResidual() since we have added a few FEM classes to make things simpler and more flexible.
>
> What you can do, I think, to get what you want is:
>
>   a) Write a new MY_DMPlexComputeResidualFEM() to do a few loops. This is supplied to your app using
>
>   ierr = DMSNESSetFunctionLocal(dm, (PetscErrorCode (*)(DM,Vec,Vec,void*)) MY_DMPlexComputeResidualFEM, &user);CHKERRQ(ierr);
>   ierr = DMSNESSetJacobianLocal(dm, (PetscErrorCode (*)(DM,Vec,Mat,Mat,MatStructure*,void*)) MY_DMPlexComputeJacobianFEM, &user);CHKERRQ(ierr);
>
> just as in the examples. You could use different f0/f1 for each loop somehow.
>
>   b) Write a new PetscFEIntegrateResidual() that does what you want. The easiest way to do this is to create a new PetscFE subclass, since they only really do one thing, which is these integrals. I can help you.
>
> HOWEVER, if what you really want to do is get coefficient information into f0/f1 instead of a different physical model, then you can do something easier that we just put in. You can lay out a coefficient, like nu in
>
>   \div \nu \grad u = \rho
>
> and provide a DM for \nu. This will be passed all the way down inside until f0 gets
>
>   f0Func(u, gradU, nu, gradNu, x, f0)
>
> so that the pointwise values of your coefficient and its gradient are available to your physics.
>
> I am sure there will be questions about this, but the first thing to do is get entirely clear what you want to do.

Hello,

These are the 2D systems I want to simulate:

system 1:
0 = \div \rho(x) \grad u + r(x)*u(1-u)

corresponding to the stationary state of the time-dependent problem:

system 2:
\partial_t u = \div \rho(x) \grad u + r(x)*u(1-u)

It represents the diffusion of a species through a 2D space; u is the density of this species.

I also want to study the effect of a predator p:

system 3:
\partial_t u = \div \rho_u(x) \grad u + r(x)*u(1-u) - \beta(x) p*u
\partial_t p = \div \rho_p(x) \grad p - \delta(x) p + \gamma(x) p*u

I'm focused on (system 1).

About the geometry:
The geometry comes from a landscape composed of crops. There are different types of crops, and the functions (\rho(x), r(x), \delta(x), \beta(x)) depend on the type of crop.

I'm working from ex12.c. Therefore, the plugin functions are:

f0_u(u,grad_u,x,f0){ ... f0 = r(x)*u*(1-u) ... }
f1_u(u,grad_u,x,f1){ ... f1[comp*dim+d] = rho(x)*gradU[comp*dim+d]; ... }
g0_u(u,grad_u,x,g0){ ... g0 = -r(x)*(1-2*u) ... }
g3_uu(u,grad_u,x,g3){ ... g3[((compI*Ncomp+compI)*dim+d)*dim+d] = rho(x); ... }

For an efficient implementation of these plugins, I have to know the type of crop. If I understand your previous mail correctly, you propose that I define my own PetscFEM struct, adding my useful parameters (the crop type, for example), and overload the functions DMPlexComputeResidualFEM, DMPlexComputeJacobianFEM, FEMIntegrateResidualBatch, and FEMIntegrateJacobianActionBatch. I agree, thanks a lot.
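Concretely, a minimal sketch of what such crop-dependent pointwise functions could look like, assuming the f0_u(u, grad_u, x, f0)-style prototypes above (the exact signatures should be taken from the ex12 revision in use), with crop_r() and crop_rho() as hypothetical stand-ins for whatever mechanism (label lookup or coefficient field) eventually supplies the per-crop parameters:

  /* hypothetical per-crop coefficients, keyed on position only for illustration */
  static PetscReal crop_r(const PetscReal x[])   { return (x[0] < 0.5) ? 2.0 : 0.5; }
  static PetscReal crop_rho(const PetscReal x[]) { return (x[0] < 0.5) ? 1.0 : 0.1; }

  void f0_u(const PetscScalar u[], const PetscScalar gradU[], const PetscReal x[], PetscScalar f0[])
  {
    /* reaction term r(x)*u*(1-u) of system 1 */
    f0[0] = crop_r(x)*u[0]*(1.0 - u[0]);
  }

  void f1_u(const PetscScalar u[], const PetscScalar gradU[], const PetscReal x[], PetscScalar f1[])
  {
    const PetscInt dim = 2; /* the 2D landscape */
    PetscInt       d;

    /* flux term rho(x)*grad u of system 1 */
    for (d = 0; d < dim; ++d) f1[d] = crop_rho(x)*gradU[d];
  }

The Jacobian counterparts g0_u and g3_uu would reuse the same lookups, so the per-crop logic stays in one place.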
Regards, Olivier B > > Thanks, > > Matt > > 2) I overload the DMPlexComputeJacobianFEM with : > PetscErrorCode MY_DMPlexComputeJacobianFEM(DM dm, Vec X, Mat > Jac, Mat JacP, MatStructure *str,void *user) > { > > PetscMPIInt rank; > PetscErrorCode ierr; > > PetscFunctionBeginUser; > ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr); > spDM[rank]=&dm; > PetscFunctionReturn(DMPlexComputeJacobianFEM(dm, X, Jac,JacP, > str,user)); > > } > 3) overload FEMIntegrateResidualBatch adding code: > . > . > for (e = 0; e < Ne; ++e) { > scurElem[rank]=e;//added ligne > . > . > > So that, I can get the label level using DMPlexHasLabel and > DMLabelGetValue > > I'm sure this solution is awful, and works only in this version, > but i didn't find a better way to get the label in the plugins fem > struc. Do you know the correct way to do that ?? > > Thanks, > > Olivier B >>> >>> Thanks, >>> >>> Matt >>> >>> Regards, >>> >>> Olivier B >>> >>> -- >>> Olivier Bonnefon >>> INRA PACA-Avignon, Unit? BioSP >>> Tel: +33 (0)4 32 72 21 58 >>> >>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to >>> which their experiments lead. >>> -- Norbert Wiener >> >> >> -- >> Olivier Bonnefon >> INRA PACA-Avignon, Unit? BioSP >> Tel:+33 (0)4 32 72 21 58 > > > -- > Olivier Bonnefon > INRA PACA-Avignon, Unit? BioSP > Tel:+33 (0)4 32 72 21 58 > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -- Olivier Bonnefon INRA PACA-Avignon, Unit? BioSP Tel: +33 (0)4 32 72 21 58 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Aug 29 06:32:52 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 29 Aug 2013 06:32:52 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> <8738pt1nzt.fsf@mcs.anl.gov> Message-ID: On Thu, Aug 29, 2013 at 1:45 AM, Garnet Vaz wrote: > Hi Matt, > > I figured out what was causing the problems. I never had > an optimized build built on my laptop before. So I did not > run the reconfigure script but had to build from scratch. > > I recently changed my build script to keep only what was > required for this project. By mistake, the --download-chaco > was taken out. Hence, the code was running on all debug > builds + optimized build on the desktop and crashed on > my laptop. > > Once I rebuilt 3.4.2 it crashed for all builds since chaco > was not available on any of them now. The IS of > the partition returned here was corrupt. > plex.c line 2503 > Crap! Thats a bug. Thanks for finding it. I will fix it in next today. Matt > if (1) { > #if defined(PETSC_HAVE_CHACO) > ierr = DMPlexPartition_Chaco(dm, numVertices, start, adjacency, > partSection, partition);CHKERRQ(ierr); > #endif > } else { > #if defined(PETSC_HAVE_PARMETIS) > ierr = DMPlexPartition_ParMetis(dm, numVertices, start, adjacency, > partSection, partition);CHKERRQ(ierr); > #endif > } > I do have parmetis installed but I do not know why it did not > work. > > Since you were able to run the code it helped a lot. The > code runs on all builds/machines now without any problems > once chaco was downloaded. > > Thanks a lot for the help. 
> > Regards, > Garnet > > > > > On Wed, Aug 28, 2013 at 7:22 PM, Matthew Knepley wrote: > >> On Wed, Aug 28, 2013 at 4:53 PM, Jed Brown wrote: >> >>> Garnet Vaz writes: >>> >>> > Hi Matt, >>> > >>> > Within gdb how can I view an IS? I tried 'call ISView(*partition,0)' >>> > following the VecView() syntax but it causes a segmentation fault >>> > inside gdb. >>> >>> That'll work if you're passing an IS and it's not already corrupt. >>> >> >> Just to recap, I have run in both debug and optimized with complex, and >> through valgrind, >> and I get no problems on my machine (using downloaded MPICH). >> >> I think we will need you to narrow it down with the debugger there. >> >> Thanks, >> >> Matt >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > Regards, > Garnet > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Thu Aug 29 07:11:01 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 29 Aug 2013 07:11:01 -0500 Subject: [petsc-users] Optimized run crashes on one machine but not another In-Reply-To: References: <87ob8i2aws.fsf@mcs.anl.gov> <8738pt1nzt.fsf@mcs.anl.gov> Message-ID: <87hae8zoh6.fsf@mcs.anl.gov> Matthew Knepley writes: > Crap! Thats a bug. Thanks for finding it. I will fix it in next today. As always, the choice of partitioner should be a run-time choice so that we can add a test that uses Metis regardless of whether Chaco is installed. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 835 bytes Desc: not available URL: From jtravs at gmail.com Thu Aug 29 07:39:48 2013 From: jtravs at gmail.com (John Travers) Date: Thu, 29 Aug 2013 14:39:48 +0200 Subject: [petsc-users] converting scipy sparse CSR matrix to petcs matrix with mpi In-Reply-To: <87txia2crj.fsf@mcs.anl.gov> References: <9B30BF4C-13C0-4123-BAAA-11D476FDA35C@gmail.com> <87txia2crj.fsf@mcs.anl.gov> Message-ID: <4E2AF407-2405-49B1-BF62-6A7C3C2A2A3B@gmail.com> On 28 Aug 2013, at 14:58, Jed Brown wrote: > John Travers writes: > >> Hi, >> >> I currently generate PETSc matrices from scipy.sparse CSR format matrices as follows (where A is a scipy sparse CSR matrix): >> >> pA = PETSc.Mat().createAIJ(size=A.shape, csr=(A.indptr, A.indices, A.data)) >> >> This work correctly on sequential runs, but if I run under MPI I get an error which I presume to be caused by the fact that all of my MPI processes try to simultaneously create this matrix, rather than splitting it? Eg. for 4 processes I get: > > Yeah, the size of the passed CSR part doesn't match the local size of > the matrix. I think that given a range [rstart,rend), you can pass > > csr=(A.indptr[rstart:rend] - A.indptr[rstart], > A.indices[A.indptr[rstart]:A.indptr[rend]], > A.data[A.indptr[rstart]:A.indptr[rend]]) > Thanks, this works (except it should be A.indptr[rstart:rend+1] in the first line I think). > More simply, you can just create the matrix and loop over rows calling > MatSetValues, but you have to do half of the spec above to set the > number of nonzeros per row if you want it to be fast. > > Do you have to start with redundantly-computed scipy matrices? 
(That'll
> be a scalability bottleneck. It's usually important to distribute
> computation of the matrix entries unless you're only trying to use a few
> cores.)

Well, not in the long run, but at the moment that code is debugged and working and I only wanted to try the SLEPc solvers first.
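For reference, the same row slicing in C: a minimal sketch assuming, as in this thread, that every rank redundantly holds the full zero-based CSR arrays of an N x N matrix (Ai, Aj, Av, and N are hypothetical names). MatCreateMPIAIJWithArrays() copies its inputs, so the shifted row pointers can be freed afterwards:

  #include <petscmat.h>

  PetscErrorCode CreateFromRedundantCSR(MPI_Comm comm, PetscInt N, const PetscInt Ai[], const PetscInt Aj[], const PetscScalar Av[], Mat *A)
  {
    PetscErrorCode ierr;
    PetscInt       m = PETSC_DECIDE, rstart, rend, i, *li;

    PetscFunctionBegin;
    ierr = PetscSplitOwnership(comm, &m, &N);CHKERRQ(ierr);              /* default PETSc row layout */
    ierr = MPI_Scan(&m, &rend, 1, MPIU_INT, MPI_SUM, comm);CHKERRQ(ierr);
    rstart = rend - m;                                                   /* this rank owns rows [rstart, rend) */
    /* local row pointers must start at zero, hence the shift by Ai[rstart] */
    ierr = PetscMalloc((m+1)*sizeof(PetscInt), &li);CHKERRQ(ierr);
    for (i = 0; i <= m; ++i) li[i] = Ai[rstart+i] - Ai[rstart];
    ierr = MatCreateMPIAIJWithArrays(comm, m, PETSC_DECIDE, N, N, li, Aj + Ai[rstart], Av + Ai[rstart], A);CHKERRQ(ierr);
    ierr = PetscFree(li);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

This matches the rstart:rend+1 slicing above: each rank passes its own row block, with the row-pointer segment shifted so it begins at zero.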
+ x[1]*x[1]; ierr = MatSetValues(*B,2,rowcol,2,rowcol,&J[0][0],INSERT_VALUES);CHKERRQ(ierr); ierr = VecRestoreArray(X,&x);CHKERRQ(ierr); ierr = MatAssemblyBegin(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); ierr = MatAssemblyEnd(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); if (*A != *B) { ierr = MatAssemblyBegin(*B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); ierr = MatAssemblyEnd(*B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); } *flag = SAME_NONZERO_PATTERN; PetscFunctionReturn(0); } There's a PetscReal a. Is it a constant? And if the value of a known outside of the IJacobian function, saying the main() part. If it is, then as shown in this example, J[0][0] = a; J[0][1] = -1.; J[1][0] = 1.; are constant through all the iterations. How can I use MatRetrieveValues(*A) and MatStoreValues(*A) to reuse them? I'm asking this question because I have a large Jacobian matrix, but half of the matrix contains constant values. However, I don't know what's "PetscReal a" when I try to use it to compute my constant elements of Jacobian matrix which doesn't depend on the value of x at all. Thanks, Shuangshuang -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Aug 29 16:08:36 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 29 Aug 2013 16:08:36 -0500 Subject: [petsc-users] IJacobian "PetscReal a" In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD2C1@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD2C1@EMAIL04.pnl.gov> Message-ID: On Thu, Aug 29, 2013 at 4:01 PM, Jin, Shuangshuang < Shuangshuang.Jin at pnnl.gov> wrote: > Hello, when I look at IJacobian function in the example: * > http://www.mcs.anl.gov/petsc/petsc-current/src/ts/examples/tutorials/ex19.c > * > > static PetscErrorCode IJacobian(TS ts,PetscReal t,Vec X,Vec Xdot,PetscReal > a,Mat *A,Mat *B,MatStructure *flag,void *ctx) > { > PetscErrorCode ierr; > PetscInt rowcol[] = {0,1}; > PetscScalar *x,J[2][2]; > > PetscFunctionBeginUser; > ierr = VecGetArray(X,&x);CHKERRQ(ierr); > J[0][0] = a; J[0][1] = -1.; > J[1][0] = 1.; J[1][1] = -1. + x[1]*x[1]; > ierr = > MatSetValues(*B,2,rowcol,2,rowcol,&J[0][0],INSERT_VALUES);CHKERRQ(ierr); > ierr = VecRestoreArray(X,&x);CHKERRQ(ierr); > > ierr = MatAssemblyBegin(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); > ierr = MatAssemblyEnd(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); > if (*A != *B) { > ierr = MatAssemblyBegin(*B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); > ierr = MatAssemblyEnd(*B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); > } > *flag = SAME_NONZERO_PATTERN; > PetscFunctionReturn(0); > } > > There?s a PetscReal a. Is it a constant? And if the value of a known > outside of the IJacobian function, saying the main() part. > Its the shift for J_udot. See 6.1.1 in the manual. Matt > If it is, then as shown in this example, J[0][0] = a; J[0][1] = -1.; > J[1][0] = 1.; are constant through all the iterations. How can I use MatRetrieveValues(*A) > and MatStoreValues(*A) to reuse them? > > I?m asking this question because I have a large Jacobian matrix, but half > of the matrix contains constant values. However, I don?t know what?s > ?PetscReal a? when I try to use it to compute my constant elements of > Jacobian matrix which doesn?t depend on the value of x at all. > > Thanks, > Shuangshuang > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ztdepyahoo at 163.com Fri Aug 30 03:32:12 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Fri, 30 Aug 2013 16:32:12 +0800 (CST) Subject: [petsc-users] can i set values of vec again after vecassemblebegin and vecassembleend. Message-ID: <32f5c7fd.25152.140ce59e5e9.Coremail.ztdepyahoo@163.com> -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Fri Aug 30 03:59:45 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 30 Aug 2013 03:59:45 -0500 (CDT) Subject: [petsc-users] can i set values of vec again after vecassemblebegin and vecassembleend. In-Reply-To: <32f5c7fd.25152.140ce59e5e9.Coremail.ztdepyahoo@163.com> References: <32f5c7fd.25152.140ce59e5e9.Coremail.ztdepyahoo@163.com> Message-ID: yes. If setting values with VecSetValues - call VecAssemblyBegin/End again. Satish On Fri, 30 Aug 2013, ??? wrote: > > > > > > > > > > > > > From tribur at itis.ethz.ch Fri Aug 30 05:41:46 2013 From: tribur at itis.ethz.ch (Kathrin Burckhardt) Date: Fri, 30 Aug 2013 12:41:46 +0200 (CEST) Subject: [petsc-users] option values problem In-Reply-To: <603819225.2867768.1377859208170.JavaMail.root@itis.ethz.ch> Message-ID: <230776135.2891086.1377859306351.JavaMail.root@itis.ethz.ch> Hi petsc people, how can I check if the value to a specific option is valid before getting an error? E.g., if the option string contains 'options_table 2' I get [ERROR]: --------------------- Error Message ------------------------------------ [ERROR]: [0]PETSC ERROR: [INFO]: Invalid argument! [ERROR]: [0]PETSC ERROR: [INFO]: Unknown logical value: 2! Thank you in advance for your help, Kathrin From knepley at gmail.com Fri Aug 30 06:01:46 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 30 Aug 2013 06:01:46 -0500 Subject: [petsc-users] option values problem In-Reply-To: <230776135.2891086.1377859306351.JavaMail.root@itis.ethz.ch> References: <603819225.2867768.1377859208170.JavaMail.root@itis.ethz.ch> <230776135.2891086.1377859306351.JavaMail.root@itis.ethz.ch> Message-ID: On Fri, Aug 30, 2013 at 5:41 AM, Kathrin Burckhardt wrote: > Hi petsc people, > > how can I check if the value to a specific option is valid before getting > an error? > Right now the options table only holds strings. The conversion is done by the retrieval function, so there is no centralized knowledge about an a priori type of the option value. The failure can occur anywhere in the code where someone tries to get the value with a conflicting type. Thanks, Matt > E.g., if the option string contains 'options_table 2' I get > > [ERROR]: --------------------- Error Message > ------------------------------------ > [ERROR]: [0]PETSC ERROR: > [INFO]: Invalid argument! > [ERROR]: [0]PETSC ERROR: > [INFO]: Unknown logical value: 2! > > Thank you in advance for your help, > Kathrin > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Aug 30 08:47:39 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 30 Aug 2013 08:47:39 -0500 Subject: [petsc-users] distribute and cells mapping. 
In-Reply-To: <521F145A.5050902@avignon.inra.fr> References: <52177321.4080900@avignon.inra.fr> <521DB49C.6010101@avignon.inra.fr> <521E10E1.50109@avignon.inra.fr> <521F145A.5050902@avignon.inra.fr> Message-ID: On Thu, Aug 29, 2013 at 4:28 AM, Olivier Bonnefon < olivier.bonnefon at avignon.inra.fr> wrote: > On 08/28/2013 06:08 PM, Matthew Knepley wrote: > > On Wed, Aug 28, 2013 at 10:01 AM, Olivier Bonnefon < > olivier.bonnefon at avignon.inra.fr> wrote: > >> On 08/28/2013 10:28 AM, Olivier Bonnefon wrote: >> >> On 08/23/2013 04:42 PM, Matthew Knepley wrote: >> >> On Fri, Aug 23, 2013 at 9:35 AM, Olivier Bonnefon < >> olivier.bonnefon at avignon.inra.fr> wrote: >> >>> Hello, >>> >>> Thanks for your answers, I'm now able to import and distribute a mesh: >>> >> >> You might simplify this to >> >> if (rank) {obNbCells = 0; obNbVertex = 0;} >> ierr = >> DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >> >> >>> if (!rank){ >>> ierr = >>> DMPlexCreateFromCellList(comm,dim,obNbCells,obNbVertex,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >>> for (i=0;i>> ierr =DMPlexSetLabelValue(*dm, "marker", >>> obBoundary[i]+obNbCells, 1);CHKERRQ(ierr); >>> } >>> }else { >>> ierr = >>> DMPlexCreateFromCellList(comm,dim,0,0,3,0,obCells,2,obVertex,dm);CHKERRQ(ierr); >>> } >>> >>> ierr = DMPlexDistribute(*dm, partitioner, 0, >>> &distributedMesh);CHKERRQ(ierr); >>> if (distributedMesh) { >>> ierr = DMDestroy(dm);CHKERRQ(ierr); >>> *dm = distributedMesh; >>> } >>> >>> Is it possible to known the resulting partition ? ie, What is the >>> mapping between the initial cell number and the local cell (used in >>> DMPlexComputeResidualFEM)? >>> I need this to write an efficient implementation of the FEM struct >>> functions f0 and g0, space depending. >>> >> >> Yes, but I really do not think you want to do things that way. I am >> assuming you want different material models or something >> in different places. The way I envision that is using a DMLabel to mark >> up parts of the domain. All labels are automatically >> distributed with the mesh. Is that what you want? >> >> Hello, >> >> It is exactly what I need: I'm mobilized a landscape, and the parameters >> of the model depend of the type of crop. Therefore, I have created a label >> for each type of crop and I have labeled each triangle with the >> corresponding label: >> >> for (i=0;i> if (labelCells[i]==1){ >> ierr =DMPlexSetLabelValue(*dm, "marker1", i, 1);CHKERRQ(ierr); >> }else{ >> ierr =DMPlexSetLabelValue(*dm, "marker2", i, 1);CHKERRQ(ierr); >> } >> } >> >> So, I'm able to mark the triangles, but I'm not able to get this label in >> the plugin "fem.f0Funcs" and "fem.g0Funcs": These plugins are called by >> looping on the triangles in the function "FEMIntegrateResidualBatch", but >> the dm is not available, so I can't use the functions DMPlexGetLabel, >> DMLabelGetStratumSize and DMLabelGetStratumIS. What is the good way to get >> the labels in the user plugins of the fem struct ? >> >> > So lets start with the abstract problem so that I can see exactly what > you want to do. In ex12 (or ex62, etc.) I have a single > equation, so I do a loop over all cells. This loop takes place in > DMPlexComputeResidualFEM(). You would instead like > to do a few loops over sets of cells with different material models, using > different f0/f1. Is this correct? > > >> Thanks a lot for your help. 
>> >> Olivier B >> >> Hello, >> >> This is the solution I implemented to get the label level in the plugins >> "fem.f0Funcs" and "fem.g0Funcs": >> >> I need the DM and the index element, so i do: >> 1) I add some static variables: >> static DM * spDM[128]; >> static int scurElem[128]; >> > > Notice that the DM is available in DMPlexComputeResidualFEM(). Here is > what the function does: > > a) Batches up elements into groups > > b) Integrates each group using a call to FEMIntegrateResidualBatch(). > Notice that in 'next' this has > changed to PetscFEIntegrateResidual() since we have added a few FEM > classes to make things > simpler and more flexible. > > What you can do, I think, to get what you want is: > > a) Write a new MY_DMPlexComputeResidualFEM() to do a few loops. This > is supplied to your app using > > ierr = DMSNESSetFunctionLocal(dm, (PetscErrorCode > (*)(DM,Vec,Vec,void*)) MY_DMPlexComputeResidualFEM, &user);CHKERRQ(ierr); > ierr = DMSNESSetJacobianLocal(dm, (PetscErrorCode > (*)(DM,Vec,Mat,Mat,MatStructure*,void*)) MY_DMPlexComputeJacobianFEM, > &user);CHKERRQ(ierr); > > just as in the examples. You could use different f0/f1 for each loop > somehow. > > b) Write a new PetscFEIntegrateResidual() that does what you want. The > easiest way to do this is create > a new PetscFE subclass, since they only really do one thing which > is these integrals. I can help you. > > HOWEVER, if what you really want to do is get coefficient information into > f0/f1 instead of a different physical model, > then you can do something easier that we just put in. You can layout a > coefficient, like nu in > > \div \nu \grad u = \rho > > and provide a DM for \nu. This will be passed all the way down inside > until f0 gets > > f0Func(u, gradU, nu, gradNu, x, f0) > > so that the pointwise values of the your coefficient and its gradient > are available to your physics. > > I am sure there will be questions about this, but the first thing to do > is get entirely clear what you want to do. > > Hello, > > This is the 2D systems I want to simulate: > > system 1: > 0=\div \rho(x) \grad u + r(x)*u(1-u) > I have just pushed a version of this system to SNES ex12 (you have to use the 'next' branch). I see at least two options: a) You have rho(x) and r(x) as explicit functions This is easy since the f0,f1 functions get x explicitly. This is the -variable_coefficient analytic mode of ex12. b) You have another FEM field defining rho(x) and r(x) This is the -variable_coefficient field mode of ex12. You can use an element different from that for u is defining these coefficients. For example, I can use P0 for rho and P2 for u, or P1 for both. I can try and help you modify that example to solve this problem if you want, or you can continue to manage the lower level stuff since that gives you more flexibility. Thanks, Matt > corresponding to the stationary state of the time dependent problem: > system 2: > \partial_t u = \div \rho(x) \grad u + r(x)*u(1-u) > > It represents the diffusion of a specie throw a 2D space, u is the density > of this specie. > > I want also to study the effect of a predator p: > system 3: > \partial_t u = \div \rho_u(x) \grad u + r(x)*u(1-u) - \beta (x) p*u > \partial_t p = \div \rho_p(x) \grad p - \delta (x) p + \gamma (x) p*u > > I'm focused on the (system 1). > > About the geometry: > The geometry come from the landscape composed of crops. There are > different type of crops. The functions ( \rho(x), r(x), \delta (x), \beta > (x)) depend on this type of crops. 
> > I'm focused on the (system 1). I'm working from the ex12.c. Therefore, the > plungins functions are: > > f0_u(u,grad_u,x,f0){ > ... > f0= r(x)*u*(1-u) > ... > } > > f1_u(u,grad_u,x,f1){ > ... > f1[comp*dim+d] = rho(x)*gradU[comp*dim+d]; > ... > } > > g0_u(u,grad_u,x,g0){ > ... > g0=-r(x)*(1-2*u) > ... > } > > g3_uu(u,grad_u,x,g3){ > ... > g3[((compI*Ncomp+compI)*dim+d)*dim+d] = \rho(x); > ... > } > > For an efficient implementation of theses plugins, I have to know the type > of crop. If I well understand your previous mail, you propose to me to > defined my own struct PetscFEM adding my useful parameters (crop type for > example). I have to overload the functions DMPlexComputeResidualFEM, > DMPlexComputeJacobianFEM, FEMIntegrateResidualBatch, > FEMIntegrateJacobianActionBatch. I agree, thanks a lot. > > > Regards, > > Olivier B > > > Thanks, > > Matt > > 2) I overload the DMPlexComputeJacobianFEM with : >> PetscErrorCode MY_DMPlexComputeJacobianFEM(DM dm, Vec X, Mat Jac, Mat >> JacP, MatStructure *str,void *user) >> { >> >> PetscMPIInt rank; >> PetscErrorCode ierr; >> >> PetscFunctionBeginUser; >> ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr); >> spDM[rank]=&dm; >> PetscFunctionReturn(DMPlexComputeJacobianFEM(dm, X, Jac,JacP, str,user)); >> >> } >> 3) overload FEMIntegrateResidualBatch adding code: >> . >> . >> for (e = 0; e < Ne; ++e) { >> scurElem[rank]=e;//added ligne >> . >> . >> >> So that, I can get the label level using DMPlexHasLabel and >> DMLabelGetValue >> >> I'm sure this solution is awful, and works only in this version, but i >> didn't find a better way to get the label in the plugins fem struc. Do you >> know the correct way to do that ?? >> >> Thanks, >> >> Olivier B >> >> >> Thanks, >> >> Matt >> >> >>> Regards, >>> >>> Olivier B >>> >>> -- >>> Olivier Bonnefon >>> INRA PACA-Avignon, Unit? BioSP >>> Tel: +33 (0)4 32 72 21 58 <%2B33%20%280%294%2032%2072%2021%2058> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> >> >> -- >> Olivier Bonnefon >> INRA PACA-Avignon, Unit? BioSP >> Tel: +33 (0)4 32 72 21 58 >> >> >> >> -- >> Olivier Bonnefon >> INRA PACA-Avignon, Unit? BioSP >> Tel: +33 (0)4 32 72 21 58 >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > -- > Olivier Bonnefon > INRA PACA-Avignon, Unit? BioSP > Tel: +33 (0)4 32 72 21 58 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From boillod.france at gmail.com Fri Aug 30 13:04:00 2013 From: boillod.france at gmail.com (France Boillod-Cerneux) Date: Fri, 30 Aug 2013 20:04:00 +0200 Subject: [petsc-users] parallel matmult flop measure Message-ID: Dear PETSc user, I am using MatMult() function in a C++ program. Currently, my matrix is MATMPIAIJ format type, and I load the matrix from a binary file at petsc format. let's call the matrix A: I know before the execution the global rows, columns and nnz. 
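One way to recover the crop type per cell inside a custom loop of the kind discussed above, sketched against the petsc-3.4 plex API used in this thread; the label name "croptype" and the coefficient tables are made-up placeholders. Using a single label whose value encodes the crop type, rather than one label per crop, needs only one lookup per cell, and labels set before DMPlexDistribute() travel with the distributed mesh.

PetscErrorCode SetCellCoefficients(DM dm)
{
  PetscErrorCode  ierr;
  PetscInt        cStart,cEnd,c,crop;
  const PetscReal rhoByCrop[] = {1.0,0.5,2.0}; /* placeholder per-crop diffusivities */
  const PetscReal rByCrop[]   = {0.1,0.3,0.2}; /* placeholder per-crop growth rates  */
  PetscReal       rho,r;

  PetscFunctionBeginUser;
  ierr = DMPlexGetHeightStratum(dm,0,&cStart,&cEnd);CHKERRQ(ierr); /* cells are height 0 */
  for (c = cStart; c < cEnd; ++c) {
    ierr = DMPlexGetLabelValue(dm,"croptype",c,&crop);CHKERRQ(ierr); /* -1 if unlabeled */
    if (crop < 0) crop = 0;
    rho = rhoByCrop[crop];
    r   = rByCrop[crop];
    /* rho and r now parameterize the residual/Jacobian integration on cell c */
    (void)rho; (void)r;
  }
  PetscFunctionReturn(0);
}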
my program is doing something like: global loop //begin to measure flop rate inner loop MatMult(A,x,y) end inner loop //want to know Flop rate of Matmult at this point end global loop I am looking for a function to know at runtime the flop rate for each process regarding matmult function. I had a look at pdf of Loic Gouarin (introduction to pertsc, performance May 2013) but it does not correspond to what i want, or i missunderstood? My other solution is using the matmpiaijsetpreallocation and therefore know the complete parallel distribution of A, but this implies a pre-treatment on my matrix A. I was wondering if i can create my matrix A without knowing in advance the parallel distribution and then collect the information about parallel distribution? Or more easier, if a function like PetscGetFlops could solve my problem? so far i understood that this function measure the flop since the begining of program, but this is not what i want to use, i want to have the flop ratio right after ending the inner loop, and this, for each global iteration Any ideas/suggestions would help, Thank you very much, France -- Bien cordialement - Best regards - Mit freundlichen Gr??en, *France BOILLOD-CERNEUX * *PhD Student, Laboratoire d'Informatique Fondamentale de Lille * *(LIFL), CNRS* France.Boillod-Cerneux at cea.fr Tel. : +33 (0) 1 6908 - 9527 Tel. : +33 (0) 6 4781 - 3059 DEN/DANS/DM2S CEA Saclay 91191 Gif-sur-Yvette FRANCE http://www.lifl.fr/ www-centre-saclay.cea.fr -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Aug 30 13:08:37 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 30 Aug 2013 13:08:37 -0500 Subject: [petsc-users] parallel matmult flop measure In-Reply-To: References: Message-ID: On Aug 30, 2013, at 1:04 PM, France Boillod-Cerneux wrote: > Dear PETSc user, > > I am using MatMult() function in a C++ program. > > Currently, my matrix is MATMPIAIJ format type, and I load the matrix from a binary file at petsc format. let's call the matrix A: > > I know before the execution the global rows, columns and nnz. > > my program is doing something like: > PetscLogDouble prevflops,myflops; > global loop > //begin to measure flop rate PetscGetFlops(&prevFlops); > inner loop > MatMult(A,x,y) > end inner loop PetscGetFlops(&myflops); my flops -= prevFlops; /* each MPI process has how many flops IT has done > //want to know Flop rate of Matmult at this point > end global loop > > I am looking for a function to know at runtime the flop rate for each process regarding matmult function. > > I had a look at pdf of Loic Gouarin (introduction to pertsc, performance May 2013) but it does not correspond to what i want, or i missunderstood? > > My other solution is using the matmpiaijsetpreallocation and therefore know the complete parallel distribution of A, but this implies a pre-treatment on my matrix A. > > I was wondering if i can create my matrix A without knowing in advance the parallel distribution and then collect the information about parallel distribution? > > Or more easier, if a function like PetscGetFlops could solve my problem? 
> so far i understood that this function measure the flop since the begining of program, but this is not what i want to use, i want to have the flop ratio right after ending the inner loop, and this, for each global iteration > > Any ideas/suggestions would help, > > Thank you very much, > > France > > > -- > Bien cordialement - Best regards - Mit freundlichen Gr??en, > > France BOILLOD-CERNEUX > PhD Student, Laboratoire d'Informatique Fondamentale de Lille > (LIFL), CNRS > > France.Boillod-Cerneux at cea.fr > Tel. : +33 (0) 1 6908 - 9527 > Tel. : +33 (0) 6 4781 - 3059 > > DEN/DANS/DM2S > CEA Saclay > 91191 Gif-sur-Yvette > FRANCE > > http://www.lifl.fr/ > www-centre-saclay.cea.fr > From knepley at gmail.com Fri Aug 30 13:13:47 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 30 Aug 2013 13:13:47 -0500 Subject: [petsc-users] parallel matmult flop measure In-Reply-To: References: Message-ID: On Fri, Aug 30, 2013 at 1:04 PM, France Boillod-Cerneux < boillod.france at gmail.com> wrote: > Dear PETSc user, > > I am using MatMult() function in a C++ program. > > Currently, my matrix is MATMPIAIJ format type, and I load the matrix from > a binary file at petsc format. let's call the matrix A: > > I know before the execution the global rows, columns and nnz. > > my program is doing something like: > > global loop > //begin to measure flop rate > inner loop > MatMult(A,x,y) > end inner loop > //want to know Flop rate of Matmult at this point > end global loop > > I am looking for a function to know at runtime the flop rate for each > process regarding matmult function. > You can do what Barry suggests, however there might be intervening work, so you can be more precise using: PetscStageLog stageLog; PetscEventPerfLog eventLog; PetscInt stage = 0; PetscLogEvent event; PetscEventPerfInfo eventInfo; PetscLogGetStageLog(&stageLog); PetscStageLogGetEventPerfLog(stageLog, stage, &eventLog); eventInfo = eventLog->eventInfo[MAT_Mult]; Thanks, Matt I had a look at pdf of Loic Gouarin (introduction to pertsc, performance > May 2013) but it does not correspond to what i want, or i missunderstood? > > My other solution is using the matmpiaijsetpreallocation and therefore > know the complete parallel distribution of A, but this implies a > pre-treatment on my matrix A. > > I was wondering if i can create my matrix A without knowing in advance the > parallel distribution and then collect the information about parallel > distribution? > > Or more easier, if a function like PetscGetFlops could solve my problem? > so far i understood that this function measure the flop since the begining > of program, but this is not what i want to use, i want to have the flop > ratio right after ending the inner loop, and this, for each global iteration > > Any ideas/suggestions would help, > > Thank you very much, > > France > > > -- > Bien cordialement - Best regards - Mit freundlichen Gr??en, > > *France BOILLOD-CERNEUX * > *PhD Student, Laboratoire d'Informatique Fondamentale de Lille * > *(LIFL), CNRS* > > France.Boillod-Cerneux at cea.fr > Tel. : +33 (0) 1 6908 - 9527 > Tel. : +33 (0) 6 4781 - 3059 > > DEN/DANS/DM2S > CEA Saclay > 91191 Gif-sur-Yvette > FRANCE > > http://www.lifl.fr/ > www-centre-saclay.cea.fr > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
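Matt's fragment, filled out into a self-contained form: a sketch assuming the petsc-3.4 internals exposed by petsclog.h and that logging is active (run with -log_summary or call PetscLogBegin()). The fields read here are this process's accumulated totals for MatMult, so differencing two readings taken around the inner loop gives a per-loop, per-process rate.

#include <petscsys.h>
#include <petsclog.h>

PetscErrorCode GetMatMultRate(PetscLogDouble *rate)
{
  PetscErrorCode     ierr;
  PetscStageLog      stageLog;
  PetscEventPerfLog  eventLog;
  PetscEventPerfInfo eventInfo;
  PetscInt           stage = 0; /* the default "Main Stage" */

  PetscFunctionBeginUser;
  ierr = PetscLogGetStageLog(&stageLog);CHKERRQ(ierr);
  ierr = PetscStageLogGetEventPerfLog(stageLog,stage,&eventLog);CHKERRQ(ierr);
  eventInfo = eventLog->eventInfo[MAT_Mult]; /* this rank's totals for MatMult so far */
  *rate = (eventInfo.time > 0.0) ? eventInfo.flops/eventInfo.time : 0.0;
  PetscFunctionReturn(0);
}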
URL: From boillod.france at gmail.com Fri Aug 30 13:20:13 2013 From: boillod.france at gmail.com (France Boillod-Cerneux) Date: Fri, 30 Aug 2013 20:20:13 +0200 Subject: [petsc-users] parallel matmult flop measure In-Reply-To: References: Message-ID: thank you very much to both of you! On Fri, Aug 30, 2013 at 8:13 PM, Matthew Knepley wrote: > On Fri, Aug 30, 2013 at 1:04 PM, France Boillod-Cerneux < > boillod.france at gmail.com> wrote: > >> Dear PETSc user, >> >> I am using MatMult() function in a C++ program. >> >> Currently, my matrix is MATMPIAIJ format type, and I load the matrix >> from a binary file at petsc format. let's call the matrix A: >> >> I know before the execution the global rows, columns and nnz. >> >> my program is doing something like: >> >> global loop >> //begin to measure flop rate >> inner loop >> MatMult(A,x,y) >> end inner loop >> //want to know Flop rate of Matmult at this point >> end global loop >> >> I am looking for a function to know at runtime the flop rate for each >> process regarding matmult function. >> > > You can do what Barry suggests, however there might be intervening work, > so you can be more precise using: > > PetscStageLog stageLog; > PetscEventPerfLog eventLog; > PetscInt stage = 0; > PetscLogEvent event; > PetscEventPerfInfo eventInfo; > > PetscLogGetStageLog(&stageLog); > PetscStageLogGetEventPerfLog(stageLog, stage, &eventLog); > eventInfo = eventLog->eventInfo[MAT_Mult]; > > > Thanks, > > Matt > > I had a look at pdf of Loic Gouarin (introduction to pertsc, performance >> May 2013) but it does not correspond to what i want, or i missunderstood? >> >> My other solution is using the matmpiaijsetpreallocation and therefore >> know the complete parallel distribution of A, but this implies a >> pre-treatment on my matrix A. >> >> I was wondering if i can create my matrix A without knowing in advance >> the parallel distribution and then collect the information about parallel >> distribution? >> >> Or more easier, if a function like PetscGetFlops could solve my problem? >> so far i understood that this function measure the flop since the >> begining of program, but this is not what i want to use, i want to have the >> flop ratio right after ending the inner loop, and this, for each global >> iteration >> >> Any ideas/suggestions would help, >> >> Thank you very much, >> >> France >> >> >> -- >> Bien cordialement - Best regards - Mit freundlichen Gr??en, >> >> *France BOILLOD-CERNEUX * >> *PhD Student, Laboratoire d'Informatique Fondamentale de Lille * >> *(LIFL), CNRS* >> >> France.Boillod-Cerneux at cea.fr >> Tel. : +33 (0) 1 6908 - 9527 >> Tel. : +33 (0) 6 4781 - 3059 >> >> DEN/DANS/DM2S >> CEA Saclay >> 91191 Gif-sur-Yvette >> FRANCE >> >> http://www.lifl.fr/ >> www-centre-saclay.cea.fr >> >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Bien cordialement - Best regards - Mit freundlichen Gr??en, *France BOILLOD-CERNEUX * *PhD Student, Laboratoire d'Informatique Fondamentale de Lille * *(LIFL), CNRS* France.Boillod-Cerneux at cea.fr Tel. : +33 (0) 1 6908 - 9527 Tel. : +33 (0) 6 4781 - 3059 DEN/DANS/DM2S CEA Saclay 91191 Gif-sur-Yvette FRANCE http://www.lifl.fr/ www-centre-saclay.cea.fr -------------- next part -------------- An HTML attachment was scrubbed... 
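Barry's inline suggestion, assembled into one piece: a sketch in which A, x, y are the caller's existing matrix and vectors and ninner is an assumed loop count. PetscGetFlops() reports only the calling process's flops, and counts them only while logging is active, so every rank prints its own rate; the one-argument PetscSynchronizedFlush() matches the 3.4 series in use here, and PetscTime() needs petsctime.h.

PetscLogDouble flops0,flops1,t0,t1;
PetscMPIInt    rank;
PetscInt       it,ninner = 100; /* assumed inner-loop length */

ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
ierr = PetscGetFlops(&flops0);CHKERRQ(ierr);
ierr = PetscTime(&t0);CHKERRQ(ierr);
for (it = 0; it < ninner; ++it) {
  ierr = MatMult(A,x,y);CHKERRQ(ierr);
}
ierr = PetscTime(&t1);CHKERRQ(ierr);
ierr = PetscGetFlops(&flops1);CHKERRQ(ierr);
ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,"[%d] MatMult: %g flops/s\n",rank,(double)((flops1-flops0)/(t1-t0)));CHKERRQ(ierr);
ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD);CHKERRQ(ierr);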
URL: From Shuangshuang.Jin at pnnl.gov Fri Aug 30 17:28:46 2013 From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang) Date: Fri, 30 Aug 2013 15:28:46 -0700 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <87d2pd89uw.fsf@mcs.anl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> Hello, I'm trying to update some of my status here. I just managed to" _distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of elements for the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288). I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 Warning -- total time of even greater than time of entire stage -- something is wrong with the timer SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestion on how to reduce the FunctionEval and JacobianEval time? (Currently in IFunction, my f function is sequentially formulated; in IJacobian, the Jacobian matrix is distributed formulated). 
Thanks, Shuangshuang -----Original Message----- From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown Sent: Friday, August 16, 2013 5:00 PM To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov) Cc: petsc-users at mcs.anl.gov Subject: RE: [petsc-users] Performance of PETSc TS solver "Jin, Shuangshuang" writes: > //////////////////////////////////////////////////////////////////////////////////////// > // This proves to be the most time-consuming block in the computation: > // Assign values to J matrix for the first 2*n rows (constant values) > ... (skipped) > > // Assign values to J matrix for the following 2*n rows (depends on X values) > for (i = 0; i < n; i++) { > for (j = 0; j < n; j++) { > ...(skipped) This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen? > } > > ////////////////////////////////////////////////////////////////////// > ////////////////// > > for (i = 0; i < 4*n; i++) { > rowcol[i] = i; > } > > // Compute function over the locally owned part of the grid > for (i = xstart; i < xstart+xlen; i++) { > ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], > INSERT_VALUES); CHKERRQ(ierr); This is seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? You need to _distribute_ the work of computing the matrix entries if you want to see a speedup. From bsmith at mcs.anl.gov Fri Aug 30 17:48:06 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 30 Aug 2013 17:48:06 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> Message-ID: <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> I would next parallelize the function evaluation since it is the single largest consumer of time and should presumably be faster in parallel. After that revisit the -log_summary again to decide if the Jacobian evaluation can be improved. Barry On Aug 30, 2013, at 5:28 PM, "Jin, Shuangshuang" wrote: > Hello, I'm trying to update some of my status here. I just managed to" _distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of elements for the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288). 
> > I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: > > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > Warning -- total time of even greater than time of entire stage -- something is wrong with the timer > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 > > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestion on how to reduce the FunctionEval and JacobianEval time? > (Currently in IFunction, my f function is sequentially formulated; in IJacobian, the Jacobian matrix is distributed formulated). > > Thanks, > Shuangshuang > > > > > > -----Original Message----- > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > Sent: Friday, August 16, 2013 5:00 PM > To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov) > Cc: petsc-users at mcs.anl.gov > Subject: RE: [petsc-users] Performance of PETSc TS solver > > "Jin, Shuangshuang" writes: > >> //////////////////////////////////////////////////////////////////////////////////////// >> // This proves to be the most time-consuming block in the computation: >> // Assign values to J matrix for the first 2*n rows (constant values) >> ... (skipped) >> >> // Assign values to J matrix for the following 2*n rows (depends on X values) >> for (i = 0; i < n; i++) { >> for (j = 0; j < n; j++) { >> ...(skipped) > > This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen? > >> } >> >> ////////////////////////////////////////////////////////////////////// >> ////////////////// >> >> for (i = 0; i < 4*n; i++) { >> rowcol[i] = i; >> } >> >> // Compute function over the locally owned part of the grid >> for (i = xstart; i < xstart+xlen; i++) { >> ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], >> INSERT_VALUES); CHKERRQ(ierr); > > This is seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? You need to _distribute_ the work of computing the matrix entries if you want to see a speedup. 
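A sketch of the owned-rows-only assembly Jed is describing. The row computation below is a placeholder; the point is that each rank computes and inserts nothing outside its ownership range, so no entries have to be stashed and communicated during MatAssemblyBegin/End.

PetscErrorCode AssembleOwnedRows(Mat B)
{
  PetscErrorCode ierr;
  PetscInt       rstart,rend,i,ncols;
  PetscInt       cols[64];  /* column indices of this row's nonzeros */
  PetscScalar    vals[64];  /* the corresponding entries */

  PetscFunctionBeginUser;
  ierr = MatGetOwnershipRange(B,&rstart,&rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; ++i) {
    ncols = 0;
    /* problem-specific: compute the nonzero entries of row i here; */
    /* a placeholder diagonal stands in for the real row            */
    cols[ncols] = i; vals[ncols] = 1.0; ncols++;
    ierr = MatSetValues(B,1,&i,ncols,cols,vals,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}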
From Shuangshuang.Jin at pnnl.gov  Fri Aug 30 17:50:17 2013
From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang)
Date: Fri, 30 Aug 2013 15:50:17 -0700
Subject: [petsc-users] Performance of PETSc TS solver
In-Reply-To: <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov>
Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4E3@EMAIL04.pnl.gov>

I'm sorry, I made a wrong statement in the last email. My f functions in IFunction are already formulated in a distributed way as well, and the 24 seconds each for Function and Jacobian Eval are already based on this implementation. What else can I do?

Thanks,
Shuangshuang

-----Original Message-----
From: Barry Smith [mailto:bsmith at mcs.anl.gov]
Sent: Friday, August 30, 2013 3:48 PM
To: Jin, Shuangshuang
Cc: Jed Brown; Shri (abhyshr at mcs.anl.gov); petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Performance of PETSc TS solver

I would next parallelize the function evaluation since it is the single largest consumer of time and should presumably be faster in parallel. After that revisit the -log_summary again to decide if the Jacobian evaluation can be improved.

Barry

On Aug 30, 2013, at 5:28 PM, "Jin, Shuangshuang" wrote:

> Hello, I'm trying to update some of my status here. I just managed to "_distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of elements for the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288).
> > I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: > > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > Warning -- total time of even greater than time of entire stage -- something is wrong with the timer > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 > > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestion on how to reduce the FunctionEval and JacobianEval time? > (Currently in IFunction, my f function is sequentially formulated; in IJacobian, the Jacobian matrix is distributed formulated). > > Thanks, > Shuangshuang > > > > > > -----Original Message----- > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > Sent: Friday, August 16, 2013 5:00 PM > To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov) > Cc: petsc-users at mcs.anl.gov > Subject: RE: [petsc-users] Performance of PETSc TS solver > > "Jin, Shuangshuang" writes: > >> >> ///////////////////////////////////////////////////////////////////// >> /////////////////// // This proves to be the most time-consuming >> block in the computation: >> // Assign values to J matrix for the first 2*n rows (constant >> values) ... (skipped) >> >> // Assign values to J matrix for the following 2*n rows (depends on >> X values) for (i = 0; i < n; i++) { >> for (j = 0; j < n; j++) { >> ...(skipped) > > This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen? > >> } >> >> ///////////////////////////////////////////////////////////////////// >> / >> ////////////////// >> >> for (i = 0; i < 4*n; i++) { >> rowcol[i] = i; >> } >> >> // Compute function over the locally owned part of the grid for (i >> = xstart; i < xstart+xlen; i++) { >> ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], >> INSERT_VALUES); CHKERRQ(ierr); > > This is seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? You need to _distribute_ the work of computing the matrix entries if you want to see a speedup. 
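Before reaching for an external profiler, the suspect section of the function evaluation can be given its own line in -log_summary with a user-defined event. A sketch; the event and class names below are made up, and the snippet follows the registration pattern from the PETSc users manual.

static PetscLogEvent USER_ResidualWork;

/* once, during setup */
PetscClassId classid;
ierr = PetscClassIdRegister("Application",&classid);CHKERRQ(ierr);
ierr = PetscLogEventRegister("ResidualWork",classid,&USER_ResidualWork);CHKERRQ(ierr);

/* inside the IFunction, around the suspect section */
ierr = PetscLogEventBegin(USER_ResidualWork,0,0,0,0);CHKERRQ(ierr);
/* ... the expensive part of the residual computation ... */
ierr = PetscLogEventEnd(USER_ResidualWork,0,0,0,0);CHKERRQ(ierr);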
From jedbrown at mcs.anl.gov Fri Aug 30 17:51:51 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 30 Aug 2013 17:51:51 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> Message-ID: Also, which TS method are you using? Rosenbrock methods will amortize a lot of assembly cost by reusing the matrix for several stages. On Aug 30, 2013 3:48 PM, "Barry Smith" wrote: > > I would next parallelize the function evaluation since it is the single > largest consumer of time and should presumably be faster in parallel. After > that revisit the -log_summary again to decide if the Jacobian evaluation > can be improved. > > Barry > > On Aug 30, 2013, at 5:28 PM, "Jin, Shuangshuang" < > Shuangshuang.Jin at pnnl.gov> wrote: > > > Hello, I'm trying to update some of my status here. I just managed to" > _distribute_ the work of computing the Jacobian matrix" as you suggested, > so each processor only computes a part of elements for the Jacobian matrix > instead of a global Jacobian matrix. I observed a reduction of the > computation time from 351 seconds to 55 seconds, which is much better but > still slower than I expected given the problem size is small. (4n functions > in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288). 
> > > > I looked at the log profile again, and saw that most of the computation > time are still for Functioan Eval and Jacobian Eval: > > > > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 > 7.0e+04 93100 99 99 92 152100 99 99110 279 > > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 > 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 > > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 > 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > > Warning -- total time of even greater than time of entire stage -- > something is wrong with the timer > > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 > 6.4e+04 88100 99 99 84 144100 99 99101 281 > > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 > 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 > > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 > 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 > 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 > > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 > 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 > > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 > 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 > > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 > 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 > > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 > 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 > > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 > 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 > > > > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 > seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 > times faster. Is this normal? Do you have any more suggestion on how to > reduce the FunctionEval and JacobianEval time? > > (Currently in IFunction, my f function is sequentially formulated; in > IJacobian, the Jacobian matrix is distributed formulated). > > > > Thanks, > > Shuangshuang > > > > > > > > > > > > -----Original Message----- > > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > > Sent: Friday, August 16, 2013 5:00 PM > > To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov) > > Cc: petsc-users at mcs.anl.gov > > Subject: RE: [petsc-users] Performance of PETSc TS solver > > > > "Jin, Shuangshuang" writes: > > > >> > //////////////////////////////////////////////////////////////////////////////////////// > >> // This proves to be the most time-consuming block in the computation: > >> // Assign values to J matrix for the first 2*n rows (constant values) > >> ... (skipped) > >> > >> // Assign values to J matrix for the following 2*n rows (depends on X > values) > >> for (i = 0; i < n; i++) { > >> for (j = 0; j < n; j++) { > >> ...(skipped) > > > > This is a dense iteration. Are the entries really mostly nonzero? Why > is your i loop over all rows instead of only over xstart to xstart+xlen? > > > >> } > >> > >> ////////////////////////////////////////////////////////////////////// > >> ////////////////// > >> > >> for (i = 0; i < 4*n; i++) { > >> rowcol[i] = i; > >> } > >> > >> // Compute function over the locally owned part of the grid > >> for (i = xstart; i < xstart+xlen; i++) { > >> ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], > >> INSERT_VALUES); CHKERRQ(ierr); > > > > This is seems to be creating a distributed dense matrix from a dense > matrix J of the global dimension. 
Is that correct? You need to > _distribute_ the work of computing the matrix entries if you want to see a > speedup. > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Aug 30 17:57:41 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 30 Aug 2013 17:57:41 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4E3@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4E3@EMAIL04.pnl.gov> Message-ID: Are you calling VecSetValues() for the vector entries? You can run with -info save the results in a file and search for the word stash in the file to see how much of the vector and matrix entries are being communicated between processes. If this number is very high then that is a problem. You can send the output to us if you like also. Barry On Aug 30, 2013, at 5:50 PM, "Jin, Shuangshuang" wrote: > I'm sorry I made a wrong statement in the last email. My f functions in IFunction are also distributed formulated already. And the 24 seconds each for Fucntion and Jacobian EVAL are already based on this implementation. What else I can do? > > Thanks, > Shuangshuang > > -----Original Message----- > From: Barry Smith [mailto:bsmith at mcs.anl.gov] > Sent: Friday, August 30, 2013 3:48 PM > To: Jin, Shuangshuang > Cc: Jed Brown; Shri (abhyshr at mcs.anl.gov); petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Performance of PETSc TS solver > > > I would next parallelize the function evaluation since it is the single largest consumer of time and should presumably be faster in parallel. After that revisit the -log_summary again to decide if the Jacobian evaluation can be improved. > > Barry > > On Aug 30, 2013, at 5:28 PM, "Jin, Shuangshuang" wrote: > >> Hello, I'm trying to update some of my status here. I just managed to" _distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of elements for the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288). 
>> >> I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: >> >> TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 >> TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 >> TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 >> Warning -- total time of even greater than time of entire stage -- something is wrong with the timer >> SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 >> SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 >> SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 >> SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 >> KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 >> KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 >> KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 >> PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 >> PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 >> PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 >> >> I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestion on how to reduce the FunctionEval and JacobianEval time? >> (Currently in IFunction, my f function is sequentially formulated; in IJacobian, the Jacobian matrix is distributed formulated). >> >> Thanks, >> Shuangshuang >> >> >> >> >> >> -----Original Message----- >> From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown >> Sent: Friday, August 16, 2013 5:00 PM >> To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov) >> Cc: petsc-users at mcs.anl.gov >> Subject: RE: [petsc-users] Performance of PETSc TS solver >> >> "Jin, Shuangshuang" writes: >> >>> >>> ///////////////////////////////////////////////////////////////////// >>> /////////////////// // This proves to be the most time-consuming >>> block in the computation: >>> // Assign values to J matrix for the first 2*n rows (constant >>> values) ... (skipped) >>> >>> // Assign values to J matrix for the following 2*n rows (depends on >>> X values) for (i = 0; i < n; i++) { >>> for (j = 0; j < n; j++) { >>> ...(skipped) >> >> This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen? >> >>> } >>> >>> ///////////////////////////////////////////////////////////////////// >>> / >>> ////////////////// >>> >>> for (i = 0; i < 4*n; i++) { >>> rowcol[i] = i; >>> } >>> >>> // Compute function over the locally owned part of the grid for (i >>> = xstart; i < xstart+xlen; i++) { >>> ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], >>> INSERT_VALUES); CHKERRQ(ierr); >> >> This is seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? 
You need to _distribute_ the work of computing the matrix entries if you want to see a speedup. > From bsmith at mcs.anl.gov Fri Aug 30 18:31:41 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 30 Aug 2013 18:31:41 -0500 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4EF@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4EF@EMAIL04.pnl.gov> Message-ID: <169B18ED-3805-4592-930D-DF3FB427D909@mcs.anl.gov> This is good. The time is not being spent in communication. So the time must be spent specifically in the actual computations. Perhaps compile with gprof (do man gprof) and run with that? You can do this on just one process. Barry On Aug 30, 2013, at 6:23 PM, "Jin, Shuangshuang" wrote: > I?m using the Trapezoidal method with the command ?-ts_theta_endpoint? > > ierr = TSCreate(PETSC_COMM_WORLD, &ts); CHKERRQ(ierr); > ierr = TSSetType(ts, TSTHETA); CHKERRQ(ierr); > ierr = TSThetaSetTheta(ts, 0.5); CHKERRQ(ierr); > > Just did a quick try on Rosenbrock methods, and it?s diverged. > > I didn?t use VecSetValues. I only used MatSetValues multiple times inside IJacobian. > > I tried the ?info option. The output file is too large to be sent out. I search the ?Stash? and found 118678 hits in the file. All of them are like: > Line 1668: [16] MatStashScatterBegin_Private(): No of messages: 0 > Line 1669: [16] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. > Line 1670: [27] MatStashScatterBegin_Private(): No of messages: 0 > Line 1671: [27] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. > Line 1672: [28] MatStashScatterBegin_Private(): No of messages: 0 > Line 1673: [28] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. > Line 1674: [11] MatStashScatterBegin_Private(): No of messages: 0 > > Thanks, > Shuangshuang > > > > From: five9a2 at gmail.com [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > Sent: Friday, August 30, 2013 3:52 PM > To: Barry Smith > Cc: PETSc users list; Shrirang Abhyankar; Jin, Shuangshuang > Subject: Re: [petsc-users] Performance of PETSc TS solver > > Also, which TS method are you using? Rosenbrock methods will amortize a lot of assembly cost by reusing the matrix for several stages. > > On Aug 30, 2013 3:48 PM, "Barry Smith" wrote: > > I would next parallelize the function evaluation since it is the single largest consumer of time and should presumably be faster in parallel. After that revisit the -log_summary again to decide if the Jacobian evaluation can be improved. > > Barry > > On Aug 30, 2013, at 5:28 PM, "Jin, Shuangshuang" wrote: > > > Hello, I'm trying to update some of my status here. I just managed to" _distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of elements for the Jacobian matrix instead of a global Jacobian matrix. 
I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288). > > > > I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: > > > > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 > > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 > > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > > Warning -- total time of even greater than time of entire stage -- something is wrong with the timer > > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 > > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 > > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 > > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 > > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 > > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 > > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 > > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 > > > > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestion on how to reduce the FunctionEval and JacobianEval time? > > (Currently in IFunction, my f function is sequentially formulated; in IJacobian, the Jacobian matrix is distributed formulated). > > > > Thanks, > > Shuangshuang > > > > > > > > > > > > -----Original Message----- > > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > > Sent: Friday, August 16, 2013 5:00 PM > > To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov) > > Cc: petsc-users at mcs.anl.gov > > Subject: RE: [petsc-users] Performance of PETSc TS solver > > > > "Jin, Shuangshuang" writes: > > > >> //////////////////////////////////////////////////////////////////////////////////////// > >> // This proves to be the most time-consuming block in the computation: > >> // Assign values to J matrix for the first 2*n rows (constant values) > >> ... (skipped) > >> > >> // Assign values to J matrix for the following 2*n rows (depends on X values) > >> for (i = 0; i < n; i++) { > >> for (j = 0; j < n; j++) { > >> ...(skipped) > > > > This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen? 
> > >> }
> > >>
> > >> //////////////////////////////////////////////////////////////////////
> > >> //////////////////
> > >>
> > >> for (i = 0; i < 4*n; i++) {
> > >> rowcol[i] = i;
> > >> }
> > >>
> > >> // Compute function over the locally owned part of the grid
> > >> for (i = xstart; i < xstart+xlen; i++) {
> > >> ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0],
> > >> INSERT_VALUES); CHKERRQ(ierr);
> >
> > This is seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? You need to _distribute_ the work of computing the matrix entries if you want to see a speedup.

From Shuangshuang.Jin at pnnl.gov  Fri Aug 30 18:23:23 2013
From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang)
Date: Fri, 30 Aug 2013 16:23:23 -0700
Subject: [petsc-users] Performance of PETSc TS solver
In-Reply-To:
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4E3@EMAIL04.pnl.gov>
Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4EF@EMAIL04.pnl.gov>

I'm using the Trapezoidal method with the command '-ts_theta_endpoint'

ierr = TSCreate(PETSC_COMM_WORLD, &ts); CHKERRQ(ierr);
ierr = TSSetType(ts, TSTHETA); CHKERRQ(ierr);
ierr = TSThetaSetTheta(ts, 0.5); CHKERRQ(ierr);

Just did a quick try on Rosenbrock methods, and it's diverged.

I didn't use VecSetValues. I only used MatSetValues multiple times inside IJacobian.

I tried the -info option. The output file is too large to be sent out. I searched for "Stash" and found 118678 hits in the file. All of them are like:
Line 1668: [16] MatStashScatterBegin_Private(): No of messages: 0
Line 1669: [16] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
Line 1670: [27] MatStashScatterBegin_Private(): No of messages: 0
Line 1671: [27] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
Line 1672: [28] MatStashScatterBegin_Private(): No of messages: 0
Line 1673: [28] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
Line 1674: [11] MatStashScatterBegin_Private(): No of messages: 0

Thanks,
Shuangshuang

From: five9a2 at gmail.com [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown
Sent: Friday, August 30, 2013 3:52 PM
To: Barry Smith
Cc: PETSc users list; Shrirang Abhyankar; Jin, Shuangshuang
Subject: Re: [petsc-users] Performance of PETSc TS solver

Also, which TS method are you using? Rosenbrock methods will amortize a lot of assembly cost by reusing the matrix for several stages.

On Aug 30, 2013 3:48 PM, "Barry Smith" wrote:

I would next parallelize the function evaluation since it is the single largest consumer of time and should presumably be faster in parallel. After that revisit the -log_summary again to decide if the Jacobian evaluation can be improved.

Barry

On Aug 30, 2013, at 5:28 PM, "Jin, Shuangshuang" wrote:

> Hello, I'm trying to update some of my status here. I just managed to "_distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of elements for the Jacobian matrix instead of a global Jacobian matrix.
I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288). > > I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: > > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > Warning -- total time of even greater than time of entire stage -- something is wrong with the timer > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 > > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestion on how to reduce the FunctionEval and JacobianEval time? > (Currently in IFunction, my f function is sequentially formulated; in IJacobian, the Jacobian matrix is distributed formulated). > > Thanks, > Shuangshuang > > > > > > -----Original Message----- > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > Sent: Friday, August 16, 2013 5:00 PM > To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov) > Cc: petsc-users at mcs.anl.gov > Subject: RE: [petsc-users] Performance of PETSc TS solver > > "Jin, Shuangshuang" > writes: > >> //////////////////////////////////////////////////////////////////////////////////////// >> // This proves to be the most time-consuming block in the computation: >> // Assign values to J matrix for the first 2*n rows (constant values) >> ... (skipped) >> >> // Assign values to J matrix for the following 2*n rows (depends on X values) >> for (i = 0; i < n; i++) { >> for (j = 0; j < n; j++) { >> ...(skipped) > > This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen? 
> >> }
> >>
> >> //////////////////////////////////////////////////////////////////////
> >> //////////////////
> >>
> >> for (i = 0; i < 4*n; i++) {
> >> rowcol[i] = i;
> >> }
> >>
> >> // Compute function over the locally owned part of the grid
> >> for (i = xstart; i < xstart+xlen; i++) {
> >> ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], INSERT_VALUES); CHKERRQ(ierr);
> >
> > This seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? You need to _distribute_ the work of computing the matrix entries if you want to see a speedup.

From jedbrown at mcs.anl.gov Fri Aug 30 18:39:19 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Fri, 30 Aug 2013 18:39:19 -0500
Subject: [petsc-users] Performance of PETSc TS solver
In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4EF@EMAIL04.pnl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4EF@EMAIL04.pnl.gov>
Message-ID: 

Do you have a time-dependent source term (non-autonomous)? I'm trying to determine why Rosenbrock did not converge for you. But since the residual and Jacobian are similar in cost, it may not be faster. How does TSARKIMEX work for you? It may be able to take larger time steps than THETA.

On Aug 30, 2013 4:23 PM, "Jin, Shuangshuang" <Shuangshuang.Jin at pnnl.gov> wrote:

> I'm using the Trapezoidal method with the option "-ts_theta_endpoint":
>
> ierr = TSCreate(PETSC_COMM_WORLD, &ts); CHKERRQ(ierr);
> ierr = TSSetType(ts, TSTHETA); CHKERRQ(ierr);
> ierr = TSThetaSetTheta(ts, 0.5); CHKERRQ(ierr);
>
> Just did a quick try on Rosenbrock methods, and it diverged.
>
> I didn't use VecSetValues. I only used MatSetValues multiple times inside IJacobian.
>
> I tried the -info option. The output file is too large to be sent out. I searched for "Stash" and found 118678 hits in the file. All of them are like:
> Line 1668: [16] MatStashScatterBegin_Private(): No of messages: 0
> Line 1669: [16] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
> Line 1670: [27] MatStashScatterBegin_Private(): No of messages: 0
> Line 1671: [27] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
> Line 1672: [28] MatStashScatterBegin_Private(): No of messages: 0
> Line 1673: [28] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs.
> Line 1674: [11] MatStashScatterBegin_Private(): No of messages: 0
>
> Thanks,
> Shuangshuang
>
> From: five9a2 at gmail.com [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown
> Sent: Friday, August 30, 2013 3:52 PM
> To: Barry Smith
> Cc: PETSc users list; Shrirang Abhyankar; Jin, Shuangshuang
> Subject: Re: [petsc-users] Performance of PETSc TS solver
>
> Also, which TS method are you using?
> Rosenbrock methods will amortize a lot of assembly cost by reusing the matrix for several stages.
>
> On Aug 30, 2013 3:48 PM, "Barry Smith" <bsmith at mcs.anl.gov> wrote:
>
> I would next parallelize the function evaluation since it is the single largest consumer of time and should presumably be faster in parallel. After that revisit the -log_summary again to decide if the Jacobian evaluation can be improved.
>
> Barry
>
> On Aug 30, 2013, at 5:28 PM, "Jin, Shuangshuang" <Shuangshuang.Jin at pnnl.gov> wrote:
>
> > Hello, I'm trying to update some of my status here. I just managed to "_distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of the elements of the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given that the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288).
> >
> > I looked at the log profile again, and saw that most of the computation time is still spent in Function Eval and Jacobian Eval:
> >
> > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279
> > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0
> > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0
> > Warning -- total time of event greater than time of entire stage -- something is wrong with the timer
> > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281
> > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1
> > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0
> > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139
> > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450
> > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0
> > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680
> > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159
> > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217
> > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481
> >
> > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, while KSPSolve took only 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestions on how to reduce the FunctionEval and JacobianEval time?
> > (Currently in IFunction, my f function is formulated sequentially; in IJacobian, the Jacobian matrix is formulated in a distributed way.)
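One plausible way to act on Barry's suggestion quoted above, assuming each residual entry may depend on the whole state vector: broadcast X to every rank once per evaluation and fill only the locally owned entries of F. A sketch with illustrative names (in practice the scatter would be created once and cached in the user context rather than rebuilt every call):

  /* inside IFunction(ts, t, X, Xdot, F, ctx) */
  Vec               Xall;
  VecScatter        scat;
  const PetscScalar *xa;
  PetscScalar       *fa;
  PetscInt          xstart, xend, i;
  ierr = VecScatterCreateToAll(X, &scat, &Xall);CHKERRQ(ierr);
  ierr = VecScatterBegin(scat, X, Xall, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scat, X, Xall, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecGetOwnershipRange(F, &xstart, &xend);CHKERRQ(ierr);
  ierr = VecGetArrayRead(Xall, &xa);CHKERRQ(ierr);
  ierr = VecGetArray(F, &fa);CHKERRQ(ierr);
  for (i = xstart; i < xend; i++) {
    fa[i-xstart] = 0.0; /* evaluate f_i(t, xa[0..4*n-1], Xdot) here */
  }
  ierr = VecRestoreArray(F, &fa);CHKERRQ(ierr);
  ierr = VecRestoreArrayRead(Xall, &xa);CHKERRQ(ierr);
  ierr = VecScatterDestroy(&scat);CHKERRQ(ierr);
  ierr = VecDestroy(&Xall);CHKERRQ(ierr);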
> > Thanks,
> > Shuangshuang
> >
> >
> > -----Original Message-----
> > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown
> > Sent: Friday, August 16, 2013 5:00 PM
> > To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov)
> > Cc: petsc-users at mcs.anl.gov
> > Subject: RE: [petsc-users] Performance of PETSc TS solver
> >
> > "Jin, Shuangshuang" writes:
> >
> >> ////////////////////////////////////////////////////////////////////////////////////////
> >> // This proves to be the most time-consuming block in the computation:
> >> // Assign values to J matrix for the first 2*n rows (constant values)
> >> ... (skipped)
> >>
> >> // Assign values to J matrix for the following 2*n rows (depends on X values)
> >> for (i = 0; i < n; i++) {
> >> for (j = 0; j < n; j++) {
> >> ...(skipped)
> >
> > This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen?
> >
> >> }
> >>
> >> //////////////////////////////////////////////////////////////////////
> >> //////////////////
> >>
> >> for (i = 0; i < 4*n; i++) {
> >> rowcol[i] = i;
> >> }
> >>
> >> // Compute function over the locally owned part of the grid
> >> for (i = xstart; i < xstart+xlen; i++) {
> >> ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], INSERT_VALUES); CHKERRQ(ierr);
> >
> > This seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? You need to _distribute_ the work of computing the matrix entries if you want to see a speedup.

From Shuangshuang.Jin at pnnl.gov Fri Aug 30 18:47:43 2013
From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang)
Date: Fri, 30 Aug 2013 16:47:43 -0700
Subject: [petsc-users] Performance of PETSc TS solver
In-Reply-To: 
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4EF@EMAIL04.pnl.gov>
Message-ID: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4F5@EMAIL04.pnl.gov>

I don't know. TSARKIMEX doesn't work for me either.

"TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery!"

Am I using it wrong?

I simply replaced:

ierr = TSCreate(PETSC_COMM_WORLD, &ts); CHKERRQ(ierr);
ierr = TSSetType(ts, TSTHETA); CHKERRQ(ierr);
ierr = TSThetaSetTheta(ts, 0.5); CHKERRQ(ierr);

by:

TSCreate(PETSC_COMM_WORLD,&ts);
//TSSetType(ts,TSROSW);
TSSetType(ts,TSARKIMEX);

Thanks,
Shuangshuang

From: five9a2 at gmail.com [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown
Sent: Friday, August 30, 2013 4:39 PM
To: Jin, Shuangshuang
Cc: PETSc users list; Barry Smith; Shrirang Abhyankar
Subject: RE: [petsc-users] Performance of PETSc TS solver

Do you have a time-dependent source term (non-autonomous)? I'm trying to determine why Rosenbrock did not converge for you. But since the residual and Jacobian are similar in cost, it may not be faster. How does TSARKIMEX work for you?
It may be able to take larger time steps than THETA. On Aug 30, 2013 4:23 PM, "Jin, Shuangshuang" > wrote: I?m using the Trapezoidal method with the command ?-ts_theta_endpoint? ierr = TSCreate(PETSC_COMM_WORLD, &ts); CHKERRQ(ierr); ierr = TSSetType(ts, TSTHETA); CHKERRQ(ierr); ierr = TSThetaSetTheta(ts, 0.5); CHKERRQ(ierr); Just did a quick try on Rosenbrock methods, and it?s diverged. I didn?t use VecSetValues. I only used MatSetValues multiple times inside IJacobian. I tried the ?info option. The output file is too large to be sent out. I search the ?Stash? and found 118678 hits in the file. All of them are like: Line 1668: [16] MatStashScatterBegin_Private(): No of messages: 0 Line 1669: [16] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. Line 1670: [27] MatStashScatterBegin_Private(): No of messages: 0 Line 1671: [27] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. Line 1672: [28] MatStashScatterBegin_Private(): No of messages: 0 Line 1673: [28] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. Line 1674: [11] MatStashScatterBegin_Private(): No of messages: 0 Thanks, Shuangshuang From: five9a2 at gmail.com [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown Sent: Friday, August 30, 2013 3:52 PM To: Barry Smith Cc: PETSc users list; Shrirang Abhyankar; Jin, Shuangshuang Subject: Re: [petsc-users] Performance of PETSc TS solver Also, which TS method are you using? Rosenbrock methods will amortize a lot of assembly cost by reusing the matrix for several stages. On Aug 30, 2013 3:48 PM, "Barry Smith" > wrote: I would next parallelize the function evaluation since it is the single largest consumer of time and should presumably be faster in parallel. After that revisit the -log_summary again to decide if the Jacobian evaluation can be improved. Barry On Aug 30, 2013, at 5:28 PM, "Jin, Shuangshuang" > wrote: > Hello, I'm trying to update some of my status here. I just managed to" _distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of elements for the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288). 
> > I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: > > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > Warning -- total time of even greater than time of entire stage -- something is wrong with the timer > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 > > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestion on how to reduce the FunctionEval and JacobianEval time? > (Currently in IFunction, my f function is sequentially formulated; in IJacobian, the Jacobian matrix is distributed formulated). > > Thanks, > Shuangshuang > > > > > > -----Original Message----- > From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown > Sent: Friday, August 16, 2013 5:00 PM > To: Jin, Shuangshuang; Barry Smith; Shri (abhyshr at mcs.anl.gov) > Cc: petsc-users at mcs.anl.gov > Subject: RE: [petsc-users] Performance of PETSc TS solver > > "Jin, Shuangshuang" > writes: > >> //////////////////////////////////////////////////////////////////////////////////////// >> // This proves to be the most time-consuming block in the computation: >> // Assign values to J matrix for the first 2*n rows (constant values) >> ... (skipped) >> >> // Assign values to J matrix for the following 2*n rows (depends on X values) >> for (i = 0; i < n; i++) { >> for (j = 0; j < n; j++) { >> ...(skipped) > > This is a dense iteration. Are the entries really mostly nonzero? Why is your i loop over all rows instead of only over xstart to xstart+xlen? > >> } >> >> ////////////////////////////////////////////////////////////////////// >> ////////////////// >> >> for (i = 0; i < 4*n; i++) { >> rowcol[i] = i; >> } >> >> // Compute function over the locally owned part of the grid >> for (i = xstart; i < xstart+xlen; i++) { >> ierr = MatSetValues(*B, 1, &i, 4*n, rowcol, &J[i][0], >> INSERT_VALUES); CHKERRQ(ierr); > > This is seems to be creating a distributed dense matrix from a dense matrix J of the global dimension. Is that correct? You need to _distribute_ the work of computing the matrix entries if you want to see a speedup. 
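A sketch of the run-time-selectable setup that comes up in Jed's reply below: rather than hard-coding TSSetType, let the options database pick the integrator. The names ts and app are placeholders:

  ierr = TSCreate(PETSC_COMM_WORLD, &ts);CHKERRQ(ierr);
  /* ... TSSetIFunction(), TSSetIJacobian(), initial conditions ... */
  ierr = TSSetFromOptions(ts);CHKERRQ(ierr); /* reads -ts_type, -ts_adapt_*, ... */

and then at run time, for example:

  ./app -ts_type arkimex -ts_arkimex_type 1bee -ts_max_snes_failures 20 -ts_adapt_monitor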
-------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri Aug 30 18:57:18 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 30 Aug 2013 16:57:18 -0700 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> Message-ID: <87r4dayboh.fsf@mcs.anl.gov> "Jin, Shuangshuang" writes: > Hello, I'm trying to update some of my status here. I just managed to" _distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of elements for the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288). > > I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: > > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 The load imbalance is pretty significant here, so maybe you can distribute the work for residual evaluation better? > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > Warning -- total time of even greater than time of entire stage -- something is wrong with the timer SNESSolve contains the Jacobian and residual evaluations, as well as KSPSolve. Pretty much all the cost is in those three things. > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 > > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 > seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 > times faster. Is this normal? Do you have any more suggestion on how > to reduce the FunctionEval and JacobianEval time? 
It means that the linear systems are easy to solve (probably because they are small), but the IFunction and IJacobian are expensive. As Barry says, you might be able to speed it up by sequential optimization.

From jedbrown at mcs.anl.gov Fri Aug 30 19:19:16 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Fri, 30 Aug 2013 17:19:16 -0700
Subject: [petsc-users] Performance of PETSc TS solver
In-Reply-To: <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4F5@EMAIL04.pnl.gov>
References: <6778DE83AB681D49BFC2CD850610FEB1018FDB654F83@EMAIL04.pnl.gov> <877gfpm47j.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6552BE@EMAIL04.pnl.gov> <87fvuabcay.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB6554EF@EMAIL04.pnl.gov> <8BD6E3EF-E1AC-4FB2-ADBE-19B11E9536D5@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDB655512@EMAIL04.pnl.gov> <87d2pd89uw.fsf@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4D1@EMAIL04.pnl.gov> <3B15CB78-770E-4D9B-BFFD-A07919814E8E@mcs.anl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4EF@EMAIL04.pnl.gov> <6778DE83AB681D49BFC2CD850610FEB1018FDC8CD4F5@EMAIL04.pnl.gov>
Message-ID: <87li3iyanv.fsf@mcs.anl.gov>

"Jin, Shuangshuang" writes:

> I don't know. TSARKIMEX doesn't work for me either.
>
> "TSStep has failed due to DIVERGED_NONLINEAR_SOLVE, increase -ts_max_snes_failures or make negative to attempt recovery!"

It uses adaptive time stepping and the large time step cases might not converge, and thus have to shorten the step. Try using -ts_max_snes_failures 20 -ts_adapt_monitor to see what's going on. You could start with -ts_arkimex_type 1bee (backward Euler with extrapolation-based error estimator) before trying the higher order schemes.

> Am I using it wrong?
>
> I simply replaced:
>
> ierr = TSCreate(PETSC_COMM_WORLD, &ts); CHKERRQ(ierr);
> ierr = TSSetType(ts, TSTHETA); CHKERRQ(ierr);
> ierr = TSThetaSetTheta(ts, 0.5); CHKERRQ(ierr);
>
> by:
> TSCreate(PETSC_COMM_WORLD,&ts);
> //TSSetType(ts,TSROSW);
> TSSetType(ts,TSARKIMEX);

Suggest just calling TSSetFromOptions and then pass -ts_type arkimex at run-time.

From potaman at outlook.com Fri Aug 30 21:08:08 2013
From: potaman at outlook.com (subramanya sadasiva)
Date: Fri, 30 Aug 2013 22:08:08 -0400
Subject: [petsc-users] Behaviour of Newton Methods with Direct Solvers.
Message-ID: 

Hi,
Is there a reason that the convergence of the Newton methods in SNES is much better with iterative solvers than with direct solvers? This is particularly in reference to the VI solvers. I get nearly quadratic convergence with the iterative solvers, but the code just diverges or converges linearly when I run with -ksp_type preonly.
Thanks,
Subramanya

From bsmith at mcs.anl.gov Fri Aug 30 21:11:36 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 30 Aug 2013 21:11:36 -0500
Subject: [petsc-users] Behaviour of Newton Methods with Direct Solvers.
In-Reply-To: 
References: 
Message-ID: 

I would not expect this. Are you sure that direct solvers are actually working well? You can run -ksp_type gmres -pc_type lu to "accelerate" the direct solver.
Under normal circumstances the direct solver will converge in one iteration but if, for example, the problem is singular or nearly singular several iterations might solve the linear system but one may not. Barry On Aug 30, 2013, at 9:08 PM, subramanya sadasiva wrote: > Hi, > Is there a reason that the convergence of the newton methods in SNES is much better with iterative solvers instead of with direct solvers? This is particularly in reference with the VI solvers. I get nearly quadratic convergence with the iterative solvers, but the code just diverges or converges linearly when i run with -ksp_type preonly . > Thanks, > Subramanya From potaman at outlook.com Fri Aug 30 21:14:19 2013 From: potaman at outlook.com (subramanya sadasiva) Date: Fri, 30 Aug 2013 22:14:19 -0400 Subject: [petsc-users] Behaviour of Newton Methods with Direct Solvers. In-Reply-To: References: , Message-ID: Hi Barry, The linear solvers converge in 1 iterations. But the SNES solver fails when I use the direct solver. Thanks,Subramanya > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers. > From: bsmith at mcs.anl.gov > Date: Fri, 30 Aug 2013 21:11:36 -0500 > CC: petsc-users at mcs.anl.gov > To: potaman at outlook.com > > > I would not expect this. Are you sure that direct solvers are actually working well? You can run -ksp_type gmres -pc_type lu to "accelerate" the direct solver. Under normal circumstances the direct solver will converge in one iteration but if, for example, the problem is singular or nearly singular several iterations might solve the linear system but one may not. > > Barry > > On Aug 30, 2013, at 9:08 PM, subramanya sadasiva wrote: > > > Hi, > > Is there a reason that the convergence of the newton methods in SNES is much better with iterative solvers instead of with direct solvers? This is particularly in reference with the VI solvers. I get nearly quadratic convergence with the iterative solvers, but the code just diverges or converges linearly when i run with -ksp_type preonly . > > Thanks, > > Subramanya > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Aug 30 21:16:33 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 30 Aug 2013 21:16:33 -0500 Subject: [petsc-users] Behaviour of Newton Methods with Direct Solvers. In-Reply-To: References: , Message-ID: <36E850C7-D37D-4B3C-AE12-905636D9438A@mcs.anl.gov> On Aug 30, 2013, at 9:14 PM, subramanya sadasiva wrote: > Hi Barry, > The linear solvers converge in 1 iterations. But the SNES solver fails when I use the direct solver. Hmm. I don't have an explanation for this. Could you send some output when running with -snes_monitor -ksp_monitor_true_residual for both the iterative case and the direct solver case? Barry > Thanks, > Subramanya > > > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers. > > From: bsmith at mcs.anl.gov > > Date: Fri, 30 Aug 2013 21:11:36 -0500 > > CC: petsc-users at mcs.anl.gov > > To: potaman at outlook.com > > > > > > I would not expect this. Are you sure that direct solvers are actually working well? You can run -ksp_type gmres -pc_type lu to "accelerate" the direct solver. Under normal circumstances the direct solver will converge in one iteration but if, for example, the problem is singular or nearly singular several iterations might solve the linear system but one may not. 
> > > > Barry > > > > On Aug 30, 2013, at 9:08 PM, subramanya sadasiva wrote: > > > > > Hi, > > > Is there a reason that the convergence of the newton methods in SNES is much better with iterative solvers instead of with direct solvers? This is particularly in reference with the VI solvers. I get nearly quadratic convergence with the iterative solvers, but the code just diverges or converges linearly when i run with -ksp_type preonly . > > > Thanks, > > > Subramanya > > From potaman at outlook.com Fri Aug 30 21:35:57 2013 From: potaman at outlook.com (subramanya sadasiva) Date: Fri, 30 Aug 2013 22:35:57 -0400 Subject: [petsc-users] Behaviour of Newton Methods with Direct Solvers. In-Reply-To: <36E850C7-D37D-4B3C-AE12-905636D9438A@mcs.anl.gov> References: , , <36E850C7-D37D-4B3C-AE12-905636D9438A@mcs.anl.gov> Message-ID: Hi Barry, I have attached output from the solver with the options that you suggested. As you can see the direct solver causes the nonlinear solver to diverge - but the code converges easily with gmresThanks, Subramanya > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers. > From: bsmith at mcs.anl.gov > Date: Fri, 30 Aug 2013 21:16:33 -0500 > CC: petsc-users at mcs.anl.gov > To: potaman at outlook.com > > > On Aug 30, 2013, at 9:14 PM, subramanya sadasiva wrote: > > > Hi Barry, > > The linear solvers converge in 1 iterations. But the SNES solver fails when I use the direct solver. > > Hmm. I don't have an explanation for this. Could you send some output when running with -snes_monitor -ksp_monitor_true_residual for both the iterative case and the direct solver case? > > Barry > > > Thanks, > > Subramanya > > > > > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers. > > > From: bsmith at mcs.anl.gov > > > Date: Fri, 30 Aug 2013 21:11:36 -0500 > > > CC: petsc-users at mcs.anl.gov > > > To: potaman at outlook.com > > > > > > > > > I would not expect this. Are you sure that direct solvers are actually working well? You can run -ksp_type gmres -pc_type lu to "accelerate" the direct solver. Under normal circumstances the direct solver will converge in one iteration but if, for example, the problem is singular or nearly singular several iterations might solve the linear system but one may not. > > > > > > Barry > > > > > > On Aug 30, 2013, at 9:08 PM, subramanya sadasiva wrote: > > > > > > > Hi, > > > > Is there a reason that the convergence of the newton methods in SNES is much better with iterative solvers instead of with direct solvers? This is particularly in reference with the VI solvers. I get nearly quadratic convergence with the iterative solvers, but the code just diverges or converges linearly when i run with -ksp_type preonly . > > > > Thanks, > > > > Subramanya > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: snes_vi_with_direct_solver.rtf Type: text/richtext Size: 3310 bytes Desc: not available URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: snes_vi_with_gmres.txt URL: From knepley at gmail.com Fri Aug 30 21:41:00 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 30 Aug 2013 21:41:00 -0500 Subject: [petsc-users] Behaviour of Newton Methods with Direct Solvers. 
In-Reply-To: 
References: <36E850C7-D37D-4B3C-AE12-905636D9438A@mcs.anl.gov>
Message-ID: 

On Fri, Aug 30, 2013 at 9:35 PM, subramanya sadasiva <potaman at outlook.com> wrote:

> Hi Barry,
> I have attached output from the solver with the options that you suggested. As you can see the direct solver causes the nonlinear solver to diverge - but the code converges easily with gmres
>

Your Jacobian is wrong. The iterative solver does not solve it very accurately, and thus is able to get into the basin of convergence. You can try and verify this by using -snes_fd on a small problem (since it's so slow).

  Thanks,

     Matt

> Thanks,
> Subramanya
>
> > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers.
> > From: bsmith at mcs.anl.gov
> > Date: Fri, 30 Aug 2013 21:16:33 -0500
> > CC: petsc-users at mcs.anl.gov
> > To: potaman at outlook.com
> >
> > On Aug 30, 2013, at 9:14 PM, subramanya sadasiva <potaman at outlook.com> wrote:
> >
> > > Hi Barry,
> > > The linear solvers converge in 1 iteration. But the SNES solver fails when I use the direct solver.
> >
> > Hmm. I don't have an explanation for this. Could you send some output when running with -snes_monitor -ksp_monitor_true_residual for both the iterative case and the direct solver case?
> >
> > Barry
> >
> > > Thanks,
> > > Subramanya
> > >
> > > > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers.
> > > > From: bsmith at mcs.anl.gov
> > > > Date: Fri, 30 Aug 2013 21:11:36 -0500
> > > > CC: petsc-users at mcs.anl.gov
> > > > To: potaman at outlook.com
> > > >
> > > > I would not expect this. Are you sure that direct solvers are actually working well? You can run -ksp_type gmres -pc_type lu to "accelerate" the direct solver. Under normal circumstances the direct solver will converge in one iteration but if, for example, the problem is singular or nearly singular several iterations might solve the linear system but one may not.
> > > >
> > > > Barry
> > > >
> > > > On Aug 30, 2013, at 9:08 PM, subramanya sadasiva <potaman at outlook.com> wrote:
> > > >
> > > > > Hi,
> > > > > Is there a reason that the convergence of the Newton methods in SNES is much better with iterative solvers than with direct solvers? This is particularly in reference to the VI solvers. I get nearly quadratic convergence with the iterative solvers, but the code just diverges or converges linearly when I run with -ksp_type preonly.
> > > > > Thanks,
> > > > > Subramanya

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

From bsmith at mcs.anl.gov Fri Aug 30 21:43:07 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 30 Aug 2013 21:43:07 -0500
Subject: [petsc-users] Behaviour of Newton Methods with Direct Solvers.
In-Reply-To: 
References: , , <36E850C7-D37D-4B3C-AE12-905636D9438A@mcs.anl.gov>
Message-ID: 

  0 SNES Function norm 4.688596679512e-01
  Linear solve converged due to CONVERGED_ITS iterations 1

I don't trust this. Change it to allow several KSP iterations instead of one and use -ksp_monitor_true_residual to see if the linear system is actually solved accurately.

Also I agree with Matt's email: it is possible the Jacobian is wrong, and hence even if the linear solve is accurate Newton may not converge.
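Barry's and Matt's suggestions combined as concrete command lines (the executable name and the iteration cap are placeholders):

  ./app -snes_monitor -ksp_type gmres -pc_type lu -ksp_max_it 30 -ksp_monitor_true_residual
  ./app -snes_monitor -snes_fd    # small problem only; uses a finite-difference Jacobian instead of the hand-coded one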
Barry On Aug 30, 2013, at 9:35 PM, subramanya sadasiva wrote: > Hi Barry, > I have attached output from the solver with the options that you suggested. As you can see the direct solver causes the nonlinear solver to diverge - but the code converges easily with gmres > Thanks, > Subramanya > > > > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers. > > From: bsmith at mcs.anl.gov > > Date: Fri, 30 Aug 2013 21:16:33 -0500 > > CC: petsc-users at mcs.anl.gov > > To: potaman at outlook.com > > > > > > On Aug 30, 2013, at 9:14 PM, subramanya sadasiva wrote: > > > > > Hi Barry, > > > The linear solvers converge in 1 iterations. But the SNES solver fails when I use the direct solver. > > > > Hmm. I don't have an explanation for this. Could you send some output when running with -snes_monitor -ksp_monitor_true_residual for both the iterative case and the direct solver case? > > > > Barry > > > > > Thanks, > > > Subramanya > > > > > > > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers. > > > > From: bsmith at mcs.anl.gov > > > > Date: Fri, 30 Aug 2013 21:11:36 -0500 > > > > CC: petsc-users at mcs.anl.gov > > > > To: potaman at outlook.com > > > > > > > > > > > > I would not expect this. Are you sure that direct solvers are actually working well? You can run -ksp_type gmres -pc_type lu to "accelerate" the direct solver. Under normal circumstances the direct solver will converge in one iteration but if, for example, the problem is singular or nearly singular several iterations might solve the linear system but one may not. > > > > > > > > Barry > > > > > > > > On Aug 30, 2013, at 9:08 PM, subramanya sadasiva wrote: > > > > > > > > > Hi, > > > > > Is there a reason that the convergence of the newton methods in SNES is much better with iterative solvers instead of with direct solvers? This is particularly in reference with the VI solvers. I get nearly quadratic convergence with the iterative solvers, but the code just diverges or converges linearly when i run with -ksp_type preonly . > > > > > Thanks, > > > > > Subramanya > > > > > > > From potaman at outlook.com Fri Aug 30 22:18:41 2013 From: potaman at outlook.com (subramanya sadasiva) Date: Fri, 30 Aug 2013 23:18:41 -0400 Subject: [petsc-users] Behaviour of Newton Methods with Direct Solvers. In-Reply-To: References: , , <36E850C7-D37D-4B3C-AE12-905636D9438A@mcs.anl.gov> , Message-ID: This is very odd as the matrix that I have is essentially the same as the matrix in ex54.C So I am pretty certain that my jacobian is correct. > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers. > From: bsmith at mcs.anl.gov > Date: Fri, 30 Aug 2013 21:43:07 -0500 > CC: petsc-users at mcs.anl.gov > To: potaman at outlook.com > > > \f0\fs24 \cf0 0 SNES Function norm 4.688596679512e-01\ > Linear solve converged due to CONVERGED_ITS iterations 1\ > > I don't trust this. Change it to allow several KSP iterations instead of one and use -ksp_monitor_true_residual to see if the linear is actually solved accurately. > > Also I agree with Matt's email, it is possible the Jacobian is wrong and hence even if the linear solve is accurate Newton may converge. > > Barry > > > On Aug 30, 2013, at 9:35 PM, subramanya sadasiva wrote: > > > Hi Barry, > > I have attached output from the solver with the options that you suggested. 
As you can see the direct solver causes the nonlinear solver to diverge - but the code converges easily with gmres.
>
> Thanks,
> Subramanya
>
> > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers.
> > From: bsmith at mcs.anl.gov
> > Date: Fri, 30 Aug 2013 21:16:33 -0500
> > CC: petsc-users at mcs.anl.gov
> > To: potaman at outlook.com
> >
> > On Aug 30, 2013, at 9:14 PM, subramanya sadasiva <potaman at outlook.com> wrote:
> >
> > > Hi Barry,
> > > The linear solvers converge in 1 iteration. But the SNES solver fails when I use the direct solver.
> >
> > Hmm. I don't have an explanation for this. Could you send some output when running with -snes_monitor -ksp_monitor_true_residual for both the iterative case and the direct solver case?
> >
> > Barry
> >
> > > Thanks,
> > > Subramanya
> > >
> > > > Subject: Re: [petsc-users] Behaviour of Newton Methods with Direct Solvers.
> > > > From: bsmith at mcs.anl.gov
> > > > Date: Fri, 30 Aug 2013 21:11:36 -0500
> > > > CC: petsc-users at mcs.anl.gov
> > > > To: potaman at outlook.com
> > > >
> > > > I would not expect this. Are you sure that direct solvers are actually working well? You can run -ksp_type gmres -pc_type lu to "accelerate" the direct solver. Under normal circumstances the direct solver will converge in one iteration but if, for example, the problem is singular or nearly singular several iterations might solve the linear system but one may not.
> > > >
> > > > Barry
> > > >
> > > > On Aug 30, 2013, at 9:08 PM, subramanya sadasiva <potaman at outlook.com> wrote:
> > > >
> > > > > Hi,
> > > > > Is there a reason that the convergence of the Newton methods in SNES is much better with iterative solvers than with direct solvers? This is particularly in reference to the VI solvers. I get nearly quadratic convergence with the iterative solvers, but the code just diverges or converges linearly when I run with -ksp_type preonly.
> > > > > Thanks,
> > > > > Subramanya

From stali at geology.wisc.edu Sat Aug 31 11:25:29 2013
From: stali at geology.wisc.edu (Tabrez Ali)
Date: Sat, 31 Aug 2013 11:25:29 -0500
Subject: [petsc-users] GAMG and linear elasticity
In-Reply-To: <87bo4i99gx.fsf@mcs.anl.gov>
References: <521D01E6.7070003@geology.wisc.edu> <87bo4i99gx.fsf@mcs.anl.gov>
Message-ID: <522218F9.7090805@geology.wisc.edu>

Hello

So I used PCSetCoordinates and now GAMG seems to work really well, in that the number of iterations is relatively constant. Here are the numbers of iterations on 4 cores:

DOF      ASM   GAMG
2187      15     22
14739     26     22
107811    51     29

So in PCSetCoordinates, should the 'coords' array include values for the ghost nodes as well, or only those values that correspond to the locally owned sol'n vector? In the experiment above I included values for the ghost nodes as well (just had to add a line in my existing code) and it seems to have worked fine.

Thanks in advance

Tabrez

On 08/27/2013 03:15 PM, Jed Brown wrote:
> Tabrez Ali writes:
>
>> Hello
>>
>> What is the proper way to use GAMG on a vanilla 3D linear elasticity problem. Should I use
>>
>> -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1
>
> Yeah, and only the first of these is needed because the others are default with -pc_type gamg.
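For reference, the call being discussed looks roughly like this; ksp, nlocal, and coords are placeholders, and as Mark's reply below confirms, coords should hold only the locally owned vertices:

  PC        pc;
  PetscReal *coords; /* x0,y0,z0, x1,y1,z1, ... for the nlocal owned vertices */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);
  ierr = PCSetCoordinates(pc, 3, nlocal, coords);CHKERRQ(ierr);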
>> -pc_type fieldsplit -pc_fieldsplit_block_size 3 -fieldsplit_pc_type gamg
>> -fieldsplit_pc_gamg_type agg -fieldsplit_pc_gamg_agg_nsmooths 1
>>
>> Do these options even make sense? With the second set of options the % increase in number of iterations with increasing problem size is lower than the first but not optimal.
>
> And it's probably more expensive because it has to do inner solves. Also, if you have less compressible regions, it will get much worse.
>
>> Also, ksp/ksp/examples/ex56 performs much better in that the number of iterations remains more or less constant, unlike what I see with my own problem. What am I doing wrong?
>
> You probably forgot to set the near null space. You can use MatSetNearNullSpace (and maybe MatNullSpaceCreateRigidBody) or the more hacky (IMO) PCSetCoordinates. It's important to have translational *and* rotational modes in the near null space that GAMG uses to build a coarse space.

--
No one trusts a model except the one who wrote it; Everyone trusts an observation except the one who made it - Harlow Shapley

From u.tabak at tudelft.nl Sat Aug 31 15:06:47 2013
From: u.tabak at tudelft.nl (Umut Tabak)
Date: Sat, 31 Aug 2013 22:06:47 +0200
Subject: [petsc-users] GAMG and linear elasticity
In-Reply-To: <522218F9.7090805@geology.wisc.edu>
References: <521D01E6.7070003@geology.wisc.edu> <87bo4i99gx.fsf@mcs.anl.gov> <522218F9.7090805@geology.wisc.edu>
Message-ID: <52224CD7.8080406@tudelft.nl>

On 08/31/2013 06:25 PM, Tabrez Ali wrote:
> Hello
>
> So I used PCSetCoordinates and now GAMG seems to work really well, in that the number of iterations is relatively constant. Here are the numbers of iterations on 4 cores:
>
> DOF      ASM   GAMG
> 2187      15     22
> 14739     26     22
> 107811    51     29

Hi,
Just curious: what is the approximate condition number of this matrix? And is it modelled with domain elements like solids, or with thin elements like shells?
Best,
U.

From mfadams at lbl.gov Sat Aug 31 16:06:59 2013
From: mfadams at lbl.gov (Mark F. Adams)
Date: Sat, 31 Aug 2013 17:06:59 -0400
Subject: [petsc-users] GAMG and linear elasticity
In-Reply-To: <522218F9.7090805@geology.wisc.edu>
References: <521D01E6.7070003@geology.wisc.edu> <87bo4i99gx.fsf@mcs.anl.gov> <522218F9.7090805@geology.wisc.edu>
Message-ID: <517B410A-A301-4E9C-A827-E5BFFBA17B1A@lbl.gov>

On Aug 31, 2013, at 12:25 PM, Tabrez Ali <stali at geology.wisc.edu> wrote:

> Hello
>
> So I used PCSetCoordinates and now GAMG seems to work really well, in that the number of iterations is relatively constant. Here are the numbers of iterations on 4 cores
>
> DOF      ASM   GAMG
> 2187      15     22
> 14739     26     22
> 107811    51     29
>
> So in PCSetCoordinates, should the 'coords' array include values for the ghost nodes as well, or only those values that correspond to the locally owned sol'n vector?

Local only.

> In the experiment above I included values for the ghost nodes as well (just had to add a line in my existing code) and it seems to have worked fine.

You tacked it onto the end of the array and so no harm done; we just did not read it.

And you might want to use MatNullSpaceCreateRigidBody to create these vectors from the coordinates. This would add one extra step, but it 1) is the preferred way and 2) it sounds like you want to do something like Stokes, and you could run with modified vectors from MatNullSpaceCreateRigidBody to do an all-MG solver (and dump this fieldsplit crap :). SOR smoothers on inode matrices are actually vertex-blocked smoothers, and so they are stable even though they have a zero on the diagonal (just order pressure last).
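A sketch of the near-null-space route Mark recommends here; coordVec stands for a vector of the owned vertex coordinates with block size 3, and A for the assembled stiffness matrix:

  Vec          coordVec;
  MatNullSpace nearnull;
  /* coordVec: set VecSetBlockSize(coordVec, 3), entries (x,y,z) per owned vertex */
  ierr = MatNullSpaceCreateRigidBody(coordVec, &nearnull);CHKERRQ(ierr);
  ierr = MatSetNearNullSpace(A, nearnull);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);

GAMG then builds its coarse space from the three translational and three rotational modes.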
I think Jed mentioned this to you, but specifically you can take the vectors that come out of MatNullSpaceCreateRigidBody and think of them as a tall skinny matrix: 3*n x 6. For the 3x6 block for each of the (n) vertices, call it Q, create a 4x7 matrix:

Q  0
0  1.0

and give that to GAMG (i.e., 7 vectors of size 4*n). It would be very interesting to see how this works compared to fieldsplit. Oh, and pressure has to be a vertex variable.

> Thanks in advance
>
> Tabrez
>
> On 08/27/2013 03:15 PM, Jed Brown wrote:
>> Tabrez Ali writes:
>>
>>> Hello
>>>
>>> What is the proper way to use GAMG on a vanilla 3D linear elasticity problem. Should I use
>>>
>>> -pc_type gamg -pc_gamg_type agg -pc_gamg_agg_nsmooths 1
>>
>> Yeah, and only the first of these is needed because the others are default with -pc_type gamg.
>>
>>> -pc_type fieldsplit -pc_fieldsplit_block_size 3 -fieldsplit_pc_type gamg
>>> -fieldsplit_pc_gamg_type agg -fieldsplit_pc_gamg_agg_nsmooths 1
>>>
>>> Do these options even make sense? With the second set of options the % increase in number of iterations with increasing problem size is lower than the first but not optimal.
>>
>> And it's probably more expensive because it has to do inner solves. Also, if you have less compressible regions, it will get much worse.
>>
>>> Also, ksp/ksp/examples/ex56 performs much better in that the number of iterations remains more or less constant, unlike what I see with my own problem. What am I doing wrong?
>>
>> You probably forgot to set the near null space. You can use MatSetNearNullSpace (and maybe MatNullSpaceCreateRigidBody) or the more hacky (IMO) PCSetCoordinates. It's important to have translational *and* rotational modes in the near null space that GAMG uses to build a coarse space.
>
> --
> No one trusts a model except the one who wrote it; Everyone trusts an observation except the one who made it - Harlow Shapley

From Shuangshuang.Jin at pnnl.gov Sat Aug 31 17:52:52 2013
From: Shuangshuang.Jin at pnnl.gov (Jin, Shuangshuang)
Date: Sat, 31 Aug 2013 15:52:52 -0700
Subject: [petsc-users] Performance of PETSc TS solver
In-Reply-To: <87r4dayboh.fsf@mcs.anl.gov>
Message-ID: 

Hi, Jed, I think you have a good point here. The load imbalance might be a big problem for us, since the Jacobian matrix is not symmetric, and the distributed computation of each part of the Jacobian matrix elements on different processors can vary a lot. However, that's what the matrix looks like. Do we have any control over that? And what do you mean by "distribute the work for residual evaluation better"? I think I can only distribute the IFunction and IJacobian computation, but have no control over the residual evaluation. Isn't it a black box inside TS?

For the gprof Barry suggested, I tried to compile with gcc -pg in the sequential mode, but couldn't create the gmon.out file after running the executable...

Thanks,
Shuangshuang


On 8/30/13 4:57 PM, "Jed Brown" <jedbrown at mcs.anl.gov> wrote:

"Jin, Shuangshuang" writes:

> Hello, I'm trying to update some of my status here. I just managed to "_distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of the elements of the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given that the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288).
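On the gmon.out question above: one common cause, stated here as an assumption about this particular setup, is that -pg must be passed at link time as well as compile time, and the program must exit normally; the file names below are placeholders:

  gcc -pg -O2 -c myprog.c
  gcc -pg -o myprog myprog.o ...   # -pg at link time too
  ./myprog                         # gmon.out is written on normal exit
  gprof ./myprog gmon.out > profile.txt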
> > I looked at the log profile again, and saw that most of the computation time are still for Functioan Eval and Jacobian Eval: > > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279 > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0 The load imbalance is pretty significant here, so maybe you can distribute the work for residual evaluation better? > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > Warning -- total time of even greater than time of entire stage -- something is wrong with the timer SNESSolve contains the Jacobian and residual evaluations, as well as KSPSolve. Pretty much all the cost is in those three things. > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281 > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1 > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0 > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139 > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450 > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680 > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159 > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217 > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481 > > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 > seconds each, however, the KSPSolve only took 2.3 seconds, which is 10 > times faster. Is this normal? Do you have any more suggestion on how > to reduce the FunctionEval and JacobianEval time? It means that the linear systems are easy to solve (probably because they are small), but the IFunction and IJacobian are expensive. As Barry says, you might be able to speed it up by sequential optimization. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Sat Aug 31 18:10:04 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sat, 31 Aug 2013 16:10:04 -0700 Subject: [petsc-users] Performance of PETSc TS solver In-Reply-To: References: <87r4dayboh.fsf@mcs.anl.gov> Message-ID: You can choose the number of rows per process so that each has about the same number of entries. "Residual" meant IFunction and/or RHSFunction, when applicable. On Aug 31, 2013 3:53 PM, "Jin, Shuangshuang" wrote: > Hi, Jed, I think you have a good point here. The load imbalance might be > a big problem for us, since the Jaociban matrix is not symmetric, and the > distributed computation of each part of the Jacobian matrix elements on > different processor can vary a lot. However, that?s what the matrix looks > like. Do we have any control over that? And what do you mean by ?distribute > the work for residual evaluation better?? I think I can only distribute the > Ifunction and Ijacobian computation, but have no control of residual > evaluation. Isn?t it a black box inside TS? 
>
> For the gprof Barry suggested, I tried to compile with gcc -pg in the sequential mode, but couldn't create the gmon.out file after running the executable...
>
> Thanks,
> Shuangshuang
>
>
> On 8/30/13 4:57 PM, "Jed Brown" <jedbrown at mcs.anl.gov> wrote:
>
> "Jin, Shuangshuang" writes:
>
> > Hello, I'm trying to update some of my status here. I just managed to "_distribute_ the work of computing the Jacobian matrix" as you suggested, so each processor only computes a part of the elements of the Jacobian matrix instead of a global Jacobian matrix. I observed a reduction of the computation time from 351 seconds to 55 seconds, which is much better but still slower than I expected given that the problem size is small. (4n functions in IFunction, and 4n*4n Jacobian matrix in IJacobian, n = 288).
> >
> > I looked at the log profile again, and saw that most of the computation time is still spent in Function Eval and Jacobian Eval:
> >
> > TSStep 600 1.0 5.6103e+01 1.0 9.42e+0825.6 3.0e+06 2.9e+02 7.0e+04 93100 99 99 92 152100 99 99110 279
> > TSFunctionEval 2996 1.0 2.9608e+01 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+04 30 0 0 0 39 50 0 0 0 47 0
> > TSJacobianEval 1796 1.0 2.3436e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0
> > Warning -- total time of event greater than time of entire stage -- something is wrong with the timer
> > SNESSolve 600 1.0 5.5692e+01 1.1 9.42e+0825.7 3.0e+06 2.9e+02 6.4e+04 88100 99 99 84 144100 99 99101 281
> > SNESFunctionEval 2396 1.0 2.3715e+01 3.4 1.04e+06 1.0 0.0e+00 0.0e+00 2.4e+04 25 0 0 0 31 41 0 0 0 38 1
> > SNESJacobianEval 1796 1.0 2.3447e+01 1.0 0.00e+00 0.0 5.4e+02 3.8e+01 1.3e+04 39 0 0 0 16 64 0 0 0 20 0
> > SNESLineSearch 1796 1.0 1.8313e+01 1.0 1.54e+0831.4 4.9e+05 2.9e+02 2.5e+04 30 16 16 16 33 50 16 16 16 39 139
> > KSPGMRESOrthog 9090 1.0 1.1399e+00 4.1 1.60e+07 1.0 0.0e+00 0.0e+00 9.1e+03 1 3 0 0 12 2 3 0 0 14 450
> > KSPSetUp 3592 1.0 2.8342e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+01 0 0 0 0 0 0 0 0 0 0 0
> > KSPSolve 1796 1.0 2.3052e+00 1.0 7.87e+0825.2 2.5e+06 2.9e+02 2.0e+04 4 84 83 83 26 6 84 83 83 31 5680
> > PCSetUp 3592 1.0 9.1255e-02 1.7 6.47e+05 2.5 0.0e+00 0.0e+00 1.8e+01 0 0 0 0 0 0 0 0 0 0 159
> > PCSetUpOnBlocks 1796 1.0 6.6802e-02 2.3 6.47e+05 2.5 0.0e+00 0.0e+00 1.2e+01 0 0 0 0 0 0 0 0 0 0 217
> > PCApply 10886 1.0 2.6064e-01 1.3 4.70e+06 1.5 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 1 1 0 0 0 481
> >
> > I was wondering why SNESFunctionEval and SNESJacobianEval took over 23 seconds each, while KSPSolve took only 2.3 seconds, which is 10 times faster. Is this normal? Do you have any more suggestions on how to reduce the FunctionEval and JacobianEval time?
>
> It means that the linear systems are easy to solve (probably because they are small), but the IFunction and IJacobian are expensive. As Barry says, you might be able to speed it up by sequential optimization.
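A sketch of the row-balancing idea in Jed's last message: instead of equal row counts (PETSC_DECIDE), choose contiguous row blocks with roughly equal total cost, where cost[i] estimates the work for row i (for example its number of nonzeros, or a measured evaluation time); M, A, and cost are illustrative names:

  PetscMPIInt rank, size;
  PetscInt    *split, i, r = 1, total = 0, acc = 0;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);
  ierr = PetscMalloc((size+1)*sizeof(PetscInt), &split);CHKERRQ(ierr);
  for (i = 0; i < M; i++) total += cost[i];
  split[0] = 0;
  for (i = 0; i < M && r < size; i++) {
    acc += cost[i];
    if (acc*size >= r*total) split[r++] = i + 1; /* close block r once its share of the cost is reached */
  }
  while (r <= size) split[r++] = M;
  /* rank owns rows [split[rank], split[rank+1]) */
  ierr = MatSetSizes(A, split[rank+1]-split[rank], split[rank+1]-split[rank], M, M);CHKERRQ(ierr);
  /* use the same local sizes for the X and F vectors via VecSetSizes() */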