From dharmareddy84 at gmail.com Thu May 2 03:46:06 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Thu, 2 May 2013 03:46:06 -0500
Subject: [petsc-users] petsc, doxygen, fortran
Message-ID: 

Hello,
   Not a fully petsc-related question; I was wondering if anyone had experience with this. I am trying to generate documentation using doxygen for my FORTRAN code which uses petsc, and I see that petsc data types do not appear in the documentation. Any idea how to fix this? I think if I use type(Mat) instead of Mat, it may work.

For example:

type test_t
  integer :: int
  Mat :: B
end type test_t

The doxygen documentation does not show the variable B in type test_t.

Thanks
Reddy

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From abdullateef.hajiali at kaust.edu.sa Thu May 2 07:04:15 2013
From: abdullateef.hajiali at kaust.edu.sa (Abdul-Lateef Haji-Ali)
Date: Thu, 2 May 2013 15:04:15 +0300
Subject: [petsc-users] SNES solvers
Message-ID: 

Greetings,

I am trying to solve a system of 4 non-linear equations to find 4 unknowns. The equations are slightly complicated and unstable for slightly large values of the unknowns. In most of the cases the correct solution is found. However, sometimes the values that are fed to my residual function are big and nan/inf values are produced.

This is the output of my residual function (x are the parameters and f is the computed residual):
x= [1, 1, 2, 4]; f= [-1.32993, 6.46116, 0.490443, -4.47854];
x= [2.32993, -5.46116, 1.50956, 8.47854]; f= [-1.77078, -133.872, 4.26217, 71.9728];
x= [1.06488, 0.684771, 1.97607, 4.2185]; f= [-0.431978, 2.79625, 0.190123, 0.0503181];
x= [4.10071, 128.411, -2.75261, -63.4942]; f= [62.0141, 2548.28, -95.0523, -1354.74];
x= [1.74294, 0.960827, 1.63558, 4.54888]; f= [19.6641, 4.52508, -9.48464, 7.44219];
x= [-57.9134, -2419.86, 92.2997, 1291.24]; f= [inf, -inf, inf, -inf];
This was using ngmres; I noticed similar behaviour with other types.

What can I do to improve the solver? Note that my experience with petsc is new and limited.

Thank you,
-- 
Abdul Lateef Haji Ali
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com Thu May 2 07:33:21 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 2 May 2013 07:33:21 -0500
Subject: [petsc-users] SNES solvers
In-Reply-To: 
References: 
Message-ID: 

On Thu, May 2, 2013 at 7:04 AM, Abdul-Lateef Haji-Ali < abdullateef.hajiali at kaust.edu.sa> wrote:

> Greetings,
>
> I am trying to solve a system of 4 non-linear equations to find 4 unknowns.
> The equations are slightly complicated and unstable for slightly large values of the unknowns.
> In most of the cases the correct solution is found. However, sometimes the values that are fed to my residual function are big and nan/inf values are produced.
>
It could be overflow, so scaling the equations may help. However, it could also be that the math is poorly organized so that you create huge intermediate values (this seems likely since all your other values are quite small).
   Matt

> This is the output of my residual function (x are the parameters and f is the computed residual)
> x= [1, 1, 2, 4]; f= [-1.32993, 6.46116, 0.490443, -4.47854];
> x= [2.32993, -5.46116, 1.50956, 8.47854]; f= [-1.77078, -133.872, 4.26217, 71.9728];
> x= [1.06488, 0.684771, 1.97607, 4.2185]; f= [-0.431978, 2.79625, 0.190123, 0.0503181];
> x= [4.10071, 128.411, -2.75261, -63.4942]; f= [62.0141, 2548.28, -95.0523, -1354.74];
> x= [1.74294, 0.960827, 1.63558, 4.54888]; f= [19.6641, 4.52508, -9.48464, 7.44219];
> x= [-57.9134, -2419.86, 92.2997, 1291.24]; f= [inf, -inf, inf, -inf];
> This was using ngmres; I noticed similar behaviour with other types.
>
> What can I do to improve the solver?
> Note that my experience with petsc is new and limited.
>
> Thank you,
>
> --
> Abdul Lateef Haji Ali
>

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From nico.schloemer at gmail.com Thu May 2 09:01:18 2013
From: nico.schloemer at gmail.com (=?ISO-8859-1?Q?Nico_Schl=F6mer?=)
Date: Thu, 2 May 2013 16:01:18 +0200
Subject: [petsc-users] vector-valued Laplace solver (Navier-Stokes): DIVERGED_INDEFINITE_MAT?
Message-ID: 

Hi all,

I'm trying to solve a discretization of the PDE in weak form

rho/tau u - mu \Delta u = f

where u is vector-valued (let's say in 2D -- this comes from a Navier--Stokes problem). Some Dirichlet boundary conditions come with it, too.

After translation into weak form,

rho/tau * inner(u, v) + mu * inner(grad(u), grad(v)) = inner(f, v)

I'm solving this with PETSc's CG and hypre_amg.
What I get is
>
> 0 KSP preconditioned resid norm 4.962223194957e+30 true resid norm 2.364095175749e-02 ||r(i)||/||b|| 1.000000000000e+00
> 1 KSP preconditioned resid norm 7.089043065444e+19 true resid norm 2.289113027906e-02 ||r(i)||/||b|| 9.682829402926e-01
>
> Without preconditioning, I'm getting
>
> 0 KSP preconditioned resid norm 2.364095175749e-02 true resid norm 2.364095175749e-02 ||r(i)||/||b|| 1.000000000000e+00
> 1 KSP preconditioned resid norm 4.415430823612e-02 true resid norm 4.415430823612e-02 ||r(i)||/||b|| 1.867704341562e+00
> 2 KSP preconditioned resid norm 1.077641425707e-01 true resid norm 1.077641425707e-01 ||r(i)||/||b|| 4.558367348159e+00
>
> and DIVERGED_INDEFINITE_MAT.
>
> Does anyone else have experience with this sort of problem? Any obvious mistakes?
>
Do you have any non-symmetries in your discretization? With the standard P_1 basis, that operator is symmetric.

   Matt

> --Nico
>
>
>

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From nico.schloemer at gmail.com Thu May 2 12:36:07 2013
From: nico.schloemer at gmail.com (=?ISO-8859-1?Q?Nico_Schl=F6mer?=)
Date: Thu, 2 May 2013 19:36:07 +0200
Subject: Re: [petsc-users] vector-valued Laplace solver (Navier-Stokes): DIVERGED_INDEFINITE_MAT?
In-Reply-To: 
References: 
Message-ID: 

The boundary conditions weren't applied correctly, such that the operator was indeed (slightly) nonsymmetric. Seems to work now. Thanks for the hint!

--Nico

On Thu, May 2, 2013 at 4:33 PM, Matthew Knepley wrote:

> On Thu, May 2, 2013 at 9:01 AM, Nico Schlömer wrote:
>
>> Hi all,
>>
>> I'm trying to solve a discretization of the PDE in weak form
>>
>> rho/tau u - mu \Delta u = f
>>
>> where u is vector-valued (let's say in 2D -- this comes from a Navier--Stokes problem). Some Dirichlet boundary conditions come with it, too.
>>
>> After translation into weak form,
>>
>> rho/tau * inner(u, v) + mu * inner(grad(u), grad(v)) = inner(f, v)
>>
>> I'm solving this with PETSc's CG and hypre_amg.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov Thu May 2 12:52:04 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 2 May 2013 12:52:04 -0500
Subject: Re: [petsc-users] SNES solvers
In-Reply-To: 
References: 
Message-ID: <388647A7-9BF5-4425-880E-768109CAB26F@mcs.anl.gov>

   Have you tried Newton's method? Do you have the Jacobian? Since it is a small problem you can compute the Jacobian with differences trivially: just create a dense matrix of size 4 with MatCreateSeqDense() and pass it, along with SNESComputeJacobianDefault, to SNESSetJacobian().

   Barry

On May 2, 2013, at 7:04 AM, Abdul-Lateef Haji-Ali wrote:

> Greetings,
>
> I am trying to solve a system of 4 non-linear equations to find 4 unknowns.
> The equations are slightly complicated and unstable for slightly large values of the unknowns.
> In most of the cases the correct solution is found. However, sometimes the values that are fed to my residual function are big and nan/inf values are produced.
>
> This is the output of my residual function (x are the parameters and f is the computed residual)
> x= [1, 1, 2, 4]; f= [-1.32993, 6.46116, 0.490443, -4.47854];
> x= [2.32993, -5.46116, 1.50956, 8.47854]; f= [-1.77078, -133.872, 4.26217, 71.9728];
> x= [1.06488, 0.684771, 1.97607, 4.2185]; f= [-0.431978, 2.79625, 0.190123, 0.0503181];
> x= [4.10071, 128.411, -2.75261, -63.4942]; f= [62.0141, 2548.28, -95.0523, -1354.74];
> x= [1.74294, 0.960827, 1.63558, 4.54888]; f= [19.6641, 4.52508, -9.48464, 7.44219];
> x= [-57.9134, -2419.86, 92.2997, 1291.24]; f= [inf, -inf, inf, -inf];
> This was using ngmres; I noticed similar behaviour with other types.
>
> What can I do to improve the solver?
> Note that my experience with petsc is new and limited.
>
> Thank you,
> --
> Abdul Lateef Haji Ali

From knepley at gmail.com Thu May 2 15:39:17 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 2 May 2013 15:39:17 -0500
Subject: Re: [petsc-users] vector-valued Laplace solver (Navier-Stokes): DIVERGED_INDEFINITE_MAT?
In-Reply-To: 
References: 
Message-ID: 

On Thu, May 2, 2013 at 12:36 PM, Nico Schlömer wrote:

> The boundary conditions weren't applied correctly, such that the operator
> was indeed (slightly) nonsymmetric. Seems to work now. Thanks for the hint!
>
I have recently used GAMG on a problem like this with very good results. You can test it against Hypre using

  -pc_type gamg -pc_gamg_agg_nsmooths 1

I would be interested to hear about the effectiveness.

  Thanks,

     Matt

> --Nico
>
> On Thu, May 2, 2013 at 4:33 PM, Matthew Knepley wrote:
>
>> On Thu, May 2, 2013 at 9:01 AM, Nico Schlömer wrote:
>>
>>> Hi all,
>>>
>>> I'm trying to solve a discretization of the PDE in weak form
>>>
>>> rho/tau u - mu \Delta u = f
>>>
>>> where u is vector-valued (let's say in 2D -- this comes from a Navier--Stokes problem). Some Dirichlet boundary conditions come with it, too.
>>>
>>> After translation into weak form,
>>>
>>> rho/tau * inner(u, v) + mu * inner(grad(u), grad(v)) = inner(f, v)
>>>
>>> I'm solving this with PETSc's CG and hypre_amg.
What I get is >>> >>> 0 KSP preconditioned resid norm 4.962223194957e+30 true resid norm >>> 2.364095175749e-02 ||r(i)||/||b|| 1.000000000000e+00 >>> 1 KSP preconditioned resid norm 7.089043065444e+19 true resid norm >>> 2.289113027906e-02 ||r(i)||/||b|| 9.682829402926e-01 >>> >>> Without preconditioning, I'm getting >>> >>> 0 KSP preconditioned resid norm 2.364095175749e-02 true resid norm >>> 2.364095175749e-02 ||r(i)||/||b|| 1.000000000000e+00 >>> 1 KSP preconditioned resid norm 4.415430823612e-02 true resid norm >>> 4.415430823612e-02 ||r(i)||/||b|| 1.867704341562e+00 >>> 2 KSP preconditioned resid norm 1.077641425707e-01 true resid norm >>> 1.077641425707e-01 ||r(i)||/||b|| 4.558367348159e+00 >>> >>> and DIVERGED_INDEFINITE_MAT. >>> >>> Does anyone else have experience with this sort of problems? Any obvious >>> mistakes? >>> >> >> Do you have any non-symmetries in your discretization? With the standard >> P_1 basis, that operator is symmetric. >> >> Matt >> >> >>> --Nico >>> >>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.scott at ed.ac.uk Fri May 3 09:16:16 2013 From: d.scott at ed.ac.uk (David Scott) Date: Fri, 03 May 2013 15:16:16 +0100 Subject: [petsc-users] Advice Being Sought In-Reply-To: References: <51752589.4080001@ed.ac.uk> <5175649C.6060906@ed.ac.uk> Message-ID: <5183C6B0.8020903@ed.ac.uk> Barry, On 23/04/2013 02:36, Barry Smith wrote: > > On Apr 22, 2013, at 11:26 AM, David Scott wrote: > >> Thanks for the suggestion. >> >> I had tried '-pc_type gamg -pc_gamg_agg_nsmooths 1' with an earlier version of the code without success. I have tried it again but I get NaN's after only 90 time steps whereas with block Jacobi it runs quite happily for 36,000 time steps and produces physically sensible results. > > David, > > We would be very interested in determining what is "going wrong" with the solver here since we hope to make it robust. Would it be possible for you to use a MatView() and VecView() on the matrix and the right hand side with a binary viewer when it "goes bad" and send us the resulting file? > > Barry > > We'd run the gamg solver on your matrix and track down what is happening. > > I have inserted call PetscViewerBinaryOpen(PETSC_COMM_WORLD, 'DifficultSystem', & FILE_MODE_WRITE, viewer, ierr) call KSPGetOperators(ksp, A, PETSC_NULL_OBJECT, & PETSC_NULL_INTEGER, ierr) call MatView(A, viewer, ierr) call KSPGetRhs(ksp, b, ierr) call VecView(b, viewer, ierr) call PetscViewerDestroy(viewer, ierr) in my code after KSPSolve. You may download the result for the 29th time step from http://www2.epcc.ed.ac.uk/~dscott6/ I can generate data for other time steps if necessary. David -- Dr. D. M. Scott Applications Consultant Edinburgh Parallel Computing Centre Tel. 0131 650 5921 The University of Edinburgh is a charitable body, registered in Scotland, with registration number SC005336. 
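[For reference: reading a dump like the one above back in and re-running a solver on it takes only a few lines of C. The sketch below is illustrative only -- it assumes the petsc-3.3/3.4-era API of this thread (where KSPSetOperators() still takes a MatStructure flag) and the file name 'DifficultSystem' from the snippet above; ierr/CHKERRQ error checking is omitted for brevity.]

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         b, x;
  KSP         ksp;
  PetscViewer viewer;

  PetscInitialize(&argc, &argv, NULL, NULL);
  /* Open the binary dump written with MatView()/VecView() above */
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "DifficultSystem", FILE_MODE_READ, &viewer);
  MatCreate(PETSC_COMM_WORLD, &A);
  MatLoad(A, viewer);                    /* objects load in the order they were written */
  VecCreate(PETSC_COMM_WORLD, &b);
  VecLoad(b, viewer);
  PetscViewerDestroy(&viewer);

  VecDuplicate(b, &x);
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);
  KSPSetFromOptions(ksp);                /* picks up -pc_type gamg etc. from the command line */
  KSPSolve(ksp, b, x);

  KSPDestroy(&ksp);
  VecDestroy(&x);
  VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}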
From d.scott at ed.ac.uk Fri May 3 09:40:25 2013
From: d.scott at ed.ac.uk (David Scott)
Date: Fri, 03 May 2013 15:40:25 +0100
Subject: Re: [petsc-users] Advice Being Sought
In-Reply-To: <5183C6B0.8020903@ed.ac.uk>
References: <51752589.4080001@ed.ac.uk> <5175649C.6060906@ed.ac.uk> <5183C6B0.8020903@ed.ac.uk>
Message-ID: <5183CC59.3020901@ed.ac.uk>

On 03/05/2013 15:16, David Scott wrote:
> I have inserted
> call PetscViewerBinaryOpen(PETSC_COMM_WORLD, 'DifficultSystem', &
> FILE_MODE_WRITE, viewer, ierr)
> call KSPGetOperators(ksp, A, PETSC_NULL_OBJECT, &
> PETSC_NULL_INTEGER, ierr)
> call MatView(A, viewer, ierr)
> call KSPGetRhs(ksp, b, ierr)
> call VecView(b, viewer, ierr)
> call PetscViewerDestroy(viewer, ierr)
> in my code after KSPSolve.
>
> You may download the result for the 29th time step from
> http://www2.epcc.ed.ac.uk/~dscott6/
>
> I can generate data for other time steps if necessary.
>
> David

I forgot to say that the program ran on 128 MPI processes with the following arguments
-pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_rtol 0.0001

David
-- 
Dr. D. M. Scott
Applications Consultant
Edinburgh Parallel Computing Centre
Tel. 0131 650 5921

The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.

From dharmareddy84 at gmail.com Fri May 3 10:11:00 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Fri, 3 May 2013 10:11:00 -0500
Subject: [petsc-users] SLepc configure with Arpack
Message-ID: 

Hello,
   I am trying to configure slepc with arpack. I installed arpack-ng-3.2 in my home directory: $HOME/arpack

I configure slepc as:

./configure --with-arpack-dir=$HOME/arpack/lib --with-arpack-flags='-lparpack -larpack'

The configure works and I could compile the code. However, the tests fail. I do not have $HOME/arpack/lib in the LD_LIBRARY_PATH. Tests pass only if I add the arpack lib path to LD_LIBRARY_PATH.

I do not want to do that; I want the configure to take care of the rpath.

I tried --with-arpack-flags="-Wl, rpath, $HOME/arpack/lib -L$HOME/arpack/lib -lparpack -larpack"

This does not work.

What should I do?

thanks
Reddy

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jroman at dsic.upv.es Fri May 3 10:33:35 2013
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Fri, 3 May 2013 17:33:35 +0200
Subject: Re: [petsc-users] SLepc configure with Arpack
In-Reply-To: 
References: 
Message-ID: <06A8BAC8-B228-4866-8697-FF2A3D8F2641@dsic.upv.es>

El 03/05/2013, a las 17:11, Dharmendar Reddy escribió:

> Hello,
> I am trying to configure slepc with arpack. I installed arpack-ng-3.2 in my home directory: $HOME/arpack
>
> I configure slepc as:
>
> ./configure --with-arpack-dir=$HOME/arpack/lib --with-arpack-flags='-lparpack -larpack'
>
> The configure works and I could compile the code. However, the tests fail.
> I do not have $HOME/arpack/lib in the LD_LIBRARY_PATH. Tests pass only if I add the arpack lib path to LD_LIBRARY_PATH.
>
> I do not want to do that; I want the configure to take care of the rpath.
>
> I tried --with-arpack-flags="-Wl, rpath, $HOME/arpack/lib -L$HOME/arpack/lib -lparpack -larpack"
>
> This does not work.
>
> What should I do?
>
> thanks
> Reddy
>

I think you have to add a dash before rpath and remove the blank before $HOME/arpack/lib. See examples here http://stackoverflow.com/questions/6562403/i-dont-understand-wl-rpath-wl

Jose

From gokhalen at gmail.com Fri May 3 11:31:30 2013
From: gokhalen at gmail.com (Nachiket Gokhale)
Date: Fri, 3 May 2013 12:31:30 -0400
Subject: [petsc-users] SLepc configure with Arpack
Message-ID: 

I can't claim to speak for Jose, but I had asked him this same question about a year ago, and IIRC he had said that the default algorithm in SlepC (KrylovSchur) is equivalent to or better than Arpack. Looking through my email, this is what I did to get arpack working (I never used it because I switched to KrylovSchur). Can't guarantee it will work for you -

> I was able to install ARPACK & link it with SLEPC on FC15 using gcc/g++/gfortran. The issues were -
> 1) I needed to change a function declaration in Arpack. This was holding up the linking. The function was etime in second.f
> 2) I needed to compile ARPACK with -fno-underscoring to make sure that the C code (SlepC) links with the Fortran code
> 3) I needed to compile additional object files by doing make all in the LAPACK/BLAS directory that came with ARPACK to make sure that the examples ran.
> 4) To link it with SlepC I needed to create position independent code using the -fPIC flag on the compiler
> 5) I needed to install PETSC and SlepC with OpenMPI to be able to use Parpack. I needed to set the LD_LIBRARY_PATH to the directory where the openmpi compilers were located so that SlepC could find them.

-Nachiket
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jroman at dsic.upv.es Fri May 3 12:06:08 2013
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Fri, 3 May 2013 19:06:08 +0200
Subject: Re: [petsc-users] SLepc configure with Arpack
In-Reply-To: 
References: <06A8BAC8-B228-4866-8697-FF2A3D8F2641@dsic.upv.es>
Message-ID: <60333B89-52A2-4931-ACA9-062AFCE556F4@dsic.upv.es>

El 03/05/2013, a las 17:53, Dharmendar Reddy escribió:

> Hello,
> this is the exact command I use:
>
> login4$ ./configure --with-arpack-flags="-Wl,-rpath,$HOME/LocalApps/lib -L$HOME/LocalApps/lib -lparpack -larpack"
> Checking environment...
> Checking PETSc installation...
> Checking ARPACK library...
> ERROR: Unable to link with library ARPACK
> ERROR: In directories /usr/local/lib
> ERROR: With flags -L/usr/local/lib -Wl -rpath /home1/00924/Reddy135/LocalApps/lib -L/home1/00924/Reddy135/LocalApps/lib -lparpack -larpack
>
> ERROR: See "mpi_rScalar_Debug/conf/configure.log" file for details
>
> configure step is removing the commas and adding -L/usr/local/lib

Yes, you are right. We do a very simple parsing of configure options and currently do not allow options containing commas. This is a bad solution and we will try to fix it for the next release.

Jose

From jroman at dsic.upv.es Fri May 3 12:23:09 2013
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Fri, 3 May 2013 19:23:09 +0200
Subject: Re: [petsc-users] SLepc configure with Arpack
In-Reply-To: 
References: <06A8BAC8-B228-4866-8697-FF2A3D8F2641@dsic.upv.es> <60333B89-52A2-4931-ACA9-062AFCE556F4@dsic.upv.es>
Message-ID: <1016070E-7414-4AA1-9A1D-90496713C1CD@dsic.upv.es>

El 03/05/2013, a las 19:14, Dharmendar Reddy escribió:

> Thanks. I can manage for now by adding the lib path to LD_LIBRARY_PATH.
>
> On a different note, how do I get help for slepc functions used in an executable.
>
> If I try -help with my code which also uses petsc, I see a lot of petsc-related options as well in the help.
> Is there an option like -eps_help?
>

./ex1 -help | grep eps

Jose

From mike.hui.zhang at hotmail.com Fri May 3 13:52:37 2013
From: mike.hui.zhang at hotmail.com (Hui Zhang)
Date: Fri, 3 May 2013 20:52:37 +0200
Subject: [petsc-users] Mat local size zero
Message-ID: 

Hello,

I'm implementing a coarse problem for a domain decomposition preconditioner. I use many processors to solve one subdomain and I want to pick one processor for each subdomain to solve the global coarse problem. So I need to set the local sizes of the coarse operator on the other processors to be zero. Is this a good idea? Thanks in advance!

From knepley at gmail.com Fri May 3 13:55:00 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 3 May 2013 13:55:00 -0500
Subject: Re: [petsc-users] Mat local size zero
In-Reply-To: 
References: 
Message-ID: 

On Fri, May 3, 2013 at 1:52 PM, Hui Zhang wrote:

> Hello,
>
> I'm implementing a coarse problem for a domain decomposition preconditioner.
> I use many processors to solve one subdomain and I want to pick one processor
> for each subdomain to solve the global coarse problem. So I need to set the
> local sizes of the coarse operator on the other processors to be zero. Is
> this a good idea? Thanks in advance!
>
This is fine. The only drawback would be that collective operations would still take place over the entire communicator. If this ever becomes a problem (in a land far far away), you can just call MatGetSubmatrix() for that process.

   Matt

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov Fri May 3 17:34:27 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Fri, 3 May 2013 17:34:27 -0500
Subject: Re: [petsc-users] Mat local size zero
In-Reply-To: 
References: 
Message-ID: 

On May 3, 2013, at 1:55 PM, Matthew Knepley wrote:

> On Fri, May 3, 2013 at 1:52 PM, Hui Zhang wrote:
> Hello,
>
> I'm implementing a coarse problem for a domain decomposition preconditioner.
> I use many processors to solve one subdomain and I want to pick one processor
> for each subdomain to solve the global coarse problem. So I need to set the
> local sizes of the coarse operator on the other processors to be zero. Is
> this a good idea? Thanks in advance!
>
> This is fine. The only drawback would be that collective operations would still take
> place over the entire communicator. If this ever becomes a problem (in a land
> far far away), you can just call MatGetSubmatrix() for that process.

   You can also use PCREDUNDANT (see the manual page); this manages everything for you: you create the coarse grid matrix across all the processes in your communicator in the normal way, and it manages getting that matrix down to a subset of processes (or 1 process), solving there, and getting the answer back up to all the processes. You likely really really should use this (since it makes the process trivial) compared to writing all the stuff yourself.

   Barry

>
> Matt
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener From bsmith at mcs.anl.gov Fri May 3 17:35:36 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 3 May 2013 17:35:36 -0500 Subject: [petsc-users] Advice Being Sought In-Reply-To: <5183C6B0.8020903@ed.ac.uk> References: <51752589.4080001@ed.ac.uk> <5175649C.6060906@ed.ac.uk> <5183C6B0.8020903@ed.ac.uk> Message-ID: <44696360-6B09-47F9-A6EF-7A66DEC86C57@mcs.anl.gov> Mark, Are you going to try this out? Thanks Barry On May 3, 2013, at 9:16 AM, David Scott wrote: > Barry, > > On 23/04/2013 02:36, Barry Smith wrote: >> >> On Apr 22, 2013, at 11:26 AM, David Scott wrote: >> >>> Thanks for the suggestion. >>> >>> I had tried '-pc_type gamg -pc_gamg_agg_nsmooths 1' with an earlier version of the code without success. I have tried it again but I get NaN's after only 90 time steps whereas with block Jacobi it runs quite happily for 36,000 time steps and produces physically sensible results. >> >> David, >> >> We would be very interested in determining what is "going wrong" with the solver here since we hope to make it robust. Would it be possible for you to use a MatView() and VecView() on the matrix and the right hand side with a binary viewer when it "goes bad" and send us the resulting file? >> >> Barry >> >> We'd run the gamg solver on your matrix and track down what is happening. >> >> > > I have inserted > call PetscViewerBinaryOpen(PETSC_COMM_WORLD, 'DifficultSystem', & > FILE_MODE_WRITE, viewer, ierr) > call KSPGetOperators(ksp, A, PETSC_NULL_OBJECT, & > PETSC_NULL_INTEGER, ierr) > call MatView(A, viewer, ierr) > call KSPGetRhs(ksp, b, ierr) > call VecView(b, viewer, ierr) > call PetscViewerDestroy(viewer, ierr) > in my code after KSPSolve. > > You may download the result for the 29th time step from > http://www2.epcc.ed.ac.uk/~dscott6/ > > I can generate data for other time steps if necessary. > > David > -- > Dr. D. M. Scott > Applications Consultant > Edinburgh Parallel Computing Centre > Tel. 0131 650 5921 > > The University of Edinburgh is a charitable body, registered in > Scotland, with registration number SC005336. From jedbrown at mcs.anl.gov Fri May 3 17:39:44 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 03 May 2013 17:39:44 -0500 Subject: [petsc-users] Advice Being Sought In-Reply-To: <44696360-6B09-47F9-A6EF-7A66DEC86C57@mcs.anl.gov> References: <51752589.4080001@ed.ac.uk> <5175649C.6060906@ed.ac.uk> <5183C6B0.8020903@ed.ac.uk> <44696360-6B09-47F9-A6EF-7A66DEC86C57@mcs.anl.gov> Message-ID: <87zjwbg027.fsf@mcs.anl.gov> Barry Smith writes: > Mark, > > Are you going to try this out? I started downloading it to cg.mcs, but it's going to take about an hour because the server appears to be on dial-up. From mark.adams at columbia.edu Fri May 3 19:52:17 2013 From: mark.adams at columbia.edu (Mark F. Adams) Date: Fri, 3 May 2013 19:52:17 -0500 Subject: [petsc-users] Advice Being Sought In-Reply-To: <44696360-6B09-47F9-A6EF-7A66DEC86C57@mcs.anl.gov> References: <51752589.4080001@ed.ac.uk> <5175649C.6060906@ed.ac.uk> <5183C6B0.8020903@ed.ac.uk> <44696360-6B09-47F9-A6EF-7A66DEC86C57@mcs.anl.gov> Message-ID: <2B0CE0EE-76C8-4411-9613-AC80395A2EA8@columbia.edu> On May 3, 2013, at 5:35 PM, Barry Smith wrote: > > Mark, > > Are you going to try this out? > Blowing up after some time steps is most likely (e.g., certainly) from the PC getting old. In particular the largest eigen values probably go up and the cheby smoother becomes unstable, among other things. 
This is a known bug and we have fixed it with: -pc_gamg_reuse_interpolation false This is a newish feature and will force the whole PC to be recomputed in each solve (this is the default behavior for ML and perhaps hypre). Mark > Thanks > > Barry > > On May 3, 2013, at 9:16 AM, David Scott wrote: > >> Barry, >> >> On 23/04/2013 02:36, Barry Smith wrote: >>> >>> On Apr 22, 2013, at 11:26 AM, David Scott wrote: >>> >>>> Thanks for the suggestion. >>>> >>>> I had tried '-pc_type gamg -pc_gamg_agg_nsmooths 1' with an earlier version of the code without success. I have tried it again but I get NaN's after only 90 time steps whereas with block Jacobi it runs quite happily for 36,000 time steps and produces physically sensible results. >>> >>> David, >>> >>> We would be very interested in determining what is "going wrong" with the solver here since we hope to make it robust. Would it be possible for you to use a MatView() and VecView() on the matrix and the right hand side with a binary viewer when it "goes bad" and send us the resulting file? >>> >>> Barry >>> >>> We'd run the gamg solver on your matrix and track down what is happening. >>> >>> >> >> I have inserted >> call PetscViewerBinaryOpen(PETSC_COMM_WORLD, 'DifficultSystem', & >> FILE_MODE_WRITE, viewer, ierr) >> call KSPGetOperators(ksp, A, PETSC_NULL_OBJECT, & >> PETSC_NULL_INTEGER, ierr) >> call MatView(A, viewer, ierr) >> call KSPGetRhs(ksp, b, ierr) >> call VecView(b, viewer, ierr) >> call PetscViewerDestroy(viewer, ierr) >> in my code after KSPSolve. >> >> You may download the result for the 29th time step from >> http://www2.epcc.ed.ac.uk/~dscott6/ >> >> I can generate data for other time steps if necessary. >> >> David >> -- >> Dr. D. M. Scott >> Applications Consultant >> Edinburgh Parallel Computing Centre >> Tel. 0131 650 5921 >> >> The University of Edinburgh is a charitable body, registered in >> Scotland, with registration number SC005336. > > From mark.adams at columbia.edu Fri May 3 20:01:09 2013 From: mark.adams at columbia.edu (Mark F. Adams) Date: Fri, 3 May 2013 20:01:09 -0500 Subject: [petsc-users] Advice Being Sought In-Reply-To: <44696360-6B09-47F9-A6EF-7A66DEC86C57@mcs.anl.gov> References: <51752589.4080001@ed.ac.uk> <5175649C.6060906@ed.ac.uk> <5183C6B0.8020903@ed.ac.uk> <44696360-6B09-47F9-A6EF-7A66DEC86C57@mcs.anl.gov> Message-ID: Also, another thing to try that should work even with older PETSc versions is: -mg_levels_ksp_chebyshev_estimate_eigenvalues 0,0.05,0,1.5 The last argument (1.5) is the largest eigenvalue safety factor (1.05 is the default). I would bet that increasing this will keep the code stable for longer. But the most robust thing to do is redo the PC every solve as I mentioned before. Mark On May 3, 2013, at 5:35 PM, Barry Smith wrote: > > Mark, > > Are you going to try this out? > > Thanks > > Barry > > On May 3, 2013, at 9:16 AM, David Scott wrote: > >> Barry, >> >> On 23/04/2013 02:36, Barry Smith wrote: >>> >>> On Apr 22, 2013, at 11:26 AM, David Scott wrote: >>> >>>> Thanks for the suggestion. >>>> >>>> I had tried '-pc_type gamg -pc_gamg_agg_nsmooths 1' with an earlier version of the code without success. I have tried it again but I get NaN's after only 90 time steps whereas with block Jacobi it runs quite happily for 36,000 time steps and produces physically sensible results. >>> >>> David, >>> >>> We would be very interested in determining what is "going wrong" with the solver here since we hope to make it robust. 
Would it be possible for you to use a MatView() and VecView() on the matrix and the right hand side with a binary viewer when it "goes bad" and send us the resulting file?
>>>
>>> Barry
>>>
>>> We'd run the gamg solver on your matrix and track down what is happening.
>>>
>>>
>>
>> I have inserted
>> call PetscViewerBinaryOpen(PETSC_COMM_WORLD, 'DifficultSystem', &
>> FILE_MODE_WRITE, viewer, ierr)
>> call KSPGetOperators(ksp, A, PETSC_NULL_OBJECT, &
>> PETSC_NULL_INTEGER, ierr)
>> call MatView(A, viewer, ierr)
>> call KSPGetRhs(ksp, b, ierr)
>> call VecView(b, viewer, ierr)
>> call PetscViewerDestroy(viewer, ierr)
>> in my code after KSPSolve.
>>
>> You may download the result for the 29th time step from
>> http://www2.epcc.ed.ac.uk/~dscott6/
>>
>> I can generate data for other time steps if necessary.
>>
>> David
>> --
>> Dr. D. M. Scott
>> Applications Consultant
>> Edinburgh Parallel Computing Centre
>> Tel. 0131 650 5921
>>
>> The University of Edinburgh is a charitable body, registered in
>> Scotland, with registration number SC005336.

From jedbrown at mcs.anl.gov Fri May 3 20:26:29 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Fri, 03 May 2013 20:26:29 -0500
Subject: Re: [petsc-users] Advice Being Sought
In-Reply-To: <5183CC59.3020901@ed.ac.uk>
References: <51752589.4080001@ed.ac.uk> <5175649C.6060906@ed.ac.uk> <5183C6B0.8020903@ed.ac.uk> <5183CC59.3020901@ed.ac.uk>
Message-ID: <87fvy3fsca.fsf@mcs.anl.gov>

David Scott writes:

> I forgot to say that the program ran on 128 MPI processes with the
> following arguments
> -pc_type gamg -pc_gamg_agg_nsmooths 1 -ksp_rtol 0.0001

Your problem is singular, as can be seen here:

-mg_coarse_pc_type svd -mg_coarse_pc_svd_monitor
SVD: condition number 4.795499711084e+15, 0 of 643 singular values are (nearly) zero
SVD: smallest singular values: 1.007601359796e-11 1.599999673167e+02 1.599999753343e+02 1.599999841127e+02 1.599999906466e+02
SVD: largest singular values : 4.527760228588e+04 4.558254786213e+04 4.648975485549e+04 4.809775098939e+04 4.831952029788e+04

Additionally, it appears that your right hand side is inconsistent. The null space is not the constant so I would indict the discretization becoming degenerate somewhere.

From dharmareddy84 at gmail.com Fri May 3 21:23:22 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Fri, 3 May 2013 21:23:22 -0500
Subject: [petsc-users] EPSSolve in a loop
Message-ID: 

Hello,
   I see an interesting behavior when I call EPSSolve in a loop. Can you help me figure out what's going on?

I have a setup like this to solve A x = lambda B x (generalized Hermitian problem):

type eigenSolver_t
  EPS :: eps
end type eigenSolver_t

The type has a bound procedure which calls EPSSetOperators and EPSSolve when eigenSolver%solve() is called.

Now I run a for loop:

do ic=1,111
  call eigenSolver(ic)%solve()
end do

I print the time for each solve. The operators (A,B) are (A1,B1) for ic=1 to 50, (A2,B2) for ic=51 to 80, and (A3,B3) for ic=81 to 111.

Now I see that the time per solve per ic is almost constant when I use -eps_type lapack. But for the default solver, the time per solve seems to increase with increasing ic. Have a look at the attached timing information.

Also, the time per solve using lapack is lower than with any of the iterative solvers I have tried. The problem size is about 100 x 100, and the operators are tri-diagonal.

Thanks
Reddy

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg.
160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: LapackTime Type: application/octet-stream Size: 5216 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: DefaultTime Type: application/octet-stream Size: 5216 bytes Desc: not available URL: From jedbrown at mcs.anl.gov Fri May 3 22:20:35 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 03 May 2013 22:20:35 -0500 Subject: [petsc-users] EPSSolve in a loop In-Reply-To: References: Message-ID: <87haije8ho.fsf@mcs.anl.gov> Dharmendar Reddy writes: > Hello, > I see an interesting behavior when i call EPSSolve in a loop. Can > you help me figure out whats going on ? > > I have a setup like this to solve a A x = lambda B x (Generalized > Hermitian problem) > > type eigenSolver_t > EPS :: eps > > end type eigenSolver_t > > the type has bound procedure which calls EPSSetOperators and EPSSolve when > eigenSolver%solve() is called > > Now i run a for loop > > do ic=1,111 > call eigenSolver(ic)%solve() > end do > > I print the time for each solve. The operators A,B =A1,B1 for ic=1 to 50 > A2,B2 for ic=51 to 80 and A3,B3 for ic=81 to 111 > > Now i see that time per solve per ic is almost constant when i use > -eps_type lapack. How do you figure? It looks like it increases by more than an order of magnitude. Always send -log_summary output when asking about performance. In this case, it would be nice to use a different stage to log each solve. > But for, defualt solver, time per solve seem to increase with increasing > ic. Have look at the attached timing information. > > Also, Time per solve using lapack is lower than any of the iterative > solvers i have tried. Problem size is about 100 x 100, operators are > tri-diagonal. I'm not surprised. That problem is tiny. From dharmareddy84 at gmail.com Sat May 4 04:31:49 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Sat, 4 May 2013 04:31:49 -0500 Subject: [petsc-users] EPSSolve in a loop In-Reply-To: <87haije8ho.fsf@mcs.anl.gov> References: <87haije8ho.fsf@mcs.anl.gov> Message-ID: On Fri, May 3, 2013 at 10:20 PM, Jed Brown wrote: > Dharmendar Reddy writes: > > > Hello, > > I see an interesting behavior when i call EPSSolve in a loop. > Can > > you help me figure out whats going on ? > > > > I have a setup like this to solve a A x = lambda B x (Generalized > > Hermitian problem) > > > > type eigenSolver_t > > EPS :: eps > > > > end type eigenSolver_t > > > > the type has bound procedure which calls EPSSetOperators and EPSSolve > when > > eigenSolver%solve() is called > > > > Now i run a for loop > > > > do ic=1,111 > > call eigenSolver(ic)%solve() > > end do > > > > I print the time for each solve. The operators A,B =A1,B1 for ic=1 to 50 > > A2,B2 for ic=51 to 80 and A3,B3 for ic=81 to 111 > > > > Now i see that time per solve per ic is almost constant when i use > > -eps_type lapack. > > How do you figure? It looks like it increases by more than an order of > magnitude. > > You are right, The time per solve increases in both case, see the attached plot. I have attached the stage wise log_summary . > Always send -log_summary output when asking about performance. In this > case, it would be nice to use a different stage to log each solve. 
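[A minimal sketch of what such per-solve logging stages can look like with the C API -- illustrative only; 'ic' stands for the loop counter from the code above, and the stage name mirrors the "Solve Step" labels that show up later in this thread:]

/* Sketch, assuming a PETSc program (petscsys.h) and a SLEPc solver (slepceps.h);
   error checking omitted for brevity. */
PetscLogStage stage;
char          name[64];

PetscSNPrintf(name, sizeof(name), "Solve Step : %d", ic);
PetscLogStageRegister(name, &stage);   /* creates a new stage for -log_summary */
PetscLogStagePush(stage);
EPSSolve(eps);                         /* all events logged inside land in this stage */
PetscLogStagePop();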
> > > But for, defualt solver, time per solve seem to increase with increasing > > ic. Have look at the attached timing information. > > > > Also, Time per solve using lapack is lower than any of the iterative > > solvers i have tried. Problem size is about 100 x 100, operators are > > tri-diagonal. > > I'm not surprised. That problem is tiny. > -- ----------------------------------------------------- Dharmendar Reddy Palle Graduate Student Microelectronics Research center, University of Texas at Austin, 10100 Burnet Road, Bldg. 160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: TimePerSolveEPS.tif Type: image/tiff Size: 51654 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: solvetimeDefault Type: application/octet-stream Size: 436043 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: solvetimeLapack Type: application/octet-stream Size: 225133 bytes Desc: not available URL: From jedbrown at mcs.anl.gov Sat May 4 09:54:57 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sat, 04 May 2013 09:54:57 -0500 Subject: [petsc-users] EPSSolve in a loop In-Reply-To: References: <87haije8ho.fsf@mcs.anl.gov> Message-ID: <871u9meqwu.fsf@mcs.anl.gov> Dharmendar Reddy writes: >> You are right, The time per solve increases in both case, see the > attached plot. > > I have attached the stage wise log_summary . Thanks. ########################################################## # # # WARNING!!! # # # # This code was compiled with a debugging option, # # To get timing results run ./configure # # using --with-debugging=no, the performance will # # be generally two or three times faster. # # # ########################################################## Here are some relevant events: --- Event Stage 1: Solve Step : 1 IPOrthogonalize 90 1.0 7.7698e-02 1.0 3.04e+06 1.0 0.0e+00 0.0e+00 5.4e+02 0 1 0 0 0 18 67 0 0 32 39 IPInnerProduct 896 1.0 5.1215e-02 1.0 1.60e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 12 35 0 0 0 31 VecMAXPBY 179 1.0 2.0418e-02 1.0 1.44e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 5 32 0 0 0 71 VecScale 89 1.0 9.6772e-03 1.0 7.92e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 2 0 0 0 0 1 VecReduceArith 448 1.0 3.6906e-02 1.0 1.48e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 9 32 0 0 0 40 --- Event Stage 111: Solve Step : 111 IPOrthogonalize 90 1.0 1.5872e+01 1.0 3.04e+06 1.0 0.0e+00 0.0e+00 5.4e+02 1 1 0 0 0 83 67 0 0 32 0 IPInnerProduct 896 1.0 9.5349e+00 1.0 1.60e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 50 35 0 0 0 0 VecMAXPBY 179 1.0 6.3249e+00 1.0 1.44e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 33 32 0 0 0 0 VecScale 89 1.0 3.1402e+00 1.0 7.92e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 16 0 0 0 0 0 VecReduceArith 448 1.0 9.5169e+00 1.0 1.48e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 49 32 0 0 0 0 VecScale is simple and does exactly the same amount of work in both cases, so I think the problem is in memory management. It doesn't look like you are using enough memory to be swapping and I don't think there are any CHKMEMQ statements in here. You could try running with '-malloc 0', and/or in optimized mode. It looks like you create one EPS per loop iteration, but don't destroy any of them until the end. 
If you don't need all the EPS at once, you should either reuse one throughout the loop or destroy the new one each step after using it. From dharmareddy84 at gmail.com Sat May 4 17:31:04 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Sat, 4 May 2013 17:31:04 -0500 Subject: [petsc-users] EPSSolve in a loop In-Reply-To: <871u9meqwu.fsf@mcs.anl.gov> References: <87haije8ho.fsf@mcs.anl.gov> <871u9meqwu.fsf@mcs.anl.gov> Message-ID: On Sat, May 4, 2013 at 9:54 AM, Jed Brown wrote: > Dharmendar Reddy writes: > > >> You are right, The time per solve increases in both case, see the > > attached plot. > > > > I have attached the stage wise log_summary . > > Thanks. > > ########################################################## > # # > # WARNING!!! # > # # > # This code was compiled with a debugging option, # > # To get timing results run ./configure # > # using --with-debugging=no, the performance will # > # be generally two or three times faster. # > # # > ########################################################## > > > Here are some relevant events: > > > --- Event Stage 1: Solve Step : 1 > > IPOrthogonalize 90 1.0 7.7698e-02 1.0 3.04e+06 1.0 0.0e+00 0.0e+00 > 5.4e+02 0 1 0 0 0 18 67 0 0 32 39 > IPInnerProduct 896 1.0 5.1215e-02 1.0 1.60e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 12 35 0 0 0 31 > VecMAXPBY 179 1.0 2.0418e-02 1.0 1.44e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 5 32 0 0 0 71 > VecScale 89 1.0 9.6772e-03 1.0 7.92e+03 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 2 0 0 0 0 1 > VecReduceArith 448 1.0 3.6906e-02 1.0 1.48e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 9 32 0 0 0 40 > > > --- Event Stage 111: Solve Step : 111 > > IPOrthogonalize 90 1.0 1.5872e+01 1.0 3.04e+06 1.0 0.0e+00 0.0e+00 > 5.4e+02 1 1 0 0 0 83 67 0 0 32 0 > IPInnerProduct 896 1.0 9.5349e+00 1.0 1.60e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 0 0 0 0 50 35 0 0 0 0 > VecMAXPBY 179 1.0 6.3249e+00 1.0 1.44e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 0 0 0 0 33 32 0 0 0 0 > VecScale 89 1.0 3.1402e+00 1.0 7.92e+03 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 16 0 0 0 0 0 > VecReduceArith 448 1.0 9.5169e+00 1.0 1.48e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 0 0 0 0 49 32 0 0 0 0 > > > VecScale is simple and does exactly the same amount of work in both > cases, so I think the problem is in memory management. It doesn't look > like you are using enough memory to be swapping and I don't think there > are any CHKMEMQ statements in here. You could try running with '-malloc > 0', and/or in optimized mode. > There is no CHKMEMQ statements. I will try running the optimized modes. Aren't the different eps objects independent ? All the objects are created only once. Every time i call solve, all i do is EPSSetOperators and EPSSolve. The code is designed to provide a eigensolver to the user, without worrying about the slepc and petsc objects inside, if i have to resue the eps objects for all solvers then i may have to share the eps between different instances of eigenSolver, i am not sure how easily i can do that. I am running the code on dual 8-core Xeon X5 with 32 GB memory. I hardly use 1 to 2 % of memory. What do you mean by memory management problem ? > It looks like you create one EPS per loop iteration, but don't destroy > any of them until the end. If you don't need all the EPS at once, you > should either reuse one throughout the loop or destroy the new one each > step after using it. 
I need to reuse the eps objects. There is an outer loop:

do jc=1,numSolve
   computeOperators (A1,B1,A2,B2,A3,B3)
   do ic=1,111
      eigenSolver(ic)%solve
   end do
   ! ComputeOtherQuantities based on eigenvectors
end do

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jedbrown at mcs.anl.gov Sat May 4 18:40:47 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Sat, 04 May 2013 18:40:47 -0500
Subject: Re: [petsc-users] EPSSolve in a loop
In-Reply-To: 
References: <87haije8ho.fsf@mcs.anl.gov> <871u9meqwu.fsf@mcs.anl.gov>
Message-ID: <87r4hmb9fk.fsf@mcs.anl.gov>

Dharmendar Reddy writes:

> There is no CHKMEMQ statements.

I meant inside the VecScale() implementation. It turns out that PetscStackCall(), which is used around all BLAS/Lapack functions, among others, currently includes CHKMEMQ. CHKMEMQ walks all allocations to check for memory corruption. It can be rather time consuming when many objects (or other small allocations) occur. If you run with '-malloc 0', the performance problem will go away. When PETSc is configured --with-debugging=0, CHKMEMQ does nothing unless you pass '-malloc' (so it'll be fast by default in optimized mode).

petsc-dev, is this performance degradation of multiple orders of magnitude really tolerable? Should we turn off malloc checking around BLAS/Lapack functions unless the user passes an extra flag (but leave sentinel checking around user functions because that's where almost all the memory bugs are)?
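[A minimal sketch of the reuse pattern suggested above, in C for brevity -- illustrative only; the A[]/B[] arrays and the SolveAll name are placeholders standing in for the (A1,B1)/(A2,B2)/(A3,B3) operators of the outer loop, and error checking is omitted:]

#include <slepceps.h>

PetscErrorCode SolveAll(Mat A[], Mat B[], PetscInt n)
{
  EPS      eps;
  PetscInt ic;

  EPSCreate(PETSC_COMM_WORLD, &eps);
  EPSSetProblemType(eps, EPS_GHEP);      /* generalized Hermitian, as in this thread */
  EPSSetFromOptions(eps);
  for (ic = 0; ic < n; ic++) {
    EPSSetOperators(eps, A[ic], B[ic]);  /* same EPS object, new operators each pass */
    EPSSolve(eps);
    /* harvest results here with EPSGetConverged()/EPSGetEigenpair() */
  }
  EPSDestroy(&eps);                      /* a single destroy at the very end */
  return 0;
}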
> > I first see: > > =============================================================================== > > CMake setup incomplete (status 256), falling back to legacy > build > > =============================================================================== > > then make all gives an error, look at the attached logs > > -- > ----------------------------------------------------- > Dharmendar Reddy Palle > Graduate Student > Microelectronics Research center, > University of Texas at Austin, > 10100 Burnet Road, Bldg. 160 > MER 2.608F, TX 78758-4445 > e-mail: dharmareddy84 at gmail.com > Phone: +1-512-350-9082 > United States of America. > Homepage: https://webspace.utexas.edu/~dpr342 > -- ----------------------------------------------------- Dharmendar Reddy Palle Graduate Student Microelectronics Research center, University of Texas at Austin, 10100 Burnet Road, Bldg. 160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Sun May 5 01:06:29 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sun, 05 May 2013 01:06:29 -0500 Subject: [petsc-users] slepc build fails.. In-Reply-To: References: Message-ID: <874neiarkq.fsf@mcs.anl.gov> Dharmendar Reddy writes: > Hello, > I am getting an error when i try to build slepc. > > I first see: > > =============================================================================== > > CMake setup incomplete (status 256), falling back to legacy > build > =============================================================================== This problem, which is definitely a CMake bug (but they seem to have no plans to fix), is usually related to NOTFOUND cache entries: //Path to a library. -LARPACKLIB:FILEPATH=-LARPACKLIB-NOTFOUND This is probably constructed by joining an empty variable. Jose may know where in the SLEPc CMake code to look for this occurring. > then make all gives an error, look at the attached logs Does 'make test' work? What about after 'make allfortranstubs all'? From jedbrown at mcs.anl.gov Sun May 5 01:09:56 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sun, 05 May 2013 01:09:56 -0500 Subject: [petsc-users] slepc build fails.. In-Reply-To: References: Message-ID: <871u9marez.fsf@mcs.anl.gov> Dharmendar Reddy writes: > My petsc installation is at commit: e9aac40 , branch next. > > > If i have to tell which petsc version i am using when i send a query or > post, what is the git command ? Branch and commit SHA1 is fine. You can get the commit from 'git rev-parse HEAD' or from the first line in 'git show' or 'git log'. From dharmareddy84 at gmail.com Sun May 5 02:30:03 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Sun, 5 May 2013 02:30:03 -0500 Subject: [petsc-users] slepc build fails.. In-Reply-To: <874neiarkq.fsf@mcs.anl.gov> References: <874neiarkq.fsf@mcs.anl.gov> Message-ID: On Sun, May 5, 2013 at 1:06 AM, Jed Brown wrote: > Dharmendar Reddy writes: > > > Hello, > > I am getting an error when i try to build slepc. 
> > > > I first see: > > > > > =============================================================================== > > > > CMake setup incomplete (status 256), falling back to legacy > > build > > > =============================================================================== > > This problem, which is definitely a CMake bug (but they seem to have no > plans to fix), is usually related to NOTFOUND cache entries: > > //Path to a library. > -LARPACKLIB:FILEPATH=-LARPACKLIB-NOTFOUND > > This is probably constructed by joining an empty variable. Jose may > know where in the SLEPc CMake code to look for this occurring. > > > then make all gives an error, look at the attached logs > > Does 'make test' work? What about after 'make allfortranstubs all'? > Make test works. login4$ make SLEPC_DIR=$PWD PETSC_DIR=/home1/00924/Reddy135/LocalApps/petsc PETSC_ARCH=mp i_rScalar_Opt test Running test examples to verify correct installation Using SLEPC_DIR=/home1/00924/Reddy135/LocalApps/slepc, PETSC_DIR=/home1/00924/Reddy135/Loc alApps/petsc and PETSC_ARCH=mpi_rScalar_Opt C/C++ example src/eps/examples/tests/test10 run successfully with 1 MPI process C/C++ example src/eps/examples/tests/test10 run successfully with 2 MPI process Fortran example src/eps/examples/tests/test7f run successfully with 1 MPI process Completed test examples But make allfortranstubs all fails.. grep: ftn-auto/makefile: No such file or directory grep: ftn-auto/makefile: No such file or directory grep: ftn-auto/makefile: No such file or directory grep: ftn-auto/makefile: No such file or directory grep: ftn-auto/makefile: No such file or directory grep: ftn-auto/makefile: No such file or directory libfast in: /home1/00924/Reddy135/LocalApps/slepc/include/finclude/ftn-auto make[5]: *** No rule to make target `slepc_tree'. Stop. libfast in: /home1/00924/Reddy135/LocalApps/slepc/include/finclude/ftn-custom libfast in: /home1/00924/Reddy135/LocalApps/slepc/include/slepc-private libfast in: /home1/00924/Reddy135/LocalApps/slepc/docs Completed building libraries ========================================= making shared libraries in /home1/00924/Reddy135/LocalApps/slepc/mpi_rScalar_Opt/lib building libslepc.so ========================================= *******************************ERROR************************************ Error during compile, check mpi_rScalar_Opt/conf/make.log Send all contents of mpi_rScalar_Opt/conf to slepc-maint at grycap.upv.es ************************************************************************ Now, i am wondering why i get these errors now ? i had no issues with slepc until i did git pull on petsc today. petsc does compile and pass tests. ? I am using slepc-dev -rev 3324 Thanks Reddy -- ----------------------------------------------------- Dharmendar Reddy Palle Graduate Student Microelectronics Research center, University of Texas at Austin, 10100 Burnet Road, Bldg. 160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Mon May 6 08:28:59 2013 From: jroman at dsic.upv.es (Jose E. Roman) Date: Mon, 6 May 2013 15:28:59 +0200 Subject: [petsc-users] slepc build fails.. 
In-Reply-To: References: <874neiarkq.fsf@mcs.anl.gov> Message-ID: <96329383-3FA2-490F-8CB5-2EA979F174A3@dsic.upv.es> El 05/05/2013, a las 09:30, Dharmendar Reddy escribi?: > On Sun, May 5, 2013 at 1:06 AM, Jed Brown wrote: > Dharmendar Reddy writes: > > > Hello, > > I am getting an error when i try to build slepc. > > > > I first see: > > > > =============================================================================== > > > > CMake setup incomplete (status 256), falling back to legacy > > build > > =============================================================================== > > This problem, which is definitely a CMake bug (but they seem to have no > plans to fix), is usually related to NOTFOUND cache entries: > > //Path to a library. > -LARPACKLIB:FILEPATH=-LARPACKLIB-NOTFOUND > > This is probably constructed by joining an empty variable. Jose may > know where in the SLEPc CMake code to look for this occurring. > > > then make all gives an error, look at the attached logs > > Does 'make test' work? What about after 'make allfortranstubs all'? > > Make test works. > login4$ make SLEPC_DIR=$PWD PETSC_DIR=/home1/00924/Reddy135/LocalApps/petsc PETSC_ARCH=mp > i_rScalar_Opt test > Running test examples to verify correct installation > Using SLEPC_DIR=/home1/00924/Reddy135/LocalApps/slepc, PETSC_DIR=/home1/00924/Reddy135/Loc > alApps/petsc and PETSC_ARCH=mpi_rScalar_Opt > C/C++ example src/eps/examples/tests/test10 run successfully with 1 MPI process > C/C++ example src/eps/examples/tests/test10 run successfully with 2 MPI process > Fortran example src/eps/examples/tests/test7f run successfully with 1 MPI process > Completed test examples > > But > make allfortranstubs all > fails.. > grep: ftn-auto/makefile: No such file or directory > grep: ftn-auto/makefile: No such file or directory > grep: ftn-auto/makefile: No such file or directory > grep: ftn-auto/makefile: No such file or directory > grep: ftn-auto/makefile: No such file or directory > grep: ftn-auto/makefile: No such file or directory > libfast in: /home1/00924/Reddy135/LocalApps/slepc/include/finclude/ftn-auto > make[5]: *** No rule to make target `slepc_tree'. Stop. > libfast in: /home1/00924/Reddy135/LocalApps/slepc/include/finclude/ftn-custom > libfast in: /home1/00924/Reddy135/LocalApps/slepc/include/slepc-private > libfast in: /home1/00924/Reddy135/LocalApps/slepc/docs > Completed building libraries > ========================================= > making shared libraries in /home1/00924/Reddy135/LocalApps/slepc/mpi_rScalar_Opt/lib > building libslepc.so > ========================================= > *******************************ERROR************************************ > Error during compile, check mpi_rScalar_Opt/conf/make.log > Send all contents of mpi_rScalar_Opt/conf to slepc-maint at grycap.upv.es > ************************************************************************ > > Now, i am wondering why i get these errors now ? i had no issues with slepc until i did git pull on petsc today. petsc does compile and pass tests. ? > I am using slepc-dev -rev 3324 > > Thanks > Reddy You should be using petsc-master since slepc-dev is in sync with petsc-master, not petsc-next. 
Anyway, I think the problem is the arpack-flags option, it should be like this (with a comma, as in the example of the manual): $ ./configure --with-arpack-dir=/home1/00924/Reddy135/LocalApps/arpack/lib --with-arpack-flags=-lparpack,-larpack Jose From dharmareddy84 at gmail.com Mon May 6 09:49:43 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Mon, 6 May 2013 09:49:43 -0500 Subject: [petsc-users] slepc build fails.. In-Reply-To: <96329383-3FA2-490F-8CB5-2EA979F174A3@dsic.upv.es> References: <874neiarkq.fsf@mcs.anl.gov> <96329383-3FA2-490F-8CB5-2EA979F174A3@dsic.upv.es> Message-ID: On Mon, May 6, 2013 at 8:28 AM, Jose E. Roman wrote: > > El 05/05/2013, a las 09:30, Dharmendar Reddy escribi?: > > > On Sun, May 5, 2013 at 1:06 AM, Jed Brown wrote: > > Dharmendar Reddy writes: > > > > > Hello, > > > I am getting an error when i try to build slepc. > > > > > > I first see: > > > > > > > =============================================================================== > > > > > > CMake setup incomplete (status 256), falling back to legacy > > > build > > > > =============================================================================== > > > > This problem, which is definitely a CMake bug (but they seem to have no > > plans to fix), is usually related to NOTFOUND cache entries: > > > > //Path to a library. > > -LARPACKLIB:FILEPATH=-LARPACKLIB-NOTFOUND > > > > This is probably constructed by joining an empty variable. Jose may > > know where in the SLEPc CMake code to look for this occurring. > > > > > then make all gives an error, look at the attached logs > > > > Does 'make test' work? What about after 'make allfortranstubs all'? > > > > Make test works. > > login4$ make SLEPC_DIR=$PWD > PETSC_DIR=/home1/00924/Reddy135/LocalApps/petsc PETSC_ARCH=mp > > i_rScalar_Opt test > > Running test examples to verify correct installation > > Using SLEPC_DIR=/home1/00924/Reddy135/LocalApps/slepc, > PETSC_DIR=/home1/00924/Reddy135/Loc > > alApps/petsc and PETSC_ARCH=mpi_rScalar_Opt > > C/C++ example src/eps/examples/tests/test10 run successfully with 1 MPI > process > > C/C++ example src/eps/examples/tests/test10 run successfully with 2 MPI > process > > Fortran example src/eps/examples/tests/test7f run successfully with 1 > MPI process > > Completed test examples > > > > But > > make allfortranstubs all > > fails.. > > grep: ftn-auto/makefile: No such file or directory > > grep: ftn-auto/makefile: No such file or directory > > grep: ftn-auto/makefile: No such file or directory > > grep: ftn-auto/makefile: No such file or directory > > grep: ftn-auto/makefile: No such file or directory > > grep: ftn-auto/makefile: No such file or directory > > libfast in: > /home1/00924/Reddy135/LocalApps/slepc/include/finclude/ftn-auto > > make[5]: *** No rule to make target `slepc_tree'. Stop. 
> > libfast in:
> /home1/00924/Reddy135/LocalApps/slepc/include/finclude/ftn-custom
> > libfast in: /home1/00924/Reddy135/LocalApps/slepc/include/slepc-private
> > libfast in: /home1/00924/Reddy135/LocalApps/slepc/docs
> > Completed building libraries
> > =========================================
> > making shared libraries in
> /home1/00924/Reddy135/LocalApps/slepc/mpi_rScalar_Opt/lib
> > building libslepc.so
> > =========================================
> > *******************************ERROR************************************
> > Error during compile, check mpi_rScalar_Opt/conf/make.log
> > Send all contents of mpi_rScalar_Opt/conf to slepc-maint at grycap.upv.es
> > ************************************************************************
> >
> > Now, i am wondering why i get these errors now ? i had no issues with
> slepc until i did git pull on petsc today. petsc does compile and pass
> tests. ?
> > I am using slepc-dev -rev 3324
> >
> > Thanks
> > Reddy

> You should be using petsc-master since slepc-dev is in sync with
> petsc-master, not petsc-next.
> Anyway, I think the problem is the arpack-flags option, it should be like
> this (with a comma, as in the example of the manual):
>
> $ ./configure --with-arpack-dir=/home1/00924/Reddy135/LocalApps/arpack/lib
> --with-arpack-flags=-lparpack,-larpack
>

Thanks. I forgot the comma. That explains, why it failed this time. I did
build slepc successfully with arpack many times earlier.

Jose
>
>
--
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From dharmareddy84 at gmail.com  Mon May  6 12:37:38 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Mon, 6 May 2013 12:37:38 -0500
Subject: [petsc-users] petsc binary vtk viewer
Message-ID:

Hello,
          I am seeing nan's at boundary locations for a plot over line data
on a 2D mesh. There are no nan's in the data obtained from solver.

For example, When i click plot over line along y-axis in paraview, it
selects the y-coordiantes to be from

-21.141444145 to 26.426805182 , The boundary Node at the positive end is
showing a nan. If i change the end node to 26.426804 the nan goes away
and i get the expected value (which is zero, as per imposed boundary
condition).

I have not seen this before with the ascii vtk writer in my fortran code.

I am wondering if there is something i should be aware of petsc binary vtk
viewer. I run the code on a 64 bit linux cluster. I am visualizing the vtk
file on a 32 bit paraview 3.10.1 on windows xp.

Thanks
Reddy

--
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
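A note for readers following the thread: PETSc's binary VTK output is
produced by viewing a DM-attached vector through a VTK viewer, roughly as
in the C sketch below. Nothing in the sketch is taken from the code under
discussion; the DMDA grid stands in for the real 2D mesh, the constant
field for the real solver output, and the file name sol.vts is a
placeholder (PetscViewerVTKOpen is assumed available, as in petsc-dev of
this period). One behavior that matters when debugging such files:
VecView() only queues the vector, and the file is actually written when
the viewer is flushed or destroyed.

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM             da;
  Vec            x;
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* placeholder structured grid standing in for the real 2D mesh */
  ierr = DMDACreate2d(PETSC_COMM_WORLD, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, 8, 8, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMCreateGlobalVector(da, &x);CHKERRQ(ierr);
  ierr = VecSet(x, 1.0);CHKERRQ(ierr);              /* stand-in for real data */
  ierr = PetscViewerVTKOpen(PETSC_COMM_WORLD, "sol.vts", FILE_MODE_WRITE,
                            &viewer);CHKERRQ(ierr);
  ierr = VecView(x, viewer);CHKERRQ(ierr);          /* only queues the Vec */
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); /* file is written here */
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}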
URL: From jedbrown at mcs.anl.gov Mon May 6 12:41:11 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 06 May 2013 11:41:11 -0600 Subject: [petsc-users] petsc binary vtk viewer In-Reply-To: References: Message-ID: <87ip2wvweg.fsf@mcs.anl.gov> Dharmendar Reddy writes: > Hello, > I am seeing nan's at boundary locations for a plot over line data > on a 2D mesh. There are no nan's in the data obtained from solver. > > For example, When i click plot over line along y-axis in paraview, it > selects the y-coordiantes to be from > > -21.141444145 to 26.426805182 , The boundary Node at the positive end is > showing a nan. If i change the end node to 26.426804 the nan goes away > and i get the expected value (which is zero, as per imposed boundary > condition). > > I have not seen this before with the ascii vtk writer in my fortran code. > > I am wondering if there is something i should be aware of petsc binary vtk > viewer. I run the code on a 64 bit linux cluster. I am visualizing the vtk > file on a 32 bit paraview 3.10.1 on windows xp. 32/64-bit values shouldn't matter, but it's possible there is a bug with 64-bit-indices (I don't remember testing it). Is the writing code valgrind-clean? Is it reproducible? From dharmareddy84 at gmail.com Mon May 6 13:15:45 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Mon, 6 May 2013 13:15:45 -0500 Subject: [petsc-users] petsc binary vtk viewer In-Reply-To: <87ip2wvweg.fsf@mcs.anl.gov> References: <87ip2wvweg.fsf@mcs.anl.gov> Message-ID: On Mon, May 6, 2013 at 12:41 PM, Jed Brown wrote: > Dharmendar Reddy writes: > > > Hello, > > I am seeing nan's at boundary locations for a plot over line > data > > on a 2D mesh. There are no nan's in the data obtained from solver. > > > > For example, When i click plot over line along y-axis in paraview, it > > selects the y-coordiantes to be from > > > > -21.141444145 to 26.426805182 , The boundary Node at the positive end is > > showing a nan. If i change the end node to 26.426804 the nan goes away > > and i get the expected value (which is zero, as per imposed boundary > > condition). > > > > I have not seen this before with the ascii vtk writer in my fortran code. > > > > I am wondering if there is something i should be aware of petsc binary > vtk > > viewer. I run the code on a 64 bit linux cluster. I am visualizing the > vtk > > file on a 32 bit paraview 3.10.1 on windows xp. > > 32/64-bit values shouldn't matter, but it's possible there is a bug with > 64-bit-indices (I don't remember testing it). Is the writing code > valgrind-clean? Is it reproducible? > Hello, I ran the code a few times now, I get the same error in paraview. I can see that there are no nan's in the data when i look at spreadsheet view of the 2D. But nan's show up in plot over line. Also, The following is the final output for valgrind. My code is in fortran, based on valgrind, i think are no leaks in the program. 
==18978== ==18978== HEAP SUMMARY: ==18978== in use at exit: 0 bytes in 0 blocks ==18978== total heap usage: 0 allocs, 0 frees, 0 bytes allocated ==18978== ==18978== All heap blocks were freed -- no leaks are possible ==18978== ==18978== For counts of detected and suppressed errors, rerun with: -v ==18978== Use --track-origins=yes to see where uninitialised values come from ==18978== ERROR SUMMARY: 24076 errors from 1000 contexts (suppressed: 14 from 9) Profiling timer expired -- ----------------------------------------------------- Dharmendar Reddy Palle Graduate Student Microelectronics Research center, University of Texas at Austin, 10100 Burnet Road, Bldg. 160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... URL: From shchen at www.phys.lsu.edu Mon May 6 16:00:53 2013 From: shchen at www.phys.lsu.edu (Shaohao Chen) Date: Mon, 6 May 2013 16:00:53 -0500 Subject: [petsc-users] question on MatSubMatrixUpdate Message-ID: <20130506204614.M79957@physics.lsu.edu> Dear managers and users, I got a problem when using PETSc to make parallel codes. Could you please help me out? Basically, I need to do is to update the values of some blocks of a large matrix (not the whole matrix), and I need to do it many times in a big loop. These blocks could be assembled to the same or different processors. If using "MatSetValues", it would spend much time for the data transfer between different processors. I expect "MatSubMatrixUpdate" could do the job better. Am I correct? But I can not find an example of using "MatSubMatrixUpdate" on the website. It is said that the "MatSubMatrixUpdate" is only in a "developer" level and the users should use some other functions to replace it. What other functions should I use? Thank you for your attention! -- Shaohao Chen Department of Physics & Astronomy, Louisiana State University, Baton Rouge, LA From knepley at gmail.com Mon May 6 16:17:29 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 6 May 2013 16:17:29 -0500 Subject: [petsc-users] question on MatSubMatrixUpdate In-Reply-To: <20130506204614.M79957@physics.lsu.edu> References: <20130506204614.M79957@physics.lsu.edu> Message-ID: On Mon, May 6, 2013 at 4:00 PM, Shaohao Chen wrote: > Dear managers and users, > > I got a problem when using PETSc to make parallel codes. Could you please > help me out? > > Basically, I need to do is to update the values of some blocks of a large > matrix (not the whole matrix), > and I need to do it many times in a big loop. These blocks could be > assembled to the same or > different processors. If using "MatSetValues", it would spend much time > for the data transfer between > Are you guessing, or have you measured this? > different processors. I expect "MatSubMatrixUpdate" could do the job > better. Am I correct? But I can > Almost certainly not. It cannot do less communication than MatSetValues(), its just easier sometimes. Matt > not find an example of using "MatSubMatrixUpdate" on the website. It is > said that the > "MatSubMatrixUpdate" is only in a "developer" level and the users should > use some other functions to > replace it. What other functions should I use? > > Thank you for your attention! 
> > -- > Shaohao Chen > Department of Physics & Astronomy, > Louisiana State University, > Baton Rouge, LA > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Mon May 6 17:23:20 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 06 May 2013 16:23:20 -0600 Subject: [petsc-users] petsc binary vtk viewer In-Reply-To: References: <87ip2wvweg.fsf@mcs.anl.gov> Message-ID: <87ppx3vjc7.fsf@mcs.anl.gov> Dharmendar Reddy writes: >> 32/64-bit values shouldn't matter, but it's possible there is a bug with >> 64-bit-indices (I don't remember testing it). Is the writing code >> valgrind-clean? Is it reproducible? >> > > Hello, I ran the code a few times now, I get the same error in paraview. > I can see that there are no nan's in the data when i look at spreadsheet > view of the 2D. But nan's show up in plot over line. What should I do? If you think there is a problem in PETSc code, I'm afraid I'll need some way to reproduce (test case preferred) or much more specific information about how it occurs. > Also, The following is the final output for valgrind. My code is in > fortran, based on valgrind, i think are no leaks in the program. > > ==18978== > ==18978== HEAP SUMMARY: > ==18978== in use at exit: 0 bytes in 0 blocks > ==18978== total heap usage: 0 allocs, 0 frees, 0 bytes allocated > ==18978== > ==18978== All heap blocks were freed -- no leaks are possible > ==18978== > ==18978== For counts of detected and suppressed errors, rerun with: -v > ==18978== Use --track-origins=yes to see where uninitialised values come > from > ==18978== ERROR SUMMARY: 24076 errors from 1000 contexts (suppressed: 14 > from 9) > Profiling timer expired From jedbrown at mcs.anl.gov Mon May 6 17:33:09 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 06 May 2013 16:33:09 -0600 Subject: [petsc-users] question on MatSubMatrixUpdate In-Reply-To: <20130506204614.M79957@physics.lsu.edu> References: <20130506204614.M79957@physics.lsu.edu> Message-ID: <87mws7vivu.fsf@mcs.anl.gov> Shaohao Chen writes: >I expect "MatSubMatrixUpdate" could do the job better. Am I correct? But I can > not find an example of using "MatSubMatrixUpdate" on the website. It is said that the > "MatSubMatrixUpdate" is only in a "developer" level and the users should use some other functions to > replace it. You misunderstand MatSubMatrix. It is a type that represents a submatrix by applying the full global matrix and restricting the result. It's mostly interesting when the original matrix is MatShell, MatMFFD, or some other structured operator that cannot be taken apart. From dharmareddy84 at gmail.com Mon May 6 18:06:50 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Mon, 6 May 2013 18:06:50 -0500 Subject: [petsc-users] petsc binary vtk viewer In-Reply-To: <87ppx3vjc7.fsf@mcs.anl.gov> References: <87ip2wvweg.fsf@mcs.anl.gov> <87ppx3vjc7.fsf@mcs.anl.gov> Message-ID: On Mon, May 6, 2013 at 5:23 PM, Jed Brown wrote: > Dharmendar Reddy writes: > > >> 32/64-bit values shouldn't matter, but it's possible there is a bug with > >> 64-bit-indices (I don't remember testing it). Is the writing code > >> valgrind-clean? Is it reproducible? > >> > > > > Hello, I ran the code a few times now, I get the same error in > paraview. 
> > I can see that there are no nan's in the data when i look at spreadsheet > > view of the 2D. But nan's show up in plot over line. > > What should I do? If you think there is a problem in PETSc code, I'm > afraid I'll need some way to reproduce (test case preferred) or much > more specific information about how it occurs. > > Ahh Jed, I am not sure, and i do not think, if its a petsc related problem. I will try to see why it happens. I was just looking for pointers on possible reason. If i can come up with a simple reproducible test case, i will post it. Thanks Reddy > > Also, The following is the final output for valgrind. My code is in > > fortran, based on valgrind, i think are no leaks in the program. > > > > ==18978== > > ==18978== HEAP SUMMARY: > > ==18978== in use at exit: 0 bytes in 0 blocks > > ==18978== total heap usage: 0 allocs, 0 frees, 0 bytes allocated > > ==18978== > > ==18978== All heap blocks were freed -- no leaks are possible > > ==18978== > > ==18978== For counts of detected and suppressed errors, rerun with: -v > > ==18978== Use --track-origins=yes to see where uninitialised values come > > from > > ==18978== ERROR SUMMARY: 24076 errors from 1000 contexts (suppressed: 14 > > from 9) > > Profiling timer expired > > -- ----------------------------------------------------- Dharmendar Reddy Palle Graduate Student Microelectronics Research center, University of Texas at Austin, 10100 Burnet Road, Bldg. 160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... URL: From frtr at fysik.dtu.dk Tue May 7 04:49:11 2013 From: frtr at fysik.dtu.dk (Frederik Treue) Date: Tue, 7 May 2013 11:49:11 +0200 Subject: [petsc-users] pctype comparison Message-ID: <1367920151.23563.17.camel@frtr-laptop> Hi, is there any way of checking which type of PC you are using in the code? Ie. I want to do something like: ... PCType mytype; PC pc; if (PhaseOfTheMoon==waxing) { mytype=PCJacobi; } else { mytype=PCMG; } ... ierr=PCSetType(pc,mytype);CHKERRQ(ierr); if (mytype==PCMG) { ierr=PCMGSetLevels(pc,levels,PETSC_NULL);CHKERRQ(ierr); ... } ... But since mytype is apparently a pointer type, this doesn't work. /Frederik Treue From knepley at gmail.com Tue May 7 06:42:09 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 7 May 2013 06:42:09 -0500 Subject: [petsc-users] pctype comparison In-Reply-To: <1367920151.23563.17.camel@frtr-laptop> References: <1367920151.23563.17.camel@frtr-laptop> Message-ID: On Tue, May 7, 2013 at 4:49 AM, Frederik Treue wrote: > Hi, > > is there any way of checking which type of PC you are using in the code? > Ie. I want to do something like: > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscObjectTypeCompare.html Matt > ... > PCType mytype; > PC pc; > if (PhaseOfTheMoon==waxing) { > mytype=PCJacobi; > } else { > mytype=PCMG; > } > ... > ierr=PCSetType(pc,mytype);CHKERRQ(ierr); > if (mytype==PCMG) { > ierr=PCMGSetLevels(pc,levels,PETSC_NULL);CHKERRQ(ierr); > ... > } > ... > > But since mytype is apparently a pointer type, this doesn't work. > > /Frederik Treue > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
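To make the linked manual page concrete, here is a minimal sketch of the
check in C. It reuses pc, levels, and ierr from Frederik's snippet above;
only the PetscBool flag is new. PetscObjectTypeCompare() matches the
object's runtime type string, which is what a pointer comparison of PCType
values cannot do reliably.

PetscBool isMG;

ierr = PetscObjectTypeCompare((PetscObject)pc, PCMG, &isMG);CHKERRQ(ierr);
if (isMG) {
  /* reached only when the PC's type string matches "mg" */
  ierr = PCMGSetLevels(pc, levels, PETSC_NULL);CHKERRQ(ierr);
}

The comparison has to run after PCSetType() or PCSetFromOptions(), since
before that the PC has no type string to compare against.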
URL: From Thomas.Ponweiser at risc-software.at Tue May 7 07:59:44 2013 From: Thomas.Ponweiser at risc-software.at (Thomas Ponweiser) Date: Tue, 7 May 2013 12:59:44 +0000 Subject: [petsc-users] How to assemble a sparse SPD matrix in parallel Message-ID: Dear PETSc community! I would like to read in a (large) sparse SPD matrix from a file in parallel. More precisely my plan was to do the following: 1) Read matrix size N from file. 2) Create PETSc matrix. 3) Set option MAT_SPD=PETSC_TRUE. 4) Set global size N x N, local sizes PETSC_DECIDE. 5) Read in only those rows from file, which are owned by the local process. 6) Preallocate the matrix using statistics collected in the previous step. 7) Insert the values read into the matrix row-by-row. 8) Begin and finish matrix assembly. My problem is in step 5, leading to 3 questions: QUESTION 1: How can I let PETSc decide, which rows of the global matrix will be local to the process BEFORE prealloction? In the manual pages I have found so far: A) MatGetOwnershipRange(): "requires that the matrix be preallocated". B) MatGetOwnershipRanges(): "Not collective, unless matrix has not been allocated, then collective on Mat" However, when running the program, I get the error message: "Must call MatXXXSetPreallocation() or MatSetUp() ... before MatGetOwnershipRanges()!" QUESTION 2: Is the documentation of MatGetOwnershipRanges() incorrect or am I misinterpreting it? -> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRanges.html I finally got the program running by calling MatSetUp() before MatGetOwnershipRange(). Still I cannot fight the feeling that I am doing things not as they have been intended by the developers, since preallocation is now done twice. The alternative seems to be to use PetscSplitOwnership() and MPI_Scan() to calculate the row ranges for each process before creating the matrix with MatCreate(). But this leads in any case to a very even distribution of row counts among the processes. Assuming that only the upper triangular part of the symmetric matrix needs to be stored (IS THIS CORRECT?), I would guess that consequently this leads to an imbalance regarding the number of (nonzero) matrix entries owned by each process (Processes with higher rank will own fewer nonzeros). QUESTION 3: For SPD matrices, is it in general a good strategy to have every process owning approximately the same number of rows? (In this case, I can of course forget about PetscSplitOwnership() and MPI_Scan() and do the distribution myself). Thank you and kind regards, Thomas Ponweiser -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Tue May 7 08:36:49 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 07 May 2013 07:36:49 -0600 Subject: [petsc-users] How to assemble a sparse SPD matrix in parallel In-Reply-To: References: Message-ID: <87wqraud1q.fsf@mcs.anl.gov> Thomas Ponweiser writes: > Dear PETSc community! > > I would like to read in a (large) sparse SPD matrix from a file in parallel. More precisely my plan was to do the following: > > 1) Read matrix size N from file. > 2) Create PETSc matrix. > 3) Set option MAT_SPD=PETSC_TRUE. > 4) Set global size N x N, local sizes PETSC_DECIDE. > 5) Read in only those rows from file, which are owned by the local process. > 6) Preallocate the matrix using statistics collected in the previous step. > 7) Insert the values read into the matrix row-by-row. > 8) Begin and finish matrix assembly. 
>
> My problem is in step 5, leading to 3 questions:
>
> QUESTION 1: How can I let PETSc decide, which rows of the global
> matrix will be local to the process BEFORE prealloction?
>
> In the manual pages I have found so far:
> A) MatGetOwnershipRange():
> "requires that the matrix be preallocated".
> B) MatGetOwnershipRanges():
> "Not collective, unless matrix has not been allocated, then collective on Mat"
> However, when running the program, I get the error message: "Must call MatXXXSetPreallocation() or MatSetUp() ... before MatGetOwnershipRanges()!"
>
> QUESTION 2: Is the documentation of MatGetOwnershipRanges() incorrect or am I misinterpreting it?
> -> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRanges.html
>
> I finally got the program running by calling MatSetUp() before
> MatGetOwnershipRange(). Still I cannot fight the feeling that I am
> doing things not as they have been intended by the developers, since
> preallocation is now done twice.

You should use PetscSplitOwnership().  We couldn't do anything more
intelligent anyway, because anything smarter would necessarily be
matrix-dependent.

> The alternative seems to be to use PetscSplitOwnership() and
> MPI_Scan() to calculate the row ranges for each process before
> creating the matrix with MatCreate(). But this leads in any case to a
> very even distribution of row counts among the processes. Assuming
> that only the upper triangular part of the symmetric matrix needs to
> be stored (IS THIS CORRECT?), I would guess that consequently this
> leads to an imbalance regarding the number of (nonzero) matrix entries
> owned by each process (Processes with higher rank will own fewer
> nonzeros).
>
> QUESTION 3: For SPD matrices, is it in general a good strategy to have
> every process owning approximately the same number of rows? (In this
> case, I can of course forget about PetscSplitOwnership() and
> MPI_Scan() and do the distribution myself).

Yes, but this only affects the off-diagonal part (which involves
communication).  For most mesh-based problems, the off-diagonal part is
basically the same size for all processes except those on the trailing
boundary.  It's not a problem to have many balanced ranks and a few
"light" ones.

From frtr at fysik.dtu.dk  Tue May  7 09:20:21 2013
From: frtr at fysik.dtu.dk (Frederik Treue)
Date: Tue, 7 May 2013 16:20:21 +0200
Subject: [petsc-users] PC introduce errors at processor limits?
Message-ID: <1367936421.23563.31.camel@frtr-laptop>

Hi,

I may be overlooking something very obvious here, but:

I'm trying to solve a convection-diffusion problem. The method I use
requires me to solve some Helmholtz and Poisson equations, which is the
time consuming part. In order to reduce this time, I try to use
preconditioners. However, no matter which preconditioner I use (I've
tried PCJACOBI,PCBJACOBI,PCPBJACOBI,PCMG with 2 levels, galerkin) it
introduces errors along the edges of the local domains when using
multiple processors. These errors are small, but they don't converge to
0 as a function of resolution. I have checked with PCNONE, which
eliminates the problem, but becomes unbearably slow.

Is this somehow unavoidable? Or am I making some silly mistake? The code
is somewhat complicated, but if desired, I can try to cook up a
proof-of-(non)concept.

/Frederik Treue

From knepley at gmail.com  Tue May  7 09:23:05 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 7 May 2013 09:23:05 -0500
Subject: [petsc-users] PC introduce errors at processor limits?
In-Reply-To: <1367936421.23563.31.camel@frtr-laptop>
References: <1367936421.23563.31.camel@frtr-laptop>
Message-ID:

On Tue, May 7, 2013 at 9:20 AM, Frederik Treue wrote:

> Hi,
>
> I may be overlooking something very obvious here, but:
>
> I'm trying to solve a convection-diffusion problem. The method I use
> requires me to solve some Helmholtz and Poisson equations, which is the
> time consuming part. In order to reduce this time, I try to use
> preconditioners. However, no matter which preconditioner I use (I've
> tried PCJACOBI,PCBJACOBI,PCPBJACOBI,PCMG with 2 levels, galerkin) it
> introduces errors along the edges of the local domains when using
> multiple processors. These errors are small, but they don't converge to
> 0 as a function of resolution. I have checked with PCNONE, which
> eliminates the problem, but becomes unbearably slow.
>
> Is this somehow unavoidable? Or am I making some silly mistake? The code
> is somewhat complicated, but if desired, I can try to cook up a
> proof-of-(non)concept.


You have made a mistake somewhere, probably in your parallel function
evaluation. For Poisson, see SNES ex5.

   Matt


>
> /Frederik Treue
>
>

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From bsmith at mcs.anl.gov  Tue May  7 13:01:39 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Tue, 7 May 2013 13:01:39 -0500
Subject: [petsc-users] pctype comparison
In-Reply-To: References: <1367920151.23563.17.camel@frtr-laptop>
Message-ID: <180EC9E9-B650-4289-A88B-94991A8422CB@mcs.anl.gov>


On May 7, 2013, at 6:42 AM, Matthew Knepley wrote:

> On Tue, May 7, 2013 at 4:49 AM, Frederik Treue wrote:
> Hi,
>
> is there any way of checking which type of PC you are using in the code?
> Ie. I want to do something like:
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscObjectTypeCompare.html
>
> Matt
>
> ...
> PCType mytype;
> PC pc;
> if (PhaseOfTheMoon==waxing) {
> mytype=PCJacobi;
> } else {
> mytype=PCMG;
> }
> ...
> ierr=PCSetType(pc,mytype);CHKERRQ(ierr);
> if (mytype==PCMG) {
> ierr=PCMGSetLevels(pc,levels,PETSC_NULL);CHKERRQ(ierr);
> ...

   Note also that PETSc is designed to ignore options for types that are not being used. So you do not need the if (mytype == PCMG) { stuff at all; simply write

PCSetFromOptions(pc);
ierr = PCMGSetLevels(pc,levels,PETSC_NULL);CHKERRQ(ierr);  /* will only activate if an MG PC is selected */
ierr = PCFactorSetLevels(pc,2);CHKERRQ(ierr);              /* will only activate if a factorization-based preconditioner is selected */

etc etc

   Barry

> }
> ...
>
> But since mytype is apparently a pointer type, this doesn't work.
>
> /Frederik Treue
>
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener

From bsmith at mcs.anl.gov  Tue May  7 13:04:43 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Tue, 7 May 2013 13:04:43 -0500
Subject: [petsc-users] How to assemble a sparse SPD matrix in parallel
In-Reply-To: References: Message-ID: <57C8E93C-9ED8-4208-8B58-C50B3B944EAC@mcs.anl.gov>


On May 7, 2013, at 7:59 AM, Thomas Ponweiser wrote:

> Dear PETSc community!
>
> I would like to read in a (large) sparse SPD matrix from a file in parallel. More precisely my plan was to do the following:
>
> 1) Read matrix size N from file.
> 2) Create PETSc matrix.
> 3) Set option MAT_SPD=PETSC_TRUE.
> 4) Set global size N x N, local sizes PETSC_DECIDE.
> 5) Read in only those rows from file, which are owned by the local process.

    Having many processes independently reading from different parts of the file will not be efficient using normal UNIX io. This will make it terribly terribly slow.  I recommend writing a sequential program that reads in the matrix and save it with MatView() to a binary viewer then using MatLoad() to read it in in parallel. Thus you do not have to do any complicated coding and will get efficient loading.

   Barry

> 6) Preallocate the matrix using statistics collected in the previous step.
> 7) Insert the values read into the matrix row-by-row.
> 8) Begin and finish matrix assembly.
>
> My problem is in step 5, leading to 3 questions:
>
> QUESTION 1: How can I let PETSc decide, which rows of the global matrix will be local to the process BEFORE prealloction?
>
> In the manual pages I have found so far:
> A) MatGetOwnershipRange():
> "requires that the matrix be preallocated".
> B) MatGetOwnershipRanges():
> "Not collective, unless matrix has not been allocated, then collective on Mat"
> However, when running the program, I get the error message: "Must call MatXXXSetPreallocation() or MatSetUp() ... before MatGetOwnershipRanges()!"
>
> QUESTION 2: Is the documentation of MatGetOwnershipRanges() incorrect or am I misinterpreting it?
> -> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRanges.html
>
> I finally got the program running by calling MatSetUp() before MatGetOwnershipRange(). Still I cannot fight the feeling that I am doing things not as they have been intended by the developers, since preallocation is now done twice.
>
> The alternative seems to be to use PetscSplitOwnership() and MPI_Scan() to calculate the row ranges for each process before creating the matrix with MatCreate(). But this leads in any case to a very even distribution of row counts among the processes. Assuming that only the upper triangular part of the symmetric matrix needs to be stored (IS THIS CORRECT?), I would guess that consequently this leads to an imbalance regarding the number of (nonzero) matrix entries owned by each process (Processes with higher rank will own fewer nonzeros).
>
> QUESTION 3: For SPD matrices, is it in general a good strategy to have every process owning approximately the same number of rows? (In this case, I can of course forget about PetscSplitOwnership() and MPI_Scan() and do the distribution myself).
>
> Thank you and kind regards,
> Thomas Ponweiser

From klaus.zimmermann at physik.uni-freiburg.de  Tue May  7 14:00:54 2013
From: klaus.zimmermann at physik.uni-freiburg.de (Klaus Zimmermann)
Date: Tue, 07 May 2013 21:00:54 +0200
Subject: [petsc-users] VecLoad_HDF5
Message-ID: <51894F66.5030402@physik.uni-freiburg.de>

Hello,

in both releases, 3.2-p7 and 3.3-p6, VecLoad_HDF5 is broken for complex
scalars, albeit for different reasons. It seems fixed since
git commit 37ef07e653527ae6342b8d6042269e12a607f99d.

Will there be more bugfix releases for 3.2 and/or 3.3?
Regards Klaus From jedbrown at mcs.anl.gov Tue May 7 18:53:01 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 07 May 2013 17:53:01 -0600 Subject: [petsc-users] VecLoad_HDF5 In-Reply-To: <51894F66.5030402@physik.uni-freiburg.de> References: <51894F66.5030402@physik.uni-freiburg.de> Message-ID: <87a9o6tkiq.fsf@mcs.anl.gov> Klaus Zimmermann writes: > Hello, > > in both releases, 3.2-p7 and 3.3-p6, VecLoad_HDF5 is broken for complex > scalars, albeit for different reasons. It seems fixed since > git commit 37ef07e653527ae6342b8d6042269e12a607f99d. I believe my rationale for not back-porting it was that the code has evolved somewhat. It could be cherry-picked back to 'maint-3.3' if you need it. > Will there be more bugfix releases for 3.2 and/or 3.3? Probably not, but you can look at branch 'maint' (will become 'maint-3.3' this week when we release petsc-3.4). From frtr at fysik.dtu.dk Wed May 8 03:33:44 2013 From: frtr at fysik.dtu.dk (Frederik Treue) Date: Wed, 8 May 2013 10:33:44 +0200 Subject: [petsc-users] PC introduce errors at processor limits? In-Reply-To: References: <1367936421.23563.31.camel@frtr-laptop> Message-ID: <1368002024.2889.2.camel@frtr-laptop> On Tue, 2013-05-07 at 09:23 -0500, Matthew Knepley wrote: > On Tue, May 7, 2013 at 9:20 AM, Frederik Treue > wrote: > Hi, > > I may be overlooking something very obvious here, but: > > I'm trying to solve a convection-diffusion problem. The method > I use > requires me to solve some Helmholtz and Poisson equations, > which is the > time consuming part. In order to reduce this time, I try to > use > preconditioners. However, no matter which preconditioner I use > (I've > tried PCJACOBI,PCBJACOBI,PCPBJACOBI,PCMG with 2 levels, > galerkin) it > introduces errors along the edges of the local domains when > using > multiple processors. These errors are small, but they don't > converge to > 0 as a function of resolution. I have checked with PCNONE, > which > eliminates the problem, but becomes unbearably slow. > > Is this somehow unavoidable? Or am I making some silly > mistake? The code > is somewhat complicated, but if desired, I can try to cook up > a > proof-of-(non)concept. > > > You have made a mistake somewhere, probably in your parallel function > evaluation. But how can this be? If it works without a preconditioner (which it does), doesn't that prove that the operators are correctly implemented? Or am I missing something something here? /Frederik Treue From jedbrown at mcs.anl.gov Wed May 8 06:22:32 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 08 May 2013 05:22:32 -0600 Subject: [petsc-users] PC introduce errors at processor limits? In-Reply-To: <1367936421.23563.31.camel@frtr-laptop> References: <1367936421.23563.31.camel@frtr-laptop> Message-ID: <87sj1xsolj.fsf@mcs.anl.gov> Frederik Treue writes: > I'm trying to solve a convection-diffusion problem. The method I use > requires me to solve some Helmholtz and Poisson equations, which is the > time consuming part. In order to reduce this time, I try to use > preconditioners. However, no matter which preconditioner I use (I've > tried PCJACOBI,PCBJACOBI,PCPBJACOBI,PCMG with 2 levels, galerkin) it > introduces errors along the edges of the local domains when using > multiple processors. These errors are small, but they don't converge to > 0 as a function of resolution. How "small"? How are you measuring the error? Does it converge to match the internal error as you shrink -ksp_rtol? 
It should not happen with -pc_type jacobi (or pbjacobi) because they don't do anything special at boundaries. From klaus.zimmermann at physik.uni-freiburg.de Wed May 8 06:27:53 2013 From: klaus.zimmermann at physik.uni-freiburg.de (Klaus Zimmermann) Date: Wed, 08 May 2013 13:27:53 +0200 Subject: [petsc-users] VecLoad_HDF5 In-Reply-To: <87a9o6tkiq.fsf@mcs.anl.gov> References: <51894F66.5030402@physik.uni-freiburg.de> <87a9o6tkiq.fsf@mcs.anl.gov> Message-ID: <518A36B9.3060304@physik.uni-freiburg.de> Hello Jed, I think it would be nice to fix those bugs in the stable releases, therefore I did the cherry-picking and created pull requests on bitbucket. If you feel this is too much trouble perhaps a known issues list would be an alternative. To just have it fail with the rather obscure error messages like now can be a bit irritating. Cheers Klaus Am 08.05.2013 01:53, schrieb Jed Brown: > Klaus Zimmermann writes: >> in both releases, 3.2-p7 and 3.3-p6, VecLoad_HDF5 is broken for complex >> scalars, albeit for different reasons. It seems fixed since >> git commit 37ef07e653527ae6342b8d6042269e12a607f99d. > > I believe my rationale for not back-porting it was that the code has > evolved somewhat. It could be cherry-picked back to 'maint-3.3' if you > need it. > >> Will there be more bugfix releases for 3.2 and/or 3.3? > > Probably not, but you can look at branch 'maint' (will become > 'maint-3.3' this week when we release petsc-3.4). > From jedbrown at mcs.anl.gov Wed May 8 07:28:56 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 08 May 2013 06:28:56 -0600 Subject: [petsc-users] VecLoad_HDF5 In-Reply-To: <518A36B9.3060304@physik.uni-freiburg.de> References: <51894F66.5030402@physik.uni-freiburg.de> <87a9o6tkiq.fsf@mcs.anl.gov> <518A36B9.3060304@physik.uni-freiburg.de> Message-ID: <87fvxxsliv.fsf@mcs.anl.gov> Klaus Zimmermann writes: > Hello Jed, > > I think it would be nice to fix those bugs in the stable releases, > therefore I did the cherry-picking and created pull requests on > bitbucket. If you feel this is too much trouble perhaps a known issues > list would be an alternative. To just have it fail with the rather > obscure error messages like now can be a bit irritating. Thanks, I merged the one for 'maint' and commented on the other one. Bitbucket pull requests should really have a way to be updated without marking them as rejected. From knepley at gmail.com Wed May 8 07:40:20 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 8 May 2013 07:40:20 -0500 Subject: [petsc-users] PC introduce errors at processor limits? In-Reply-To: <1368002024.2889.2.camel@frtr-laptop> References: <1367936421.23563.31.camel@frtr-laptop> <1368002024.2889.2.camel@frtr-laptop> Message-ID: On Wed, May 8, 2013 at 3:33 AM, Frederik Treue wrote: > On Tue, 2013-05-07 at 09:23 -0500, Matthew Knepley wrote: > > On Tue, May 7, 2013 at 9:20 AM, Frederik Treue > > wrote: > > Hi, > > > > I may be overlooking something very obvious here, but: > > > > I'm trying to solve a convection-diffusion problem. The method > > I use > > requires me to solve some Helmholtz and Poisson equations, > > which is the > > time consuming part. In order to reduce this time, I try to > > use > > preconditioners. However, no matter which preconditioner I use > > (I've > > tried PCJACOBI,PCBJACOBI,PCPBJACOBI,PCMG with 2 levels, > > galerkin) it > > introduces errors along the edges of the local domains when > > using > > multiple processors. These errors are small, but they don't > > converge to > > 0 as a function of resolution. 
I have checked with PCNONE,
> > which
> > eliminates the problem, but becomes unbearably slow.
> >
> > Is this somehow unavoidable? Or am I making some silly
> > mistake? The code
> > is somewhat complicated, but if desired, I can try to cook up
> > a
> > proof-of-(non)concept.
> >
> >
> > You have made a mistake somewhere, probably in your parallel function
> > evaluation.
>
> But how can this be? If it works without a preconditioner (which it
> does), doesn't that prove that the operators are correctly implemented?
> Or am I missing something something here?


Here is my reasoning:

1) Examples work for you

2) Jacobi is identical in serial and parallel

3) This part of PETSc is tested by thousands of people every day, not to
mention all regression tests, and has been stable for a decade at least.

Thus my conclusion is that you have an error in the parallel residual
evaluation. My advice is to take a working example and slowly change it to
get to your equation.

   Matt


>
> /Frederik Treue
>
>

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From zhenglun.wei at gmail.com  Wed May  8 18:26:44 2013
From: zhenglun.wei at gmail.com (Zhenglun (Alan) Wei)
Date: Wed, 08 May 2013 18:26:44 -0500
Subject: [petsc-users] CG or GMRES
Message-ID: <518ADF34.9010109@gmail.com>

Dear folks,
     I hope you're having a nice day.
     For the Poisson solver in /src/ksp/ksp/example/tutorial/ex45.c, I used
the ksp_type = CG to solve it before; it converges very fast with
pc_type = gamg. However, I was trying to check if the matrix generated
by the 'ComputeMatrix' is symmetric by using "ierr = MatIsSymmetric(B,
tol, &flg);". It shows that this matrix is not exact a symmetric one by
setting tol = 0.0. Yet, the matrix is 'symmetric' if the tol > 0.01.
Does this mean that, even if the matrix is not exact symmetric, the CG
could still be used.
     This brings me a question. Can the CG be used to solve an actual
unsymmetric matrix as long as 'MatIsSymmetric' returns a 'PETSC_TRUE'
value with certain tolerance. Is there any rule of thumb for this
tolerence? Also, as a preconditioner, does 'gamg' only work for
symmetric positive-definite matrix? or it works for any matrix or even
with GMRES?
thanks, Alan From jedbrown at mcs.anl.gov Wed May 8 19:25:25 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 08 May 2013 18:25:25 -0600 Subject: [petsc-users] CG or GMRES In-Reply-To: <518ADF34.9010109@gmail.com> References: <518ADF34.9010109@gmail.com> Message-ID: <87k3n9ov7u.fsf@mcs.anl.gov> "Zhenglun (Alan) Wei" writes: > Dear folks, > I hope you're having a nice day. > For the Poisson solver in /src/ksp/ksp/example/tutorial/ex45.c, I used > the ksp_type = CG to solve it before; it converges very fast with > pc_type = gamg. However, I was trying to check if the matrix generated > by the 'ComputeMatrix' is symmetric by using "ierr = MatIsSymmetric(B, > tol, &flg);". It shows that this matrix is not exact a symmetric one by > setting tol = 0.0. Yet, the matrix is 'symmetric' if the tol > 0.01. The matrix does not enforce boundary conditions symmetrically. > Does this mean that, even if the matrix is not exact symmetric, the CG > could still be used. You happen to be iterating in a "benign" space in which the operator is SPD. > This brings me a question. Can the CG be used to solve an actual > unsymmetric matrix as long as 'MatIsSymmetric' returns a 'PETSC_TRUE' > value with certain tolerance. No. > Is there any rule of thumb for this tolerence? Also, as a > preconditioner, does 'gamg' only work for symmetric positive-definite > matrix? or it works for any matrix or even with GMRES? It works for many moderately non-symmetric, certainly for something that only has non-symmetric boundary conditions. From dharmareddy84 at gmail.com Wed May 8 20:21:34 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Wed, 8 May 2013 20:21:34 -0500 Subject: [petsc-users] vtkviewer with cloned dm Message-ID: Hello, I am seeing an error when i use vtkviewer with a cloned dm. Am i doing some thing worng ? Please see the attached test case. [0]PETSC ERROR: --------------------- Error Message ---------------------------- -------- [0]PETSC ERROR: Null argument, when expecting valid pointer! [0]PETSC ERROR: Null Object: Parameter # 1! [0]PETSC ERROR: ---------------------------------------------------------------- -------- [0]PETSC ERROR: Petsc Development GIT revision: 2ba5cda9c63f067922ac725686fc49a5 014f5bad GIT Date: 2013-05-05 14:03:36 -0600 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. 
[0]PETSC ERROR: ---------------------------------------------------------------- -------- [0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named login3.stampede.tacc.u texas.edu by Reddy135 Wed May 8 19:53:42 2013 [0]PETSC ERROR: Libraries linked from /home1/00924/Reddy135/LocalApps/petsc/mpi_ rScalar_Debug/lib [0]PETSC ERROR: Configure run at Sun May 5 23:15:58 2013 [0]PETSC ERROR: Configure options --download-blacs=1 --download-ctetgen=1 --do wnload-metis=1 --download-mumps=1 --download-parmetis=1 --download-scalapack=1 - -download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 --with-blas- lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ --with-deb ugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ --with-petsc-arch=mpi_rS calar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc PETSC_ARCH=mp i_rScalar_Debug [0]PETSC ERROR: ---------------------------------------------------------------- -------- [0]PETSC ERROR: VecGetArrayRead() line 1488 in /home1/00924/Reddy135/LocalApps/p etsc/src/vec/vec/interface/rvector.c [0]PETSC ERROR: DMPlexVTKWriteAll_VTU() line 300 in /home1/00924/Reddy135/LocalA pps/petsc/src/dm/impls/plex/plexvtu.c [0]PETSC ERROR: DMPlexVTKWriteAll() line 606 in /home1/00924/Reddy135/LocalApps/ petsc/src/dm/impls/plex/plexvtk.c [0]PETSC ERROR: PetscViewerFlush_VTK() line 78 in /home1/00924/Reddy135/LocalApp s/petsc/src/sys/classes/viewer/impls/vtk/vtkv.c [0]PETSC ERROR: PetscViewerFlush() line 30 in /home1/00924/Reddy135/LocalApps/pe tsc/src/sys/classes/viewer/interface/flush.c [0]PETSC ERROR: PetscViewerDestroy() line 100 in /home1/00924/Reddy135/LocalApps /petsc/src/sys/classes/viewer/interface/view.c -- ----------------------------------------------------- Dharmendar Reddy Palle Graduate Student Microelectronics Research center, University of Texas at Austin, 10100 Burnet Road, Bldg. 160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: testPlexVTU.F90 Type: application/octet-stream Size: 1935 bytes Desc: not available URL: From dharmareddy84 at gmail.com Thu May 9 06:33:30 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 9 May 2013 06:33:30 -0500 Subject: [petsc-users] vtkviewer with cloned dm In-Reply-To: References: Message-ID: Hello, Looks like the call DMGetCoordinatesLocal(clonedDM,coords,ierr) is giving a null coords Vec. DMView on clonedDM is printing the expected information Mesh in 2 dimensions: 0-cells: 9 1-cells: 16 2-cells: 8 Labels: marker: 2 strata of sizes (16, 5) depth: 3 strata of sizes (9, 16, 8) the original dm is created using DMPlexCreateBoxMesh. On Wed, May 8, 2013 at 8:21 PM, Dharmendar Reddy wrote: > Hello, > I am seeing an error when i use vtkviewer with a cloned dm. Am i > doing some thing worng ? Please see the attached test case. > > [0]PETSC ERROR: --------------------- Error Message > ---------------------------- > -------- > [0]PETSC ERROR: Null argument, when expecting valid pointer! > [0]PETSC ERROR: Null Object: Parameter # 1! 
> [0]PETSC ERROR: > ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: Petsc Development GIT revision: > 2ba5cda9c63f067922ac725686fc49a5 > 014f5bad GIT Date: 2013-05-05 14:03:36 -0600 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named > login3.stampede.tacc.u > texas.edu by Reddy135 Wed May 8 19:53:42 2013 > [0]PETSC ERROR: Libraries linked from > /home1/00924/Reddy135/LocalApps/petsc/mpi_ > rScalar_Debug/lib > [0]PETSC ERROR: Configure run at Sun May 5 23:15:58 2013 > [0]PETSC ERROR: Configure options --download-blacs=1 > --download-ctetgen=1 --do > wnload-metis=1 --download-mumps=1 --download-parmetis=1 > --download-scalapack=1 - > -download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 > --with-blas- > lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ > --with-deb > ugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ > --with-petsc-arch=mpi_rS > calar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc > PETSC_ARCH=mp > i_rScalar_Debug > [0]PETSC ERROR: > ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: VecGetArrayRead() line 1488 in > /home1/00924/Reddy135/LocalApps/p > etsc/src/vec/vec/interface/rvector.c > [0]PETSC ERROR: DMPlexVTKWriteAll_VTU() line 300 in > /home1/00924/Reddy135/LocalA > pps/petsc/src/dm/impls/plex/plexvtu.c > [0]PETSC ERROR: DMPlexVTKWriteAll() line 606 in > /home1/00924/Reddy135/LocalApps/ > petsc/src/dm/impls/plex/plexvtk.c > [0]PETSC ERROR: PetscViewerFlush_VTK() line 78 in > /home1/00924/Reddy135/LocalApp > s/petsc/src/sys/classes/viewer/impls/vtk/vtkv.c > [0]PETSC ERROR: PetscViewerFlush() line 30 in > /home1/00924/Reddy135/LocalApps/pe > tsc/src/sys/classes/viewer/interface/flush.c > [0]PETSC ERROR: PetscViewerDestroy() line 100 in > /home1/00924/Reddy135/LocalApps > /petsc/src/sys/classes/viewer/interface/view.c > > -- > ----------------------------------------------------- > Dharmendar Reddy Palle > Graduate Student > Microelectronics Research center, > University of Texas at Austin, > 10100 Burnet Road, Bldg. 160 > MER 2.608F, TX 78758-4445 > e-mail: dharmareddy84 at gmail.com > Phone: +1-512-350-9082 > United States of America. > Homepage: https://webspace.utexas.edu/~dpr342 > -- ----------------------------------------------------- Dharmendar Reddy Palle Graduate Student Microelectronics Research center, University of Texas at Austin, 10100 Burnet Road, Bldg. 160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu May 9 06:46:10 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 9 May 2013 06:46:10 -0500 Subject: [petsc-users] vtkviewer with cloned dm In-Reply-To: References: Message-ID: On Thu, May 9, 2013 at 6:33 AM, Dharmendar Reddy wrote: > Hello, > Looks like the call DMGetCoordinatesLocal(clonedDM,coords,ierr) > is giving a null coords Vec. 
DMView on clonedDM is printing the expected > information > > Mesh in 2 dimensions: > 0-cells: 9 > 1-cells: 16 > 2-cells: 8 > Labels: > marker: 2 strata of sizes (16, 5) > depth: 3 strata of sizes (9, 16, 8) > > the original dm is created using DMPlexCreateBoxMesh. > The semantics for Clone have not been nailed down. I can copy over the coordinates. I am not sure it will make the release, but I will put it in next. Thanks, Matt > > On Wed, May 8, 2013 at 8:21 PM, Dharmendar Reddy wrote: > >> Hello, >> I am seeing an error when i use vtkviewer with a cloned dm. Am i >> doing some thing worng ? Please see the attached test case. >> >> [0]PETSC ERROR: --------------------- Error Message >> ---------------------------- >> -------- >> [0]PETSC ERROR: Null argument, when expecting valid pointer! >> [0]PETSC ERROR: Null Object: Parameter # 1! >> [0]PETSC ERROR: >> ---------------------------------------------------------------- >> -------- >> [0]PETSC ERROR: Petsc Development GIT revision: >> 2ba5cda9c63f067922ac725686fc49a5 >> 014f5bad GIT Date: 2013-05-05 14:03:36 -0600 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: >> ---------------------------------------------------------------- >> -------- >> [0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named >> login3.stampede.tacc.u >> texas.edu by Reddy135 Wed May 8 19:53:42 2013 >> [0]PETSC ERROR: Libraries linked from >> /home1/00924/Reddy135/LocalApps/petsc/mpi_ >> rScalar_Debug/lib >> [0]PETSC ERROR: Configure run at Sun May 5 23:15:58 2013 >> [0]PETSC ERROR: Configure options --download-blacs=1 >> --download-ctetgen=1 --do >> wnload-metis=1 --download-mumps=1 --download-parmetis=1 >> --download-scalapack=1 - >> -download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 >> --with-blas- >> lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ >> --with-deb >> ugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ >> --with-petsc-arch=mpi_rS >> calar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc >> PETSC_ARCH=mp >> i_rScalar_Debug >> [0]PETSC ERROR: >> ---------------------------------------------------------------- >> -------- >> [0]PETSC ERROR: VecGetArrayRead() line 1488 in >> /home1/00924/Reddy135/LocalApps/p >> etsc/src/vec/vec/interface/rvector.c >> [0]PETSC ERROR: DMPlexVTKWriteAll_VTU() line 300 in >> /home1/00924/Reddy135/LocalA >> pps/petsc/src/dm/impls/plex/plexvtu.c >> [0]PETSC ERROR: DMPlexVTKWriteAll() line 606 in >> /home1/00924/Reddy135/LocalApps/ >> petsc/src/dm/impls/plex/plexvtk.c >> [0]PETSC ERROR: PetscViewerFlush_VTK() line 78 in >> /home1/00924/Reddy135/LocalApp >> s/petsc/src/sys/classes/viewer/impls/vtk/vtkv.c >> [0]PETSC ERROR: PetscViewerFlush() line 30 in >> /home1/00924/Reddy135/LocalApps/pe >> tsc/src/sys/classes/viewer/interface/flush.c >> [0]PETSC ERROR: PetscViewerDestroy() line 100 in >> /home1/00924/Reddy135/LocalApps >> /petsc/src/sys/classes/viewer/interface/view.c >> >> -- >> ----------------------------------------------------- >> Dharmendar Reddy Palle >> Graduate Student >> Microelectronics Research center, >> University of Texas at Austin, >> 10100 Burnet Road, Bldg. 160 >> MER 2.608F, TX 78758-4445 >> e-mail: dharmareddy84 at gmail.com >> Phone: +1-512-350-9082 >> United States of America. 
>> Homepage: https://webspace.utexas.edu/~dpr342 >> > > > > -- > ----------------------------------------------------- > Dharmendar Reddy Palle > Graduate Student > Microelectronics Research center, > University of Texas at Austin, > 10100 Burnet Road, Bldg. 160 > MER 2.608F, TX 78758-4445 > e-mail: dharmareddy84 at gmail.com > Phone: +1-512-350-9082 > United States of America. > Homepage: https://webspace.utexas.edu/~dpr342 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From dharmareddy84 at gmail.com Thu May 9 06:52:50 2013 From: dharmareddy84 at gmail.com (Dharmendar Reddy) Date: Thu, 9 May 2013 06:52:50 -0500 Subject: [petsc-users] vtkviewer with cloned dm In-Reply-To: References: Message-ID: On Thu, May 9, 2013 at 6:46 AM, Matthew Knepley wrote: > On Thu, May 9, 2013 at 6:33 AM, Dharmendar Reddy wrote: > >> Hello, >> Looks like the call DMGetCoordinatesLocal(clonedDM,coords,ierr) >> is giving a null coords Vec. DMView on clonedDM is printing the expected >> information >> >> Mesh in 2 dimensions: >> 0-cells: 9 >> 1-cells: 16 >> 2-cells: 8 >> Labels: >> marker: 2 strata of sizes (16, 5) >> depth: 3 strata of sizes (9, 16, 8) >> >> the original dm is created using DMPlexCreateBoxMesh. >> > > The semantics for Clone have not been nailed down. I can copy over the > coordinates. I am not sure it will make > the release, but I will put it in next. > > That will help for now. On a related note, when the parent dm is refined or distributed does cloned dm also get refined or distributed or do we need to call the clone after refining or distributing the parent dm ? > Thanks, > > Matt > > >> >> On Wed, May 8, 2013 at 8:21 PM, Dharmendar Reddy > > wrote: >> >>> Hello, >>> I am seeing an error when i use vtkviewer with a cloned dm. Am >>> i doing some thing worng ? Please see the attached test case. >>> >>> [0]PETSC ERROR: --------------------- Error Message >>> ---------------------------- >>> -------- >>> [0]PETSC ERROR: Null argument, when expecting valid pointer! >>> [0]PETSC ERROR: Null Object: Parameter # 1! >>> [0]PETSC ERROR: >>> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: Petsc Development GIT revision: >>> 2ba5cda9c63f067922ac725686fc49a5 >>> 014f5bad GIT Date: 2013-05-05 14:03:36 -0600 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>> [0]PETSC ERROR: >>> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named >>> login3.stampede.tacc.u >>> texas.edu by Reddy135 Wed May 8 19:53:42 2013 >>> [0]PETSC ERROR: Libraries linked from >>> /home1/00924/Reddy135/LocalApps/petsc/mpi_ >>> rScalar_Debug/lib >>> [0]PETSC ERROR: Configure run at Sun May 5 23:15:58 2013 >>> [0]PETSC ERROR: Configure options --download-blacs=1 >>> --download-ctetgen=1 --do >>> wnload-metis=1 --download-mumps=1 --download-parmetis=1 >>> --download-scalapack=1 - >>> -download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 >>> --with-blas- >>> lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ >>> --with-deb >>> ugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ >>> --with-petsc-arch=mpi_rS >>> calar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc >>> PETSC_ARCH=mp >>> i_rScalar_Debug >>> [0]PETSC ERROR: >>> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: VecGetArrayRead() line 1488 in >>> /home1/00924/Reddy135/LocalApps/p >>> etsc/src/vec/vec/interface/rvector.c >>> [0]PETSC ERROR: DMPlexVTKWriteAll_VTU() line 300 in >>> /home1/00924/Reddy135/LocalA >>> pps/petsc/src/dm/impls/plex/plexvtu.c >>> [0]PETSC ERROR: DMPlexVTKWriteAll() line 606 in >>> /home1/00924/Reddy135/LocalApps/ >>> petsc/src/dm/impls/plex/plexvtk.c >>> [0]PETSC ERROR: PetscViewerFlush_VTK() line 78 in >>> /home1/00924/Reddy135/LocalApp >>> s/petsc/src/sys/classes/viewer/impls/vtk/vtkv.c >>> [0]PETSC ERROR: PetscViewerFlush() line 30 in >>> /home1/00924/Reddy135/LocalApps/pe >>> tsc/src/sys/classes/viewer/interface/flush.c >>> [0]PETSC ERROR: PetscViewerDestroy() line 100 in >>> /home1/00924/Reddy135/LocalApps >>> /petsc/src/sys/classes/viewer/interface/view.c >>> >>> -- >>> ----------------------------------------------------- >>> Dharmendar Reddy Palle >>> Graduate Student >>> Microelectronics Research center, >>> University of Texas at Austin, >>> 10100 Burnet Road, Bldg. 160 >>> MER 2.608F, TX 78758-4445 >>> e-mail: dharmareddy84 at gmail.com >>> Phone: +1-512-350-9082 >>> United States of America. >>> Homepage: https://webspace.utexas.edu/~dpr342 >>> >> >> >> >> -- >> ----------------------------------------------------- >> Dharmendar Reddy Palle >> Graduate Student >> Microelectronics Research center, >> University of Texas at Austin, >> 10100 Burnet Road, Bldg. 160 >> MER 2.608F, TX 78758-4445 >> e-mail: dharmareddy84 at gmail.com >> Phone: +1-512-350-9082 >> United States of America. >> Homepage: https://webspace.utexas.edu/~dpr342 >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- ----------------------------------------------------- Dharmendar Reddy Palle Graduate Student Microelectronics Research center, University of Texas at Austin, 10100 Burnet Road, Bldg. 160 MER 2.608F, TX 78758-4445 e-mail: dharmareddy84 at gmail.com Phone: +1-512-350-9082 United States of America. Homepage: https://webspace.utexas.edu/~dpr342 -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu May 9 06:56:16 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 9 May 2013 06:56:16 -0500 Subject: [petsc-users] vtkviewer with cloned dm In-Reply-To: References: Message-ID: On Thu, May 9, 2013 at 6:52 AM, Dharmendar Reddy wrote: > > > > On Thu, May 9, 2013 at 6:46 AM, Matthew Knepley wrote: > >> On Thu, May 9, 2013 at 6:33 AM, Dharmendar Reddy > > wrote: >> >>> Hello, >>> Looks like the call DMGetCoordinatesLocal(clonedDM,coords,ierr) >>> is giving a null coords Vec. DMView on clonedDM is printing the expected >>> information >>> >>> Mesh in 2 dimensions: >>> 0-cells: 9 >>> 1-cells: 16 >>> 2-cells: 8 >>> Labels: >>> marker: 2 strata of sizes (16, 5) >>> depth: 3 strata of sizes (9, 16, 8) >>> >>> the original dm is created using DMPlexCreateBoxMesh. >>> >> >> The semantics for Clone have not been nailed down. I can copy over the >> coordinates. I am not sure it will make >> the release, but I will put it in next. >> >> That will help for now. On a related note, when the parent dm is refined > or distributed does cloned dm also get refined or distributed or do we > need to call the clone after refining or distributing the parent dm ? > No. Here the model is clear, and I think its the right one. Distribution, refinement, interpolation, etc. all create a new DM, thus any clones are pointing at the old DM. Matt > > >> Thanks, >> >> Matt >> >> >>> >>> On Wed, May 8, 2013 at 8:21 PM, Dharmendar Reddy < >>> dharmareddy84 at gmail.com> wrote: >>> >>>> Hello, >>>> I am seeing an error when i use vtkviewer with a cloned dm. Am >>>> i doing some thing worng ? Please see the attached test case. >>>> >>>> [0]PETSC ERROR: --------------------- Error Message >>>> ---------------------------- >>>> -------- >>>> [0]PETSC ERROR: Null argument, when expecting valid pointer! >>>> [0]PETSC ERROR: Null Object: Parameter # 1! >>>> [0]PETSC ERROR: >>>> ---------------------------------------------------------------- >>>> -------- >>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>> 2ba5cda9c63f067922ac725686fc49a5 >>>> 014f5bad GIT Date: 2013-05-05 14:03:36 -0600 >>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>> [0]PETSC ERROR: >>>> ---------------------------------------------------------------- >>>> -------- >>>> [0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named >>>> login3.stampede.tacc.u >>>> texas.edu by Reddy135 Wed May 8 19:53:42 2013 >>>> [0]PETSC ERROR: Libraries linked from >>>> /home1/00924/Reddy135/LocalApps/petsc/mpi_ >>>> rScalar_Debug/lib >>>> [0]PETSC ERROR: Configure run at Sun May 5 23:15:58 2013 >>>> [0]PETSC ERROR: Configure options --download-blacs=1 >>>> --download-ctetgen=1 --do >>>> wnload-metis=1 --download-mumps=1 --download-parmetis=1 >>>> --download-scalapack=1 - >>>> -download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 >>>> --with-blas- >>>> lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ >>>> --with-deb >>>> ugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ >>>> --with-petsc-arch=mpi_rS >>>> calar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc >>>> PETSC_ARCH=mp >>>> i_rScalar_Debug >>>> [0]PETSC ERROR: >>>> ---------------------------------------------------------------- >>>> -------- >>>> [0]PETSC ERROR: VecGetArrayRead() line 1488 in >>>> /home1/00924/Reddy135/LocalApps/p >>>> etsc/src/vec/vec/interface/rvector.c >>>> [0]PETSC ERROR: DMPlexVTKWriteAll_VTU() line 300 in >>>> /home1/00924/Reddy135/LocalA >>>> pps/petsc/src/dm/impls/plex/plexvtu.c >>>> [0]PETSC ERROR: DMPlexVTKWriteAll() line 606 in >>>> /home1/00924/Reddy135/LocalApps/ >>>> petsc/src/dm/impls/plex/plexvtk.c >>>> [0]PETSC ERROR: PetscViewerFlush_VTK() line 78 in >>>> /home1/00924/Reddy135/LocalApp >>>> s/petsc/src/sys/classes/viewer/impls/vtk/vtkv.c >>>> [0]PETSC ERROR: PetscViewerFlush() line 30 in >>>> /home1/00924/Reddy135/LocalApps/pe >>>> tsc/src/sys/classes/viewer/interface/flush.c >>>> [0]PETSC ERROR: PetscViewerDestroy() line 100 in >>>> /home1/00924/Reddy135/LocalApps >>>> /petsc/src/sys/classes/viewer/interface/view.c >>>> >>>> -- >>>> ----------------------------------------------------- >>>> Dharmendar Reddy Palle >>>> Graduate Student >>>> Microelectronics Research center, >>>> University of Texas at Austin, >>>> 10100 Burnet Road, Bldg. 160 >>>> MER 2.608F, TX 78758-4445 >>>> e-mail: dharmareddy84 at gmail.com >>>> Phone: +1-512-350-9082 >>>> United States of America. >>>> Homepage: https://webspace.utexas.edu/~dpr342 >>>> >>> >>> >>> >>> -- >>> ----------------------------------------------------- >>> Dharmendar Reddy Palle >>> Graduate Student >>> Microelectronics Research center, >>> University of Texas at Austin, >>> 10100 Burnet Road, Bldg. 160 >>> MER 2.608F, TX 78758-4445 >>> e-mail: dharmareddy84 at gmail.com >>> Phone: +1-512-350-9082 >>> United States of America. >>> Homepage: https://webspace.utexas.edu/~dpr342 >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > ----------------------------------------------------- > Dharmendar Reddy Palle > Graduate Student > Microelectronics Research center, > University of Texas at Austin, > 10100 Burnet Road, Bldg. 160 > MER 2.608F, TX 78758-4445 > e-mail: dharmareddy84 at gmail.com > Phone: +1-512-350-9082 > United States of America. > Homepage: https://webspace.utexas.edu/~dpr342 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From dharmareddy84 at gmail.com  Thu May 9 07:01:45 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Thu, 9 May 2013 07:01:45 -0500
Subject: [petsc-users] fortran binding DMPlexCreateSectionInitial
Message-ID: 

Hello,
         I know I can use DMPlexCreateSection with numBC=0; I was
wondering if you can provide a FORTRAN binding to DMPlexCreateSectionInitial.

My thinking for the above is as follows:
A section is not always related to a problem with boundary conditions. I
just need to lay out some variables for access.

Thanks
Reddy

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Thu May 9 07:23:41 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Thu, 9 May 2013 07:23:41 -0500
Subject: [petsc-users] fortran binding DMPlexCreateSectionInitial
In-Reply-To: 
References: 
Message-ID: 

On Thu, May 9, 2013 at 7:01 AM, Dharmendar Reddy wrote:

> Hello,
>          I know I can use DMPlexCreateSection with numBC=0; I was
> wondering if you can provide a FORTRAN binding to DMPlexCreateSectionInitial.
>
> My thinking for the above is as follows:
> A section is not always related to a problem with boundary conditions. I
> just need to lay out some variables for access.
>

I was not planning on exposing that even for C. For simple layouts, I
think it's better just to use PetscSection directly.

   Matt

> Thanks
> Reddy
>
> -- 
> -----------------------------------------------------
> Dharmendar Reddy Palle
> Graduate Student
> Microelectronics Research center,
> University of Texas at Austin,
> 10100 Burnet Road, Bldg. 160
> MER 2.608F, TX 78758-4445
> e-mail: dharmareddy84 at gmail.com
> Phone: +1-512-350-9082
> United States of America.
> Homepage: https://webspace.utexas.edu/~dpr342
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From dharmareddy84 at gmail.com  Thu May 9 08:21:49 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Thu, 9 May 2013 08:21:49 -0500
Subject: [petsc-users] fortran binding DMPlexCreateSectionInitial
In-Reply-To: 
References: 
Message-ID: 

On Thu, May 9, 2013 at 7:23 AM, Matthew Knepley wrote:

> On Thu, May 9, 2013 at 7:01 AM, Dharmendar Reddy wrote:
>
>> Hello,
>>          I know I can use DMPlexCreateSection with numBC=0; I was
>> wondering if you can provide a FORTRAN binding to DMPlexCreateSectionInitial.
>>
>> My thinking for the above is as follows:
>> A section is not always related to a problem with boundary conditions. I
>> just need to lay out some variables for access.
>>
>
> I was not planning on exposing that even for C. For simple layouts, I
> think it's better just to use PetscSection directly.
>

Ok.

> Matt
>
>> Thanks
>> Reddy
>>
>> -- 
>> -----------------------------------------------------
>> Dharmendar Reddy Palle
>> Graduate Student
>> Microelectronics Research center,
>> University of Texas at Austin,
>> 10100 Burnet Road, Bldg. 160
>> MER 2.608F, TX 78758-4445
>> e-mail: dharmareddy84 at gmail.com
>> Phone: +1-512-350-9082
>> United States of America.
>> Homepage: https://webspace.utexas.edu/~dpr342
>>
>
> -- 
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
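For readers wanting a concrete picture of the advice above: a simple layout built with PetscSection directly might look roughly like the C sketch below. The chart/dof choices are illustrative assumptions, not code from this thread:

    /* Minimal sketch: a hand-built section giving 1 dof per vertex of a
       DMPlex, roughly what the simplest DMPlexCreateSection() call does
       with numBC = 0. */
    PetscSection   s;
    PetscInt       pStart, pEnd, vStart, vEnd, p;
    PetscErrorCode ierr;

    ierr = PetscSectionCreate(PETSC_COMM_WORLD, &s);CHKERRQ(ierr);
    ierr = DMPlexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);           /* all mesh points */
    ierr = PetscSectionSetChart(s, pStart, pEnd);CHKERRQ(ierr);
    ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); /* the vertices */
    for (p = vStart; p < vEnd; ++p) {
      ierr = PetscSectionSetDof(s, p, 1);CHKERRQ(ierr);                /* 1 dof per vertex */
    }
    ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
    ierr = DMSetDefaultSection(dm, s);CHKERRQ(ierr);

With the section attached, DMCreateGlobalVector()/DMCreateLocalVector() lay vectors out over it, with no boundary-condition machinery involved.
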
From dharmareddy84 at gmail.com  Thu May 9 10:17:24 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Thu, 9 May 2013 10:17:24 -0500
Subject: [petsc-users] vtkviewer with cloned dm
In-Reply-To: 
References: 
Message-ID: 

Hello,
        Looks like the fix created other issues. I am getting errors
from DMPlexCreateBoxMesh. I have attached the test code and error log.

On Thu, May 9, 2013 at 6:56 AM, Matthew Knepley wrote:

> On Thu, May 9, 2013 at 6:52 AM, Dharmendar Reddy wrote:
>
>> On Thu, May 9, 2013 at 6:46 AM, Matthew Knepley wrote:
>>
>>> On Thu, May 9, 2013 at 6:33 AM, Dharmendar Reddy <
>>> dharmareddy84 at gmail.com> wrote:
>>>
>>>> Hello,
>>>>           Looks like the call
>>>> DMGetCoordinatesLocal(clonedDM,coords,ierr) is giving a null coords Vec.
>>>> DMView on clonedDM is printing the expected information
>>>>
>>>> Mesh in 2 dimensions:
>>>>   0-cells: 9
>>>>   1-cells: 16
>>>>   2-cells: 8
>>>> Labels:
>>>>   marker: 2 strata of sizes (16, 5)
>>>>   depth: 3 strata of sizes (9, 16, 8)
>>>>
>>>> the original dm is created using DMPlexCreateBoxMesh.
>>>>
>>>
>>> The semantics for Clone have not been nailed down. I can copy over the
>>> coordinates. I am not sure it will make
>>> the release, but I will put it in next.
>>>
>> That will help for now. On a related note, when the parent dm is
>> refined or distributed does cloned dm also get refined or distributed or
>> do we need to call the clone after refining or distributing the parent dm ?
>>
>
> No. Here the model is clear, and I think its the right one. Distribution,
> refinement, interpolation, etc. all create
> a new DM, thus any clones are pointing at the old DM.
>
>    Matt
>
>>> Thanks,
>>>
>>>    Matt
>>>
>>>> On Wed, May 8, 2013 at 8:21 PM, Dharmendar Reddy <
>>>> dharmareddy84 at gmail.com> wrote:
>>>>
>>>>> Hello,
>>>>>          I am seeing an error when i use vtkviewer with a cloned dm.
>>>>> Am i doing some thing worng ? Please see the attached test case.
>>>>>
>>>>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
>>>>> [0]PETSC ERROR: Null argument, when expecting valid pointer!
>>>>> [0]PETSC ERROR: Null Object: Parameter # 1!
>>>>> [0]PETSC ERROR: ------------------------------------------------------------------------
>>>>> [0]PETSC ERROR: Petsc Development GIT revision: 2ba5cda9c63f067922ac725686fc49a5014f5bad GIT Date: 2013-05-05 14:03:36 -0600
>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>>>> [0]PETSC ERROR: >>>>> ---------------------------------------------------------------- >>>>> -------- >>>>> [0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named >>>>> login3.stampede.tacc.u >>>>> texas.edu by Reddy135 Wed May 8 19:53:42 2013 >>>>> [0]PETSC ERROR: Libraries linked from >>>>> /home1/00924/Reddy135/LocalApps/petsc/mpi_ >>>>> rScalar_Debug/lib >>>>> [0]PETSC ERROR: Configure run at Sun May 5 23:15:58 2013 >>>>> [0]PETSC ERROR: Configure options --download-blacs=1 >>>>> --download-ctetgen=1 --do >>>>> wnload-metis=1 --download-mumps=1 --download-parmetis=1 >>>>> --download-scalapack=1 - >>>>> -download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 >>>>> --with-blas- >>>>> lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ >>>>> --with-deb >>>>> ugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ >>>>> --with-petsc-arch=mpi_rS >>>>> calar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc >>>>> PETSC_ARCH=mp >>>>> i_rScalar_Debug >>>>> [0]PETSC ERROR: >>>>> ---------------------------------------------------------------- >>>>> -------- >>>>> [0]PETSC ERROR: VecGetArrayRead() line 1488 in >>>>> /home1/00924/Reddy135/LocalApps/p >>>>> etsc/src/vec/vec/interface/rvector.c >>>>> [0]PETSC ERROR: DMPlexVTKWriteAll_VTU() line 300 in >>>>> /home1/00924/Reddy135/LocalA >>>>> pps/petsc/src/dm/impls/plex/plexvtu.c >>>>> [0]PETSC ERROR: DMPlexVTKWriteAll() line 606 in >>>>> /home1/00924/Reddy135/LocalApps/ >>>>> petsc/src/dm/impls/plex/plexvtk.c >>>>> [0]PETSC ERROR: PetscViewerFlush_VTK() line 78 in >>>>> /home1/00924/Reddy135/LocalApp >>>>> s/petsc/src/sys/classes/viewer/impls/vtk/vtkv.c >>>>> [0]PETSC ERROR: PetscViewerFlush() line 30 in >>>>> /home1/00924/Reddy135/LocalApps/pe >>>>> tsc/src/sys/classes/viewer/interface/flush.c >>>>> [0]PETSC ERROR: PetscViewerDestroy() line 100 in >>>>> /home1/00924/Reddy135/LocalApps >>>>> /petsc/src/sys/classes/viewer/interface/view.c >>>>> >>>>> -- >>>>> ----------------------------------------------------- >>>>> Dharmendar Reddy Palle >>>>> Graduate Student >>>>> Microelectronics Research center, >>>>> University of Texas at Austin, >>>>> 10100 Burnet Road, Bldg. 160 >>>>> MER 2.608F, TX 78758-4445 >>>>> e-mail: dharmareddy84 at gmail.com >>>>> Phone: +1-512-350-9082 >>>>> United States of America. >>>>> Homepage: https://webspace.utexas.edu/~dpr342 >>>>> >>>> >>>> >>>> >>>> -- >>>> ----------------------------------------------------- >>>> Dharmendar Reddy Palle >>>> Graduate Student >>>> Microelectronics Research center, >>>> University of Texas at Austin, >>>> 10100 Burnet Road, Bldg. 160 >>>> MER 2.608F, TX 78758-4445 >>>> e-mail: dharmareddy84 at gmail.com >>>> Phone: +1-512-350-9082 >>>> United States of America. >>>> Homepage: https://webspace.utexas.edu/~dpr342 >>>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> >> >> -- >> ----------------------------------------------------- >> Dharmendar Reddy Palle >> Graduate Student >> Microelectronics Research center, >> University of Texas at Austin, >> 10100 Burnet Road, Bldg. 160 >> MER 2.608F, TX 78758-4445 >> e-mail: dharmareddy84 at gmail.com >> Phone: +1-512-350-9082 >> United States of America. 
>> Homepage: https://webspace.utexas.edu/~dpr342
>>
>
> -- 
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: testPlexVTU.F90
Type: application/octet-stream
Size: 2197 bytes
Desc: not available
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: error.log
Type: application/octet-stream
Size: 9248 bytes
Desc: not available
URL: 

From hao.yu at peraglobal.com  Thu May 9 10:20:55 2013
From: hao.yu at peraglobal.com (Hao Yu)
Date: Thu, 9 May 2013 23:20:55 +0800
Subject: [petsc-users] Fwd: PETsc problem
In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn>
References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn>,
	<878v3opa43.fsf@mcs.anl.gov>,
	<6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn>
Message-ID: <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn>

Thanks!

So what can I do to compile if I want to use PETsc in VS2010?

I have tried using g++:

./configure --with-cxx=g++ --with-fc=0 --download-f2blaslapck --download-mpich

then

make PETSC_ARCH=arch-mswin-cxx-debug all
make PETSC_ARCH=arch-mswin-cxx-debug test

it shows:

Using PETSC_DIR=/petsc-3.3-p6 and PETSC_ARCH=arch-mswin-cxx-debug
/usr/bin/sh: line 20: 10916 Segmentation fault " some path here"(I omitted)
Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
See http://www.mcs.anl.gov/petsc/documentation/faq.html
/usr/bin/sh: line 20: 11780 Segmentation fault "some path here"(I omitted)
Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
See http://www.mcs.anl.gov/petsc/documentation/faq.html
Completed test examples

Hao
________________________________________
From: Jed Brown [five9a2 at gmail.com] on behalf of Jed Brown [jedbrown at mcs.anl.gov]
Sent: 2013-05-09 21:15
To: Hao Yu
Subject: Re: PETsc problem

Hao Yu writes:

> Dear Jed Brown,
>
> I am sorry to trouble you! I am a PhD student in the math dept. at U of Minnesota,
> currently using PETsc. I have a problem: it can't successfully
> compile in VS2010 after installation. Because I am using Windows,
> under cygwin I use:
>
> ./configure --with-cc=gcc -with-fc=0 --download-f2blaslapck
> --download-mpich make all test

You can't mix gcc here and VS2010 later because they are not compatible.

> then it completes, but after I include, for example, "petscksp.h",
> the error message after compiling:
>
> Error 4 error C3861: '__builtin_expect': identifier not found
> e:\cygwin\home\petsc-3.3-p6\include\petsclog.h 332 1 HelloFluids2

Configure tests whether the compiler supports __builtin_expect. There
are probably many other incompatibilities.

If you have any further questions, please send email to either
petsc-users at mcs.anl.gov or petsc-maint at mcs.anl.gov.
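To make the incompatibility concrete: the macro behind the C3861 error is wrapped in a configure-generated guard, along the lines of the sketch below. The guard name PETSC_HAVE_BUILTIN_EXPECT is an assumption here, not copied from petsclog.h:

    /* Sketch of the configure-guarded pattern behind the error above.
       Configure probes the *configured* compiler (gcc in this case) and,
       on success, defines the guard in the generated headers; compiling
       those same headers later with MSVC, which has no __builtin_expect,
       then fails with "identifier not found".  Guard name is assumed. */
    #if defined(PETSC_HAVE_BUILTIN_EXPECT)
    #define PetscUnlikely(cond) __builtin_expect(!!(cond), 0)
    #define PetscLikely(cond)   __builtin_expect(!!(cond), 1)
    #else
    #define PetscUnlikely(cond) (cond)
    #define PetscLikely(cond)   (cond)
    #endif

This is why PETSc must be configured and consumed with the same (or compatible) compilers.
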
From knepley at gmail.com Thu May 9 10:24:42 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 9 May 2013 10:24:42 -0500 Subject: [petsc-users] vtkviewer with cloned dm In-Reply-To: References: Message-ID: On Thu, May 9, 2013 at 10:17 AM, Dharmendar Reddy wrote: > Hello, > Looks like the fix created other issues. I am getting errors > from DMPlecCreateBoxMesh. I have attached the test code and error log > DMSetCoordinates() does not like NULL. I pushed the fix. Matt > On Thu, May 9, 2013 at 6:56 AM, Matthew Knepley wrote: > >> On Thu, May 9, 2013 at 6:52 AM, Dharmendar Reddy > > wrote: >> >>> >>> >>> >>> On Thu, May 9, 2013 at 6:46 AM, Matthew Knepley wrote: >>> >>>> On Thu, May 9, 2013 at 6:33 AM, Dharmendar Reddy < >>>> dharmareddy84 at gmail.com> wrote: >>>> >>>>> Hello, >>>>> Looks like the call >>>>> DMGetCoordinatesLocal(clonedDM,coords,ierr) is giving a null coords Vec. >>>>> DMView on clonedDM is printing the expected information >>>>> >>>>> Mesh in 2 dimensions: >>>>> 0-cells: 9 >>>>> 1-cells: 16 >>>>> 2-cells: 8 >>>>> Labels: >>>>> marker: 2 strata of sizes (16, 5) >>>>> depth: 3 strata of sizes (9, 16, 8) >>>>> >>>>> the original dm is created using DMPlexCreateBoxMesh. >>>>> >>>> >>>> The semantics for Clone have not been nailed down. I can copy over the >>>> coordinates. I am not sure it will make >>>> the release, but I will put it in next. >>>> >>>> That will help for now. On a related note, when the parent dm is >>> refined or distributed does cloned dm also get refined or distributed or >>> do we need to call the clone after refining or distributing the parent dm ? >>> >> >> No. Here the model is clear, and I think its the right one. Distribution, >> refinement, interpolation, etc. all create >> a new DM, thus any clones are pointing at the old DM. >> >> Matt >> >> >>> >>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> >>>>> On Wed, May 8, 2013 at 8:21 PM, Dharmendar Reddy < >>>>> dharmareddy84 at gmail.com> wrote: >>>>> >>>>>> Hello, >>>>>> I am seeing an error when i use vtkviewer with a cloned dm. >>>>>> Am i doing some thing worng ? Please see the attached test case. >>>>>> >>>>>> [0]PETSC ERROR: --------------------- Error Message >>>>>> ---------------------------- >>>>>> -------- >>>>>> [0]PETSC ERROR: Null argument, when expecting valid pointer! >>>>>> [0]PETSC ERROR: Null Object: Parameter # 1! >>>>>> [0]PETSC ERROR: >>>>>> ---------------------------------------------------------------- >>>>>> -------- >>>>>> [0]PETSC ERROR: Petsc Development GIT revision: >>>>>> 2ba5cda9c63f067922ac725686fc49a5 >>>>>> 014f5bad GIT Date: 2013-05-05 14:03:36 -0600 >>>>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>>>> [0]PETSC ERROR: >>>>>> ---------------------------------------------------------------- >>>>>> -------- >>>>>> [0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named >>>>>> login3.stampede.tacc.u >>>>>> texas.edu by Reddy135 Wed May 8 19:53:42 2013 >>>>>> [0]PETSC ERROR: Libraries linked from >>>>>> /home1/00924/Reddy135/LocalApps/petsc/mpi_ >>>>>> rScalar_Debug/lib >>>>>> [0]PETSC ERROR: Configure run at Sun May 5 23:15:58 2013 >>>>>> [0]PETSC ERROR: Configure options --download-blacs=1 >>>>>> --download-ctetgen=1 --do >>>>>> wnload-metis=1 --download-mumps=1 --download-parmetis=1 >>>>>> --download-scalapack=1 - >>>>>> -download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 >>>>>> --with-blas- >>>>>> lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ >>>>>> --with-deb >>>>>> ugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ >>>>>> --with-petsc-arch=mpi_rS >>>>>> calar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc >>>>>> PETSC_ARCH=mp >>>>>> i_rScalar_Debug >>>>>> [0]PETSC ERROR: >>>>>> ---------------------------------------------------------------- >>>>>> -------- >>>>>> [0]PETSC ERROR: VecGetArrayRead() line 1488 in >>>>>> /home1/00924/Reddy135/LocalApps/p >>>>>> etsc/src/vec/vec/interface/rvector.c >>>>>> [0]PETSC ERROR: DMPlexVTKWriteAll_VTU() line 300 in >>>>>> /home1/00924/Reddy135/LocalA >>>>>> pps/petsc/src/dm/impls/plex/plexvtu.c >>>>>> [0]PETSC ERROR: DMPlexVTKWriteAll() line 606 in >>>>>> /home1/00924/Reddy135/LocalApps/ >>>>>> petsc/src/dm/impls/plex/plexvtk.c >>>>>> [0]PETSC ERROR: PetscViewerFlush_VTK() line 78 in >>>>>> /home1/00924/Reddy135/LocalApp >>>>>> s/petsc/src/sys/classes/viewer/impls/vtk/vtkv.c >>>>>> [0]PETSC ERROR: PetscViewerFlush() line 30 in >>>>>> /home1/00924/Reddy135/LocalApps/pe >>>>>> tsc/src/sys/classes/viewer/interface/flush.c >>>>>> [0]PETSC ERROR: PetscViewerDestroy() line 100 in >>>>>> /home1/00924/Reddy135/LocalApps >>>>>> /petsc/src/sys/classes/viewer/interface/view.c >>>>>> >>>>>> -- >>>>>> ----------------------------------------------------- >>>>>> Dharmendar Reddy Palle >>>>>> Graduate Student >>>>>> Microelectronics Research center, >>>>>> University of Texas at Austin, >>>>>> 10100 Burnet Road, Bldg. 160 >>>>>> MER 2.608F, TX 78758-4445 >>>>>> e-mail: dharmareddy84 at gmail.com >>>>>> Phone: +1-512-350-9082 >>>>>> United States of America. >>>>>> Homepage: https://webspace.utexas.edu/~dpr342 >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> ----------------------------------------------------- >>>>> Dharmendar Reddy Palle >>>>> Graduate Student >>>>> Microelectronics Research center, >>>>> University of Texas at Austin, >>>>> 10100 Burnet Road, Bldg. 160 >>>>> MER 2.608F, TX 78758-4445 >>>>> e-mail: dharmareddy84 at gmail.com >>>>> Phone: +1-512-350-9082 >>>>> United States of America. >>>>> Homepage: https://webspace.utexas.edu/~dpr342 >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >>> >>> -- >>> ----------------------------------------------------- >>> Dharmendar Reddy Palle >>> Graduate Student >>> Microelectronics Research center, >>> University of Texas at Austin, >>> 10100 Burnet Road, Bldg. 160 >>> MER 2.608F, TX 78758-4445 >>> e-mail: dharmareddy84 at gmail.com >>> Phone: +1-512-350-9082 >>> United States of America. 
>>> Homepage: https://webspace.utexas.edu/~dpr342 >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > ----------------------------------------------------- > Dharmendar Reddy Palle > Graduate Student > Microelectronics Research center, > University of Texas at Austin, > 10100 Burnet Road, Bldg. 160 > MER 2.608F, TX 78758-4445 > e-mail: dharmareddy84 at gmail.com > Phone: +1-512-350-9082 > United States of America. > Homepage: https://webspace.utexas.edu/~dpr342 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Thu May 9 10:28:11 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 9 May 2013 10:28:11 -0500 (CDT) Subject: [petsc-users] =?gb2312?b?16q3ojogUEVUc2MgcHJvYmxlbQ==?= In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn>, <878v3opa43.fsf@mcs.anl.gov>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> Message-ID: On Thu, 9 May 2013, ?? wrote: > > Thanks! > > so what can I do to compile if I want to use PETsc in VS2010? You compile PETSc with MS C/C++ compiler [not gcc/g++]. Check the installation instructions https://www.mcs.anl.gov/petsc/documentation/installation.html#windows Notice: --with-cc='win32fe cl' etc.. Satish > > I have tried using g++: > > ./configure --with-cxx=g++ --with-fc=0 --download-f2blaslapck --download-mpich > > then > > make PETSC_ARCH=arch-mswin-cxx-debug all > make PETSC_ARCH=arch-mswin-cxx-debug test > > it shows: > > Using PETSC_DIR=/petsc-3.3-p6 and PETSC_ARCH=arch-mswin-cxx-debug > /usr/bin/sh: line 20: 10916 Segmentation fault " some path here"(I omitted) > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process > > See http://www.mcs.anlgov/petsc/documentation/faq.html > /usr/bin/sh: line 20: 11780 Segmentation fault "some path here"(I omitted) > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes > See http://www.mcs.anl.gov/petsc/documentation/faq.html > Completed test examples > > Hao > ________________________________________ > ???: Jed Brown [five9a2 at gmail.com] ?? Jed Brown [jedbrown at mcs.anl.gov] > ????: 2013?5?9? 21:15 > ???: ?? > ??: Re: PETsc problem > > ?? writes: > > > Dear Jed Brown, > > > > I am sorry to trouble you! I am a PhD in math dept. at U of Minnesota, > > currently using PETsc. I have a problem , it can't successfully > > compile in VS2010 after installation. Because I am using Windows, so > > under cygwin, I use: > > > > ./configure --with-cc=gcc -with-fc=0 --download-f2blaslapck > > --download-mpich make all test > > You can't mix gcc here and VS2010 later because they are not compatible. > > > then it completes, but after I include , for example, "petscksp.h", > > the error message after compling: > > > > Error 4 error C3861: '__builtin_expect': identifier not found > > e:\cygwin\home\petsc-3.3-p6\include\petsclog.h 332 1 HelloFluids2 > > Configure tests whether the compiler supports __builtin_expect. There > are probably many other incompatibilities. 
> If you have any further questions, please send email to either
> petsc-users at mcs.anl.gov or petsc-maint at mcs.anl.gov.
>

From bourdin at lsu.edu  Thu May 9 11:04:50 2013
From: bourdin at lsu.edu (Blaise A Bourdin)
Date: Thu, 9 May 2013 16:04:50 +0000
Subject: [petsc-users] Line search: Initial direction and size is 0
Message-ID: <506D87A5-45B3-4A6F-AA03-96A4B30CBFD6@lsu.edu>

Hi,

I am trying to solve a quadratic constrained optimization problem with SNESVI, i.e.
   F(v) < 0 if v < 0
   F(v) = 0 if 0 < v < 1
   F(v) > 0 if v > 1

I am reasonably sure that my function and Jacobian matrix evaluation are correct. In particular, SNES (i.e. unconstrained minimization) works just fine.
In some situations, I get the following error in snesvi when running with -snes_monitor -snes_linesearch_monitor:

  0 SNES Function norm 2.970161116676e+00
      Line search: Using full step: fnorm 2.970161116676e+00 gnorm 1.028739442283e-01
  1 SNES Function norm 1.028739442283e-01
      Line search: Using full step: fnorm 1.028739442283e-01 gnorm 6.789609435880e-08
  2 SNES Function norm 6.789609435880e-08
      Line search: Initial direction and size is 0

The output of snes_view is the following:
SNES Object:(V_) 24 MPI processes
  type: virs
  maximum iterations=50, maximum function evaluations=10000
  tolerances: relative=1e-08, absolute=1e-08, solution=1e-08
  total number of linear solver iterations=25
  total number of function evaluations=3
  KSP Object:  (V_)  24 MPI processes
    type: cg
    maximum iterations=10000, initial guess is zero
    tolerances: relative=1e-08, absolute=1e-08, divergence=10000
    left preconditioning
    using PRECONDITIONED norm type for convergence test
  PC Object:  (V_)  24 MPI processes
    type: bjacobi
      block Jacobi: number of blocks = 24
      Local solve is same for all blocks, in the following KSP and PC objects:
    KSP Object:  (V_sub_)  1 MPI processes
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using NONE norm type for convergence test
    PC Object:  (V_sub_)  1 MPI processes
      type: ilu
        ILU: out-of-place factorization
        0 levels of fill
        tolerance for zero pivot 2.22045e-14
        using diagonal shift to prevent zero pivot
        matrix ordering: natural
        factor fill ratio given 1, needed 1
          Factored matrix follows:
            Matrix Object:  1 MPI processes
              type: seqaij
              rows=0, cols=0
              package used to perform factorization: petsc
              total: nonzeros=0, allocated nonzeros=0
              total number of mallocs used during MatSetValues calls =0
                not using I-node routines
      linear system matrix = precond matrix:
      Matrix Object:  1 MPI processes
        type: seqaij
        rows=0, cols=0
        total: nonzeros=0, allocated nonzeros=0
        total number of mallocs used during MatSetValues calls =0
          not using I-node routines
    linear system matrix = precond matrix:
    Matrix Object:  24 MPI processes
      type: mpiaij
      rows=1890, cols=1890
      total: nonzeros=42208, allocated nonzeros=42208
      total number of mallocs used during MatSetValues calls =0
        not using I-node (on process 0) routines
  SNESLineSearch Object:  (V_)  24 MPI processes
    type: bt
      interpolation: cubic
      alpha=1.000000e-04
    maxstep=1.000000e+08, minlambda=1.000000e-12
    tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08
    maximum iterations=40

I really don't know where to start digging. Any suggestion?

Blaise

-- 
Department of Mathematics and Center for Computation & Technology
Louisiana State University, Baton Rouge, LA 70803, USA
Tel. +1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin
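For context, a minimal sketch of setting up a bound-constrained solve like this one (bounds 0 <= v <= 1, reduced-space VI as in the snes_view above) could look as follows in C; the variable names and the explicit SNESSetType call are illustrative, not taken from the actual code:

    /* Sketch: solve F(v) = 0 subject to 0 <= v <= 1 with SNESVI.
       Assumes snes and the solution vector v already exist, with the
       residual/Jacobian callbacks attached. */
    Vec xl, xu;
    ierr = VecDuplicate(v, &xl);CHKERRQ(ierr);
    ierr = VecDuplicate(v, &xu);CHKERRQ(ierr);
    ierr = VecSet(xl, 0.0);CHKERRQ(ierr);              /* lower bound */
    ierr = VecSet(xu, 1.0);CHKERRQ(ierr);              /* upper bound */
    ierr = SNESSetType(snes, "virs");CHKERRQ(ierr);    /* i.e. -snes_type virs */
    ierr = SNESVISetVariableBounds(snes, xl, xu);CHKERRQ(ierr);
    ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);
    ierr = SNESSolve(snes, PETSC_NULL, v);CHKERRQ(ierr);
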
From dharmareddy84 at gmail.com  Thu May 9 11:11:02 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Thu, 9 May 2013 11:11:02 -0500
Subject: [petsc-users] plexvtk viewer
Message-ID: 

Hello,
      I am getting an error when I try to VecView for dofs defined on
nodes and cell centers. I find this error message strange: the dofs
belong to different fields, so why is it considered a mixed type object?
It works if I use fields with only nodal or only cell dofs.

Is it possible to support a case where I have a few fields which are
defined on nodes, and a few which are defined on cells (these are usually
auxiliary variables providing information such as material parameters)?
A test case is attached.

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: No support for this operation for this object type!
[0]PETSC ERROR: No support for viewing mixed space with dofs at both vertices and cells!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development GIT revision: ba3d6699fe16b82fc53230ec60dd42b2a6c4d965 GIT Date: 2013-05-09 07:57:34 -0500
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named login1.stampede.tacc.utexas.edu by Reddy135 Thu May 9 11:04:50 2013
[0]PETSC ERROR: Libraries linked from /home1/00924/Reddy135/LocalApps/petsc/mpi_rScalar_Debug/lib
[0]PETSC ERROR: Configure run at Thu May 9 08:52:27 2013
[0]PETSC ERROR: Configure options --download-blacs=1 --download-ctetgen=1 --download-metis=1 --download-mumps=1 --download-parmetis=1 --download-scalapack=1 --download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 --with-blas-lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ --with-debugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ --with-petsc-arch=mpi_rScalar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc PETSC_ARCH=mpi_rScalar_Debug
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: VecView_Plex_Local() line 42 in /home1/00924/Reddy135/LocalApps/petsc/src/dm/impls/plex/plex.c
[0]PETSC ERROR: VecView() line 706 in /home1/00924/Reddy135/LocalApps/petsc/src/vec/vec/interface/vector.c
login1$

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: testPlexVTU.F90
Type: application/octet-stream
Size: 2204 bytes
Desc: not available
URL: 
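Until mixed vertex/cell viewing is supported, one possible workaround (a sketch under the assumption that the fields can be split into separate sections; names like dmCell and secCell are hypothetical, not from the test case) is to give the cell-centered auxiliary fields their own clone of the DM and view each vector separately:

    /* Sketch: vertex dofs and cell dofs live on separate clones, so each
       VecView sees a single-location section and the VTK viewer is happy. */
    DM  dmVert, dmCell;
    Vec uVert, uCell;
    ierr = DMClone(dm, &dmVert);CHKERRQ(ierr);
    ierr = DMClone(dm, &dmCell);CHKERRQ(ierr);
    ierr = DMSetDefaultSection(dmVert, secVert);CHKERRQ(ierr); /* dofs on vertices only */
    ierr = DMSetDefaultSection(dmCell, secCell);CHKERRQ(ierr); /* dofs on cells only */
    ierr = DMCreateLocalVector(dmVert, &uVert);CHKERRQ(ierr);
    ierr = DMCreateLocalVector(dmCell, &uCell);CHKERRQ(ierr);
    /* ... fill uVert/uCell, then write each through its own VTK viewer ... */
    ierr = VecView(uVert, viewerVert);CHKERRQ(ierr);
    ierr = VecView(uCell, viewerCell);CHKERRQ(ierr);
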
From dharmareddy84 at gmail.com  Thu May 9 13:45:55 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Thu, 9 May 2013 13:45:55 -0500
Subject: [petsc-users] Fortran binding for dmgetdefualtsection
Message-ID: 

Hello,
        Can you please add a fortran binding for dmgetdefualtsection?

Thanks
Reddy

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov  Thu May 9 14:58:04 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 9 May 2013 14:58:04 -0500
Subject: [petsc-users] Line search: Initial direction and size is 0
In-Reply-To: <506D87A5-45B3-4A6F-AA03-96A4B30CBFD6@lsu.edu>
References: <506D87A5-45B3-4A6F-AA03-96A4B30CBFD6@lsu.edu>
Message-ID: 

  Blaise,

    First see what the linear solver is doing. Run with -ksp_monitor_true_residual -ksp_converged_reason

    Also run with -snes_vi_monitor to see how much of the solution is on the constraints (for example all of them?).

    Without thinking about it much I don't think this is supposed to happen; I think with the reduced space method the search direction can only be exactly 0 when the function norm is exactly 0.

   Barry

On May 9, 2013, at 11:04 AM, Blaise A Bourdin wrote:

> Hi,
> 
> I am trying to solve a quadratic constrained optimization problem with SNESVI, i.e.
> F(v) < 0 if v < 0
> F(v) = 0 if 0 < v < 1
> F(v) > 0 if v > 1
> 
> I am reasonably sure that my function and Jacobian matrix evaluation are correct. In particular, SNES (i.e. unconstrained minimization) works just fine.
> In some situation, I get the following error in snesvi when running with -snes_monitor -snes_linesearch_monitor: > > 0 SNES Function norm 2.970161116676e+00 > Line search: Using full step: fnorm 2.970161116676e+00 gnorm 1.028739442283e-01 > 1 SNES Function norm 1.028739442283e-01 > Line search: Using full step: fnorm 1.028739442283e-01 gnorm 6.789609435880e-08 > 2 SNES Function norm 6.789609435880e-08 > Line search: Initial direction and size is 0 > > The output of snes_view is the following: > SNES Object:(V_) 24 MPI processes > type: virs > maximum iterations=50, maximum function evaluations=10000 > tolerances: relative=1e-08, absolute=1e-08, solution=1e-08 > total number of linear solver iterations=25 > total number of function evaluations=3 > KSP Object: (V_) 24 MPI processes > type: cg > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-08, absolute=1e-08, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (V_) 24 MPI processes > type: bjacobi > block Jacobi: number of blocks = 24 > Local solve is same for all blocks, in the following KSP and PC objects: > KSP Object: (V_sub_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > using NONE norm type for convergence test > PC Object: (V_sub_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > using diagonal shift to prevent zero pivot > matrix ordering: natural > factor fill ratio given 1, needed 1 > Factored matrix follows: > Matrix Object: 1 MPI processes > type: seqaij > rows=0, cols=0 > package used to perform factorization: petsc > total: nonzeros=0, allocated nonzeros=0 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Matrix Object: 1 MPI processes > type: seqaij > rows=0, cols=0 > total: nonzeros=0, allocated nonzeros=0 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Matrix Object: 24 MPI processes > type: mpiaij > rows=1890, cols=1890 > total: nonzeros=42208, allocated nonzeros=42208 > total number of mallocs used during MatSetValues calls =0 > not using I-node (on process 0) routines > SNESLineSearch Object: (V_) 24 MPI processes > type: bt > interpolation: cubic > alpha=1.000000e-04 > maxstep=1.000000e+08, minlambda=1.000000e-12 > tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08 > maximum iterations=40 > > > I really don't know where to start digging. Any suggestion? > > Blaise > > -- > Department of Mathematics and Center for Computation & Technology > Louisiana State University, Baton Rouge, LA 70803, USA > Tel. +1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin > > > > > > > > From knepley at gmail.com Thu May 9 16:06:53 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 9 May 2013 16:06:53 -0500 Subject: [petsc-users] Fortran binding for dmgetdefualtsection In-Reply-To: References: Message-ID: On Thu, May 9, 2013 at 1:45 PM, Dharmendar Reddy wrote: > Hello, > Can you please add fortran binding for dmgetdefualtsection > Its already there. 
Thanks,

   Matt

> Thanks
> Reddy
>
> -- 
> -----------------------------------------------------
> Dharmendar Reddy Palle
> Graduate Student
> Microelectronics Research center,
> University of Texas at Austin,
> 10100 Burnet Road, Bldg. 160
> MER 2.608F, TX 78758-4445
> e-mail: dharmareddy84 at gmail.com
> Phone: +1-512-350-9082
> United States of America.
> Homepage: https://webspace.utexas.edu/~dpr342
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From dharmareddy84 at gmail.com  Thu May 9 16:16:11 2013
From: dharmareddy84 at gmail.com (Dharmendar Reddy)
Date: Thu, 9 May 2013 16:16:11 -0500
Subject: [petsc-users] Fortran binding for dmgetdefualtsection
In-Reply-To: 
References: 
Message-ID: 

Sorry, my bad... I typed defualt instead of default :P

On Thu, May 9, 2013 at 4:06 PM, Matthew Knepley wrote:

> On Thu, May 9, 2013 at 1:45 PM, Dharmendar Reddy wrote:
>
>> Hello,
>>         Can you please add a fortran binding for dmgetdefualtsection?
>>
>
> Its already there.
>
>   Thanks,
>
>      Matt
>
>> Thanks
>> Reddy
>>
>> -- 
>> -----------------------------------------------------
>> Dharmendar Reddy Palle
>> Graduate Student
>> Microelectronics Research center,
>> University of Texas at Austin,
>> 10100 Burnet Road, Bldg. 160
>> MER 2.608F, TX 78758-4445
>> e-mail: dharmareddy84 at gmail.com
>> Phone: +1-512-350-9082
>> United States of America.
>> Homepage: https://webspace.utexas.edu/~dpr342
>>
>
> -- 
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>

-- 
-----------------------------------------------------
Dharmendar Reddy Palle
Graduate Student
Microelectronics Research center,
University of Texas at Austin,
10100 Burnet Road, Bldg. 160
MER 2.608F, TX 78758-4445
e-mail: dharmareddy84 at gmail.com
Phone: +1-512-350-9082
United States of America.
Homepage: https://webspace.utexas.edu/~dpr342
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov  Thu May 9 16:35:38 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 9 May 2013 16:35:38 -0500
Subject: [petsc-users] Line search: Initial direction and size is 0
In-Reply-To: <3F298A03-DBF0-4612-A1E8-391FB685A843@lsu.edu>
References: <506D87A5-45B3-4A6F-AA03-96A4B30CBFD6@lsu.edu> <3F298A03-DBF0-4612-A1E8-391FB685A843@lsu.edu>
Message-ID: <53956BC3-6516-4D03-9644-8CA694C3C879@mcs.anl.gov>

   It kicked out after zero iterations hence the linear solve solution was exactly 0 hence no search direction. Use a smaller -ksp_atol

     Residual norms for V_ solve.
     0 KSP preconditioned resid norm 8.187008817208e-09 true resid norm 6.789609494342e-08 ||r(i)||/||b|| 1.000000000000e+00
   Linear solve converged due to CONVERGED_ATOL iterations 0
       Line search: Initial direction and size is 0

On May 9, 2013, at 3:41 PM, Blaise A Bourdin wrote:

> Hi Barry,
> 
>> First see what the linear solver is doing. Run with -ksp_monitor_true_residual -ksp_converged_reason
>>
> 
> Here is the output.
> 
> Solving for V     0 SNES Function norm 2.970161108459e+00 
>     Residual norms for V_ solve.
>     0 KSP Residual norm 2.036472233104e+00 
>     Residual norms for V_ solve.
> 0 KSP preconditioned resid norm 2.036472233104e+00 true resid norm 2.970161108459e+00 ||r(i)||/||b|| 1.000000000000e+00 > 1 KSP Residual norm 3.362201008589e-01 > 1 KSP preconditioned resid norm 3.362201008589e-01 true resid norm 1.081901225389e+00 ||r(i)||/||b|| 3.642567476585e-01 > 2 KSP Residual norm 8.676011976897e-02 > 2 KSP preconditioned resid norm 8.676011976897e-02 true resid norm 5.628067688773e-01 ||r(i)||/||b|| 1.894869498070e-01 > 3 KSP Residual norm 2.354970897728e-02 > 3 KSP preconditioned resid norm 2.354970897728e-02 true resid norm 2.268465596192e-01 ||r(i)||/||b|| 7.637517001120e-02 > 4 KSP Residual norm 8.856864873012e-03 > 4 KSP preconditioned resid norm 8.856864873012e-03 true resid norm 3.472847912504e-02 ||r(i)||/||b|| 1.169245635401e-02 > 5 KSP Residual norm 2.117375395242e-03 > 5 KSP preconditioned resid norm 2.117375395242e-03 true resid norm 1.074321390110e-02 ||r(i)||/||b|| 3.617047530015e-03 > 6 KSP Residual norm 6.111534144586e-04 > 6 KSP preconditioned resid norm 6.111534144586e-04 true resid norm 5.929550196977e-03 ||r(i)||/||b|| 1.996373253993e-03 > 7 KSP Residual norm 1.309512944075e-04 > 7 KSP preconditioned resid norm 1.309512944075e-04 true resid norm 1.486838362166e-03 ||r(i)||/||b|| 5.005918224205e-04 > 8 KSP Residual norm 4.062133496811e-05 > 8 KSP preconditioned resid norm 4.062133496811e-05 true resid norm 5.161567698818e-04 ||r(i)||/||b|| 1.737807314262e-04 > 9 KSP Residual norm 1.698745004540e-05 > 9 KSP preconditioned resid norm 1.698745004540e-05 true resid norm 2.239276637988e-04 ||r(i)||/||b|| 7.539243011466e-05 > 10 KSP Residual norm 6.636124730372e-06 > 10 KSP preconditioned resid norm 6.636124730372e-06 true resid norm 6.506381575115e-05 ||r(i)||/||b|| 2.190582038323e-05 > 11 KSP Residual norm 2.781214911941e-06 > 11 KSP preconditioned resid norm 2.781214911941e-06 true resid norm 2.540790192046e-05 ||r(i)||/||b|| 8.554385096516e-06 > 12 KSP Residual norm 6.742783595506e-07 > 12 KSP preconditioned resid norm 6.742783595506e-07 true resid norm 4.447795888297e-06 ||r(i)||/||b|| 1.497493141240e-06 > 13 KSP Residual norm 1.576207197909e-07 > 13 KSP preconditioned resid norm 1.576207197909e-07 true resid norm 1.565294744046e-06 ||r(i)||/||b|| 5.270066797347e-07 > 14 KSP Residual norm 4.693732104006e-08 > 14 KSP preconditioned resid norm 4.693732104006e-08 true resid norm 4.050537753372e-07 ||r(i)||/||b|| 1.363743448743e-07 > 15 KSP Residual norm 1.512651153334e-08 > 15 KSP preconditioned resid norm 1.512651153334e-08 true resid norm 1.188769631688e-07 ||r(i)||/||b|| 4.002374242603e-08 > Linear solve converged due to CONVERGED_RTOL iterations 15 > Line search: Using full step: fnorm 2.970161108459e+00 gnorm 1.028739450303e-01 > 1 SNES Function norm 1.028739450303e-01 > Residual norms for V_ solve. > 0 KSP Residual norm 2.222817828717e-02 > Residual norms for V_ solve. 
> 0 KSP preconditioned resid norm 2.222817828717e-02 true resid norm 1.028739450303e-01 ||r(i)||/||b|| 1.000000000000e+00 > 1 KSP Residual norm 3.859290459225e-03 > 1 KSP preconditioned resid norm 3.859290459225e-03 true resid norm 9.004748973250e-03 ||r(i)||/||b|| 8.753187185149e-02 > 2 KSP Residual norm 1.009944701445e-03 > 2 KSP preconditioned resid norm 1.009944701445e-03 true resid norm 4.110149331593e-03 ||r(i)||/||b|| 3.995325862523e-02 > 3 KSP Residual norm 2.025885581425e-04 > 3 KSP preconditioned resid norm 2.025885581425e-04 true resid norm 4.988762164519e-04 ||r(i)||/||b|| 4.849393267702e-03 > 4 KSP Residual norm 4.335626650617e-05 > 4 KSP preconditioned resid norm 4.335626650617e-05 true resid norm 1.989191475913e-04 ||r(i)||/||b|| 1.933620291637e-03 > 5 KSP Residual norm 1.087258672326e-05 > 5 KSP preconditioned resid norm 1.087258672326e-05 true resid norm 4.241101136270e-05 ||r(i)||/||b|| 4.122619323115e-04 > 6 KSP Residual norm 2.484893638537e-06 > 6 KSP preconditioned resid norm 2.484893638537e-06 true resid norm 1.579812863788e-05 ||r(i)||/||b|| 1.535678313224e-04 > 7 KSP Residual norm 5.757478032991e-07 > 7 KSP preconditioned resid norm 5.757478032991e-07 true resid norm 1.504224487980e-06 ||r(i)||/||b|| 1.462201617268e-05 > 8 KSP Residual norm 1.374813388713e-07 > 8 KSP preconditioned resid norm 1.374813388713e-07 true resid norm 5.490692200205e-07 ||r(i)||/||b|| 5.337301100476e-06 > 9 KSP Residual norm 2.242235726481e-08 > 9 KSP preconditioned resid norm 2.242235726481e-08 true resid norm 2.387428492390e-07 ||r(i)||/||b|| 2.320731932354e-06 > 10 KSP Residual norm 8.187008611740e-09 > 10 KSP preconditioned resid norm 8.187008611740e-09 true resid norm 6.789609492893e-08 ||r(i)||/||b|| 6.599931101012e-07 > Linear solve converged due to CONVERGED_ATOL iterations 10 > Line search: Using full step: fnorm 1.028739450303e-01 gnorm 6.789609494342e-08 > 2 SNES Function norm 6.789609494342e-08 > Residual norms for V_ solve. > 0 KSP Residual norm 8.187008817208e-09 > Residual norms for V_ solve. > 0 KSP preconditioned resid norm 8.187008817208e-09 true resid norm 6.789609494342e-08 ||r(i)||/||b|| 1.000000000000e+00 > Linear solve converged due to CONVERGED_ATOL iterations 0 > Line search: Initial direction and size is 0 > [ERROR] snesV diverged with reason -6 > V min / max: -2.072055e-03 3.523902e+05 > >> Also run with -snes_vi_monitor to see how much of the solution is on the constraints (for example all of them?). > I am running pets-3.3 for this project and there does not seem to be an option -snes_vi_monitor. Is it a pets-dev new thing? If so, I can try to port the code to petsc-dev > From the problem, the constraints should be active at the vast majority of vertices (think of the cahn-hilliard example where the constraints are active at all vertices except in a transition zone between phases). Actually, if one would do a first snes solve without any bounds, I am quite sure that for this problem, one of the bounds would be active at each vertex. > > >> >> Without thinking about it much I don't think this is suppose to happen; I think with the reduced space method the search direction can only be exactly 0 when the function norm is exactly 0. > Sounds reasonable... > Blaise > > -- > Department of Mathematics and Center for Computation & Technology > Louisiana State University, Baton Rouge, LA 70803, USA > Tel. 
+1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin > > > > > > > >
From knepley at gmail.com Thu May 9 17:44:21 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 9 May 2013 17:44:21 -0500 Subject: [petsc-users] plexvtk viewer In-Reply-To: References: Message-ID: On Thu, May 9, 2013 at 11:11 AM, Dharmendar Reddy wrote: > Hello, > I am getting an error when I try to VecView for dofs defined on > nodes and cell centers. I find this error message strange, as the dofs > belong to different fields; why is it considered a mixed type object? It > works if I use fields with only nodal or only cell dofs. > > Is it possible to support a case where I have a few fields which are > defined on nodes, and a few which are defined on cells (these are usually > auxiliary variables providing information such as material parameters)? > Attached is a test case. > This may be possible. You would have to change the PetscViewerVTKFieldType to be bitflags. After that, I am not sure if anything would break in the routine. I don't have much time right now, but I will look at it later if no one else gets to it. Thanks, Matt > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: No support for this operation for this object type! > [0]PETSC ERROR: No support for viewing mixed space with dofs at both vertices and cells! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Development GIT revision: ba3d6699fe16b82fc53230ec60dd42b2a6c4d965 GIT Date: 2013-05-09 07:57:34 -0500 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: ./testDMView on a mpi_rScalar_Debug named login1.stampede.tacc.utexas.edu by Reddy135 Thu May 9 11:04:50 2013 > [0]PETSC ERROR: Libraries linked from /home1/00924/Reddy135/LocalApps/petsc/mpi_rScalar_Debug/lib > [0]PETSC ERROR: Configure run at Thu May 9 08:52:27 2013 > [0]PETSC ERROR: Configure options --download-blacs=1 --download-ctetgen=1 --download-metis=1 --download-mumps=1 --download-parmetis=1 --download-scalapack=1 --download-superlu_dist=1 --download-triangle=1 --download-umfpack=1 --with-blas-lapack-dir=/opt/apps/intel/13/composer_xe_2013.2.146/mkl/lib/intel64/ --with-debugging=1 --with-mpi-dir=/opt/apps/intel13/mvapich2/1.9/ --with-petsc-arch=mpi_rScalar_Debug --with-petsc-dir=/home1/00924/Reddy135/LocalApps/petsc PETSC_ARCH=mpi_rScalar_Debug > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: VecView_Plex_Local() line 42 in /home1/00924/Reddy135/LocalApps/petsc/src/dm/impls/plex/plex.c > [0]PETSC ERROR: VecView() line 706 in /home1/00924/Reddy135/LocalApps/petsc/src/vec/vec/interface/vector.c > login1$ > > > -- > ----------------------------------------------------- > Dharmendar Reddy Palle > Graduate Student > Microelectronics Research center, > University of Texas at Austin, > 10100 Burnet Road, Bldg. 160 > MER 2.608F, TX 78758-4445 > e-mail: dharmareddy84 at gmail.com > Phone: +1-512-350-9082 > United States of America.
> Homepage: https://webspace.utexas.edu/~dpr342 > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL:
From hao.yu at peraglobal.com Fri May 10 02:08:41 2013 From: hao.yu at peraglobal.com (Hao Yu) Date: Fri, 10 May 2013 15:08:41 +0800 Subject: [petsc-users] Re: Fwd: PETsc problem In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn>, <878v3opa43.fsf@mcs.anl.gov>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn>, Message-ID: <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> it shows that 'win32fe cl' does not work. I don't know what the problem is. ________________________________________ From: Satish Balay [balay at mcs.anl.gov] Sent: 2013-05-09 23:28 To: Hao Yu Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] Fwd: PETsc problem On Thu, 9 May 2013, Hao Yu wrote: > > Thanks! > > so what can I do to compile if I want to use PETsc in VS2010? You compile PETSc with the MS C/C++ compiler [not gcc/g++]. Check the installation instructions https://www.mcs.anl.gov/petsc/documentation/installation.html#windows Notice: --with-cc='win32fe cl' etc.. Satish > > I have tried using g++: > > ./configure --with-cxx=g++ --with-fc=0 --download-f2blaslapck --download-mpich > > then > > make PETSC_ARCH=arch-mswin-cxx-debug all > make PETSC_ARCH=arch-mswin-cxx-debug test > > it shows: > > Using PETSC_DIR=/petsc-3.3-p6 and PETSC_ARCH=arch-mswin-cxx-debug > /usr/bin/sh: line 20: 10916 Segmentation fault " some path here"(I omitted) > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process > > See http://www.mcs.anl.gov/petsc/documentation/faq.html > /usr/bin/sh: line 20: 11780 Segmentation fault "some path here"(I omitted) > Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes > See http://www.mcs.anl.gov/petsc/documentation/faq.html > Completed test examples > > Hao > ________________________________________ > From: Jed Brown [five9a2 at gmail.com] on behalf of Jed Brown [jedbrown at mcs.anl.gov] > Sent: 2013-05-09 21:15 > To: Hao Yu > Subject: Re: PETsc problem > > Hao Yu writes: > > > Dear Jed Brown, > > > > I am sorry to trouble you! I am a PhD student in the math dept. at U of Minnesota, > > currently using PETsc. I have a problem: it can't successfully > > compile in VS2010 after installation. Because I am using Windows, so > > under cygwin, I use: > > > > ./configure --with-cc=gcc -with-fc=0 --download-f2blaslapck > > --download-mpich make all test > > You can't mix gcc here and VS2010 later because they are not compatible. > > > then it completes, but after I include, for example, "petscksp.h", > > the error message after compiling: > > > > Error 4 error C3861: '__builtin_expect': identifier not found > > e:\cygwin\home\petsc-3.3-p6\include\petsclog.h 332 1 HelloFluids2 > > Configure tests whether the compiler supports __builtin_expect. There > are probably many other incompatibilities. > > If you have any further questions, please send email to either > petsc-users at mcs.anl.gov or petsc-maint at mcs.anl.gov.
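To make Jed's last point concrete: configure only defines the corresponding HAVE macro when a test program using the builtin compiles, and the public headers fall back to a plain expression otherwise. A minimal sketch of that guard pattern (the macro names follow petscsys.h as far as I recall; treat the exact spellings as assumptions):

    /* Defined by configure only if the C compiler accepted __builtin_expect;
       cl (MSVC) does not have it, which is exactly the C3861 error above. */
    #if defined(PETSC_HAVE_BUILTIN_EXPECT)
    #  define PetscLikely(cond)   __builtin_expect(!!(cond),1)
    #  define PetscUnlikely(cond) __builtin_expect(!!(cond),0)
    #else
    #  define PetscLikely(cond)   (cond)
    #  define PetscUnlikely(cond) (cond)
    #endif

This is why a tree configured with gcc cannot simply be compiled from VS2010: the generated petscconf.h records gcc's capabilities, not cl's.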
From jedbrown at mcs.anl.gov Fri May 10 02:16:49 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 10 May 2013 02:16:49 -0500 Subject: [petsc-users] Re: Fwd: PETsc problem In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> Message-ID: <87vc6rjoda.fsf@mcs.anl.gov> Hao Yu writes: > it shows that 'win32fe cl' does not work. I don't know what the problem is. You couldn't have found a less helpful way to report this. Note the bold part: http://www.mcs.anl.gov/petsc/documentation/bugreporting.html
From jedbrown at mcs.anl.gov Fri May 10 02:48:14 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 10 May 2013 02:48:14 -0500 Subject: [petsc-users] plexvtk viewer In-Reply-To: References: Message-ID: <87ip2rjmwx.fsf@mcs.anl.gov> Matthew Knepley writes: > On Thu, May 9, 2013 at 11:11 AM, Dharmendar Reddy > wrote: > >> Hello, >> I am getting an error when I try to VecView for dofs defined on >> nodes and cell centers. I find this error message strange, as the dofs >> belong to different fields; why is it considered a mixed type object? It >> works if I use fields with only nodal or only cell dofs. >> >> Is it possible to support a case where I have a few fields which are >> defined on nodes, and a few which are defined on cells (these are usually >> auxiliary variables providing information such as material parameters)? >> Attached is a test case. >> > > This may be possible. You would have to change the PetscViewerVTKFieldType > to be bitflags. After that, > I am not sure if anything would break in the routine. I don't have much > time right now, but I will look at it > later if no one else gets to it. We just need to classify _per field_ rather than per vector. I think all the viewing code will work if we have one PetscViewerVTKFieldType per field.
From fsantost at student.ethz.ch Fri May 10 04:55:00 2013 From: fsantost at student.ethz.ch (Santos Teixeira Frederico) Date: Fri, 10 May 2013 09:55:00 +0000 Subject: [petsc-users] Does MUMPS check the nonzero pattern for each iteration? Message-ID: <682CC3CD7A208742B8C2D116C67199010B172ED5@MBX13.d.ethz.ch> Hi, I am using PETSc with MUMPS and, based on the time required for each iteration (it is almost the same amount of time), I suspect MUMPS is not using the information about the nonzero pattern. My question is: how do I make sure it is correctly configured? I am already setting "KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN)". Thanks in advance for any tip! Regards, Frederico.
From hao.yu at peraglobal.com Fri May 10 06:20:38 2013 From: hao.yu at peraglobal.com (Hao Yu) Date: Fri, 10 May 2013 19:20:38 +0800 Subject: [petsc-users] Re: Re: Fwd: PETsc problem In-Reply-To: <87vc6rjoda.fsf@mcs.anl.gov> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn>, <87vc6rjoda.fsf@mcs.anl.gov> Message-ID: <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> The configure.log and make.log are attached.
it shows that 'win32fe cl' does not work. I don't know what the problem is. Thanks! Hao ________________________________________ From: Jed Brown [five9a2 at gmail.com] on behalf of Jed Brown [jedbrown at mcs.anl.gov] Sent: 2013-05-10 15:16 To: Hao Yu; petsc-users Subject: Re: [petsc-users] Re: Fwd: PETsc problem Hao Yu writes: > it shows that 'win32fe cl' does not work. I don't know what the problem is. You couldn't have found a less helpful way to report this. Note the bold part: http://www.mcs.anl.gov/petsc/documentation/bugreporting.html -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 49474 bytes Desc: configure.log URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: make.log Type: application/octet-stream Size: 78 bytes Desc: make.log URL:
From jedbrown at mcs.anl.gov Fri May 10 06:38:15 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 10 May 2013 06:38:15 -0500 Subject: [petsc-users] Does MUMPS check the nonzero pattern for each iteration? In-Reply-To: <682CC3CD7A208742B8C2D116C67199010B172ED5@MBX13.d.ethz.ch> References: <682CC3CD7A208742B8C2D116C67199010B172ED5@MBX13.d.ethz.ch> Message-ID: <874nebjc9k.fsf@mcs.anl.gov> Santos Teixeira Frederico writes: > Hi, > > I am using PETSc with MUMPS and, based on the time required for each > iteration (it is almost the same amount of time), I suspect MUMPS is > not using the information about the nonzero pattern. My question is: > how do I make sure it is correctly configured? I am already setting > "KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN)". Profile, don't speculate. If you see something like this, then symbolic factorization is indeed reused. MatLUFactorSym 1 1.0 5.0068e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLUFactorNum 3 1.0 1.2076e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 4 0 0 0 0 4 0 0 0 0 0 The vast majority of the work is in numeric factorization. MUMPS has a sequential stage in symbolic factorization so symbolic can be significant at high core count, but otherwise, it's negligible.
From knepley at gmail.com Fri May 10 06:55:20 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 10 May 2013 06:55:20 -0500 Subject: [petsc-users] Re: Re: Fwd: PETsc problem In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> Message-ID: On Fri, May 10, 2013 at 6:20 AM, Hao Yu wrote: > > The configure.log and make.log are attached. > > it shows that 'win32fe cl' does not work. I don't know what the problem > is. > Can you compile something with it? It looks like it is not installed correctly. Matt > Thanks! > > > > Hao > ________________________________________ > From: Jed Brown [five9a2 at gmail.com] on behalf of Jed Brown [jedbrown at mcs.anl.gov] > Sent: 2013-05-10 15:16 > To: Hao Yu; petsc-users > Subject: Re: [petsc-users] Re: Fwd: PETsc problem > > Hao Yu writes: > > > it shows that 'win32fe cl' does not work. I don't know what the > problem is. > > You couldn't have found a less helpful way to report this.
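Returning for a moment to the MUMPS thread above: the behavior Jed describes can be exercised with a few lines. A hedged sketch (petsc-3.3-era calling convention; assumes ksp was set up with -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps):

    ierr = KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);  /* symbolic + numeric factorization */
    /* ... modify the values of A without changing its nonzero pattern ... */
    ierr = KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);  /* numeric factorization only */

Running with -log_summary and checking that the MatLUFactorSym count stays at 1 while MatLUFactorNum grows, as in Jed's snippet, confirms the symbolic factorization is being reused.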
> > Note the bold part: > > http://www.mcs.anl.gov/petsc/documentation/bugreporting.html > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhenglun.wei at gmail.com Fri May 10 08:15:52 2013 From: zhenglun.wei at gmail.com (Alan) Date: Fri, 10 May 2013 08:15:52 -0500 Subject: [petsc-users] CG or GMRES In-Reply-To: <87k3n9ov7u.fsf@mcs.anl.gov> References: <518ADF34.9010109@gmail.com> <87k3n9ov7u.fsf@mcs.anl.gov> Message-ID: <518CF308.703@gmail.com> Thank you so much, Dr. Brown. I have a minor question on the 'gamg'. As you said, 'gamg' works for many moderately non-symmetric problems. Does this apply for general algebraic MG preconditioner or just 'gamg' in PETSc. As you know, does 'BoomerAMG' suffer from the non-symmetric matrices problem? Should we only use regular MG as the preconditioner for highly non-symmetric problems? thanks, Alan > "Zhenglun (Alan) Wei" writes: > >> Dear folks, >> I hope you're having a nice day. >> For the Poisson solver in /src/ksp/ksp/example/tutorial/ex45.c, I used >> the ksp_type = CG to solve it before; it converges very fast with >> pc_type = gamg. However, I was trying to check if the matrix generated >> by the 'ComputeMatrix' is symmetric by using "ierr = MatIsSymmetric(B, >> tol, &flg);". It shows that this matrix is not exact a symmetric one by >> setting tol = 0.0. Yet, the matrix is 'symmetric' if the tol > 0.01. > The matrix does not enforce boundary conditions symmetrically. > >> Does this mean that, even if the matrix is not exact symmetric, the CG >> could still be used. > You happen to be iterating in a "benign" space in which the operator is SPD. > >> This brings me a question. Can the CG be used to solve an actual >> unsymmetric matrix as long as 'MatIsSymmetric' returns a 'PETSC_TRUE' >> value with certain tolerance. > No. > >> Is there any rule of thumb for this tolerence? Also, as a >> preconditioner, does 'gamg' only work for symmetric positive-definite >> matrix? or it works for any matrix or even with GMRES? > It works for many moderately non-symmetric, certainly for something that only > has non-symmetric boundary conditions. From knepley at gmail.com Fri May 10 08:21:08 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 10 May 2013 08:21:08 -0500 Subject: [petsc-users] CG or GMRES In-Reply-To: <518CF308.703@gmail.com> References: <518ADF34.9010109@gmail.com> <87k3n9ov7u.fsf@mcs.anl.gov> <518CF308.703@gmail.com> Message-ID: On Fri, May 10, 2013 at 8:15 AM, Alan wrote: > Thank you so much, Dr. Brown. > I have a minor question on the 'gamg'. As you said, 'gamg' works for > many moderately non-symmetric problems. Does this apply for general > algebraic MG preconditioner or just 'gamg' in PETSc. As you know, does > 'BoomerAMG' suffer from the non-symmetric matrices problem? Should we > only use regular MG as the preconditioner for highly non-symmetric > problems? > There is nothing that prevents AMG from working on non-symmetric matrices (unlike CG), but there are no guarantees that it will do a good job. Matt > thanks, > Alan > > > "Zhenglun (Alan) Wei" writes: > > > >> Dear folks, > >> I hope you're having a nice day. > >> For the Poisson solver in /src/ksp/ksp/example/tutorial/ex45.c, I used > >> the ksp_type = CG to solve it before; it converges very fast with > >> pc_type = gamg. 
However, I was trying to check if the matrix generated > >> by the 'ComputeMatrix' is symmetric by using "ierr = MatIsSymmetric(B, > >> tol, &flg);". It shows that this matrix is not exact a symmetric one by > >> setting tol = 0.0. Yet, the matrix is 'symmetric' if the tol > 0.01. > > The matrix does not enforce boundary conditions symmetrically. > > > >> Does this mean that, even if the matrix is not exact symmetric, the CG > >> could still be used. > > You happen to be iterating in a "benign" space in which the operator is > SPD. > > > >> This brings me a question. Can the CG be used to solve an actual > >> unsymmetric matrix as long as 'MatIsSymmetric' returns a 'PETSC_TRUE' > >> value with certain tolerance. > > No. > > > >> Is there any rule of thumb for this tolerence? Also, as a > >> preconditioner, does 'gamg' only work for symmetric positive-definite > >> matrix? or it works for any matrix or even with GMRES? > > It works for many moderately non-symmetric, certainly for something that > only > > has non-symmetric boundary conditions. > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From zhenglun.wei at gmail.com Fri May 10 08:29:00 2013 From: zhenglun.wei at gmail.com (Alan) Date: Fri, 10 May 2013 08:29:00 -0500 Subject: [petsc-users] CG or GMRES In-Reply-To: References: <518ADF34.9010109@gmail.com> <87k3n9ov7u.fsf@mcs.anl.gov> <518CF308.703@gmail.com> Message-ID: <518CF61C.2090503@gmail.com> Thanks, Dr. Knepley, Can I treat AMG and MG like this way. To solve problems with non-symmetric matrices, MG needs users to provide the coarse mesh information. This take more complexity on coding than using AMG; but MG, generally, do better job than MG for non-symmetric problems. Alan > On Fri, May 10, 2013 at 8:15 AM, Alan > wrote: > > Thank you so much, Dr. Brown. > I have a minor question on the 'gamg'. As you said, 'gamg' works for > many moderately non-symmetric problems. Does this apply for general > algebraic MG preconditioner or just 'gamg' in PETSc. As you know, does > 'BoomerAMG' suffer from the non-symmetric matrices problem? Should we > only use regular MG as the preconditioner for highly non-symmetric > problems? > > > There is nothing that prevents AMG from working on non-symmetric > matrices (unlike CG), > but there are no guarantees that it will do a good job. > > Matt > > thanks, > Alan > > > "Zhenglun (Alan) Wei" > writes: > > > >> Dear folks, > >> I hope you're having a nice day. > >> For the Poisson solver in /src/ksp/ksp/example/tutorial/ex45.c, > I used > >> the ksp_type = CG to solve it before; it converges very fast with > >> pc_type = gamg. However, I was trying to check if the matrix > generated > >> by the 'ComputeMatrix' is symmetric by using "ierr = > MatIsSymmetric(B, > >> tol, &flg);". It shows that this matrix is not exact a > symmetric one by > >> setting tol = 0.0. Yet, the matrix is 'symmetric' if the tol > > 0.01. > > The matrix does not enforce boundary conditions symmetrically. > > > >> Does this mean that, even if the matrix is not exact symmetric, > the CG > >> could still be used. > > You happen to be iterating in a "benign" space in which the > operator is SPD. > > > >> This brings me a question. 
Can the CG be used to solve an actual > >> unsymmetric matrix as long as 'MatIsSymmetric' returns a > 'PETSC_TRUE' > >> value with certain tolerance. > > No. > > > >> Is there any rule of thumb for this tolerence? Also, as a > >> preconditioner, does 'gamg' only work for symmetric > positive-definite > >> matrix? or it works for any matrix or even with GMRES? > > It works for many moderately non-symmetric, certainly for > something that only > > has non-symmetric boundary conditions. > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 10 08:41:36 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 10 May 2013 08:41:36 -0500 Subject: [petsc-users] CG or GMRES In-Reply-To: <518CF61C.2090503@gmail.com> References: <518ADF34.9010109@gmail.com> <87k3n9ov7u.fsf@mcs.anl.gov> <518CF308.703@gmail.com> <518CF61C.2090503@gmail.com> Message-ID: <87haibhrzj.fsf@mcs.anl.gov> Alan writes: > Thanks, Dr. Knepley, > Can I treat AMG and MG like this way. To solve problems with > non-symmetric matrices, MG needs users to provide the coarse mesh > information. This take more complexity on coding than using AMG; but MG, > generally, do better job than MG for non-symmetric problems. It's hard to generalize. FAS multigrid with conservative discretizations and element agglomeration is popular for transport-dominated problems like steady-state transonic flows. Other geometric MG schemes could perform worse than a given algebraic multigrid. From zhenglun.wei at gmail.com Fri May 10 08:52:35 2013 From: zhenglun.wei at gmail.com (Zhenglun (Alan) Wei) Date: Fri, 10 May 2013 08:52:35 -0500 Subject: [petsc-users] CG or GMRES In-Reply-To: <87haibhrzj.fsf@mcs.anl.gov> References: <518ADF34.9010109@gmail.com> <87k3n9ov7u.fsf@mcs.anl.gov> <518CF308.703@gmail.com> <518CF61C.2090503@gmail.com> <87haibhrzj.fsf@mcs.anl.gov> Message-ID: <518CFBA3.9080904@gmail.com> I understand now. Thank for your answer. best, :) Alan > Alan writes: > >> Thanks, Dr. Knepley, >> Can I treat AMG and MG like this way. To solve problems with >> non-symmetric matrices, MG needs users to provide the coarse mesh >> information. This take more complexity on coding than using AMG; but MG, >> generally, do better job than MG for non-symmetric problems. > It's hard to generalize. FAS multigrid with conservative > discretizations and element agglomeration is popular for > transport-dominated problems like steady-state transonic flows. Other > geometric MG schemes could perform worse than a given algebraic > multigrid. From jedbrown at mcs.anl.gov Fri May 10 15:30:59 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 10 May 2013 15:30:59 -0500 Subject: [petsc-users] VecLoad_HDF5 In-Reply-To: <518D57D6.5000104@physik.uni-freiburg.de> References: <51894F66.5030402@physik.uni-freiburg.de> <87a9o6tkiq.fsf@mcs.anl.gov> <518A36B9.3060304@physik.uni-freiburg.de> <87fvxxsliv.fsf@mcs.anl.gov> <518D57D6.5000104@physik.uni-freiburg.de> Message-ID: <87mws2fugs.fsf@mcs.anl.gov> Klaus Zimmermann writes: > Hi Jed, > > thanks for the update and sorry for the delay. We had a national holiday > yesterday and honestly I was confused by the kz38/maint-3.2 branch in > the main petsc/petsc repo. I was still trying to figure out how to > proceed, but seems everything is resolved now. Perfect. 
Since we don't normally merge topics immediately to their final place (i.e., they go to 'next' first), we create topic branches in petsc/petsc.git so that we can keep track of them. > Thanks again > Klaus > > Am 08.05.2013 14:28, schrieb Jed Brown: >> Klaus Zimmermann writes: >> >>> Hello Jed, >>> >>> I think it would be nice to fix those bugs in the stable releases, >>> therefore I did the cherry-picking and created pull requests on >>> bitbucket. If you feel this is too much trouble perhaps a known issues >>> list would be an alternative. To just have it fail with the rather >>> obscure error messages like now can be a bit irritating. >> >> Thanks, I merged the one for 'maint' and commented on the other one. >> Bitbucket pull requests should really have a way to be updated without >> marking them as rejected. >>
From knepley at gmail.com Fri May 10 15:32:56 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 10 May 2013 15:32:56 -0500 Subject: [petsc-users] Re: Re: Re: Fwd: PETsc problem In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> Message-ID: On Fri, May 10, 2013 at 12:11 PM, Hao Yu wrote: > What do you mean by compiling something with it? I don't know. Do you mean 'win32fe cl' is not installed correctly? > I mean, can you compile a file using cl from the command line? Matt > > Hao > ------------------------------ > From: Matthew Knepley [knepley at gmail.com] > Sent: 2013-05-10 19:55 > To: Hao Yu > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] Re: Re: Fwd: PETsc problem > > On Fri, May 10, 2013 at 6:20 AM, Hao Yu wrote: > >> >> The configure.log and make.log are attached. >> >> it shows that 'win32fe cl' does not work. I don't know what the problem >> is. >> > > Can you compile something with it? It looks like it is not installed > correctly. > > Matt > > >> Thanks! >> >> >> >> Hao >> ________________________________________ >> From: Jed Brown [five9a2 at gmail.com] on behalf of Jed Brown [jedbrown at mcs.anl.gov] >> Sent: 2013-05-10 15:16 >> To: Hao Yu; petsc-users >> Subject: Re: [petsc-users] Re: Fwd: PETsc problem >> >> Hao Yu writes: >> >> > it shows that 'win32fe cl' does not work. I don't know what the >> problem is. >> >> You couldn't have found a less helpful way to report this. >> >> Note the bold part: >> >> http://www.mcs.anl.gov/petsc/documentation/bugreporting.html >> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed...
URL:
From popov at uni-mainz.de Mon May 13 07:47:10 2013 From: popov at uni-mainz.de (Anton Popov) Date: Mon, 13 May 2013 14:47:10 +0200 Subject: [petsc-users] GlobalToLocal Begin/End Message-ID: <5190E0CE.5000903@uni-mainz.de> Hi guys, I need to do "GlobalToLocal" simultaneously for about five vectors composed with different DMs. Which do you think will do better? 1) post all Begins first, followed by all Ends 2) post each Begin-End couple one after another Many thanks, Anton
From knepley at gmail.com Mon May 13 09:08:16 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 13 May 2013 09:08:16 -0500 Subject: [petsc-users] GlobalToLocal Begin/End In-Reply-To: <5190E0CE.5000903@uni-mainz.de> References: <5190E0CE.5000903@uni-mainz.de> Message-ID: On Mon, May 13, 2013 at 7:47 AM, Anton Popov wrote: > Hi guys, > > I need to do "GlobalToLocal" simultaneously for about five vectors composed with > different DMs. > Which do you think will do better? > > 1) post all Begins first, followed by all Ends > 2) post each Begin-End couple one after another > It does not really matter unless you have work that can be done in the middle. Matt > Many thanks, > > Anton > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL:
From jedbrown at mcs.anl.gov Mon May 13 09:51:48 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 13 May 2013 09:51:48 -0500 Subject: [petsc-users] GlobalToLocal Begin/End In-Reply-To: References: <5190E0CE.5000903@uni-mainz.de> Message-ID: <87sj1r2arf.fsf@mcs.anl.gov> Matthew Knepley writes: > On Mon, May 13, 2013 at 7:47 AM, Anton Popov wrote: > >> Hi guys, >> >> I need to do "GlobalToLocal" simultaneously for about five vectors composed with >> different DMs. >> Which do you think will do better? >> >> 1) post all Begins first, followed by all Ends >> 2) post each Begin-End couple one after another >> > > It does not really matter unless you have work that can be done in the > middle. And the fine-tuning depends on the network hardware. If the implementation is good, posting all the Begins first should be faster because it allows more overlap, but that causes more "out of order" network traffic, so it doesn't always work out that way. Some implementations do message coalescing (subject to a size threshold). In most cases, creating one fat vector that contains all five small ones will be faster on the network.
From bsmith at mcs.anl.gov Tue May 14 12:51:06 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 14 May 2013 12:51:06 -0500 Subject: [petsc-users] Release of petsc-3.4 Message-ID: <203D0565-C51A-4E8B-A5C4-F5038BBFDEFD@mcs.anl.gov> We are pleased to announce the release of PETSc version 3.4 at http://www.mcs.anl.gov/petsc The major changes and updates can be found at http://www.mcs.anl.gov/petsc/petsc/documentation/changes/34.html We recommend upgrading to PETSc 3.4 immediately. As always please report problems to petsc-maint at mcs.anl.gov and ask questions at petsc-users at mcs.anl.gov Notable new features include a system for managing unstructured grids with PDE solvers in DMPlex. Capability and performance improvements to the algebraic multigrid preconditioners PCGAMG, many new nonlinear solvers in SNES, many improvements to the ODE solvers in TS including the new TSEIMEX, and support for parallel dense linear algebra using MatElemental.
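An aside illustrating the GlobalToLocal thread just above: variant 1 from Anton's question would look roughly like this (dm[], gvec[], and lvec[] are assumed names):

    PetscInt i;
    for (i = 0; i < 5; i++) {   /* post all the Begins first ... */
      ierr = DMGlobalToLocalBegin(dm[i],gvec[i],INSERT_VALUES,lvec[i]);CHKERRQ(ierr);
    }
    /* ... local work that does not need ghost values can overlap here ... */
    for (i = 0; i < 5; i++) {   /* ... then complete them */
      ierr = DMGlobalToLocalEnd(dm[i],gvec[i],INSERT_VALUES,lvec[i]);CHKERRQ(ierr);
    }

Per Jed's answer, this gives the communication layer a chance to overlap the five scatters, though a single fat vector holding all five would usually beat both variants.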
The library also has better encapsulation and better control of symbols. If you are using the threaded version of PETSc or PETSc on GPUs you should continue to work with petsc-dev since this code is too much in flux to be contained in a release. This release includes contributions from Brad Aagaard Shri Abhyankar Mark Adams Satish Balay Blaise Bourdin Jed Brown Peter Brune Emil Constantinescu Lois Curfman McInnes Lisandro Dalcin Wenjun Deng Sean Farley John Fettig Glenn Hammond Shao-Ching Huang Tobin Isaac Chetan Jhurani Dmitry Karpeev Matthew Knepley Michael Kraus Patrick Lacasse R?mi Lacroix Michael Lange Jungho Lee Paul Mullowney Adri?n N?meth Jack Poulson Jose Roman Karl Rupp Patrick Sanan Barry Smith Tim Tautges Richard Tran Mills Stefano Zampini Hong Zhang (ANL/IIT) Hong Zhang (Virginia Tech) Xuan Zhou Thanks Barry From ckontzialis at lycos.com Tue May 14 12:58:10 2013 From: ckontzialis at lycos.com (Konstantinos Kontzialis) Date: Tue, 14 May 2013 14:58:10 -0300 Subject: [petsc-users] Question on asm Message-ID: <51927B32.6010901@lycos.com> Dear all, I am using asm as a preconditioner for solving the Navier Stokes equations. When using ILU as the subdomain preconditioner, I notice convergence problems in the partition boundaries. This does not happen with the Jacobi preconditioner. Do I need to provide any connectivity, node mapping, and/or overlap information to Petsc before calling KSPsolve? Thanks, Kostas From jedbrown at mcs.anl.gov Tue May 14 13:03:38 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 14 May 2013 13:03:38 -0500 Subject: [petsc-users] Question on asm In-Reply-To: <51927B32.6010901@lycos.com> References: <51927B32.6010901@lycos.com> Message-ID: <8761yll9qd.fsf@mcs.anl.gov> Konstantinos Kontzialis writes: > Dear all, > > I am using asm as a preconditioner for solving the Navier Stokes > equations. When using ILU as the subdomain preconditioner, I notice > convergence problems in the partition boundaries. I.e., it converges slowly there? That is natural because the preconditioner is more accurate in the interior of subdomains. > This does not happen with the Jacobi preconditioner. Naturally, because Jacobi is identical regardless of the partition. > Do I need to provide any connectivity, node mapping, and/or overlap > information to Petsc before calling KSPsolve? No, the matrix is enough. From jedbrown at mcs.anl.gov Tue May 14 13:40:27 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 14 May 2013 13:40:27 -0500 Subject: [petsc-users] Release of petsc-3.4 In-Reply-To: <203D0565-C51A-4E8B-A5C4-F5038BBFDEFD@mcs.anl.gov> References: <203D0565-C51A-4E8B-A5C4-F5038BBFDEFD@mcs.anl.gov> Message-ID: <87y5bhjtgk.fsf@mcs.anl.gov> Barry Smith writes: > We are pleased to announce the release of PETSc version 3.4 at http://www.mcs.anl.gov/petsc > > The major changes and updates can be found at > > http://www.mcs.anl.gov/petsc/petsc/documentation/changes/34.html The correct address is: http://www.mcs.anl.gov/petsc/documentation/changes/34.html From isik.ozcer at mail.mcgill.ca Tue May 14 14:35:58 2013 From: isik.ozcer at mail.mcgill.ca (Isik Ali Ozcer) Date: Tue, 14 May 2013 19:35:58 +0000 Subject: [petsc-users] FW: [Fwd: Re: Question on asm] Message-ID: Hi All, Regarding ILU convergence at subdomain boundaries, I need to add to my colleague's message that we followed a very basic tutorial that shows matrix and rhs assembly, preconditioner and solver setup, and finally a KSPsolve. We did not prepare any application ordering information or index sets. 
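For the ASM question in this thread: since the matrix is enough, the subdomain solver and overlap are normally chosen through options rather than extra setup code. A sketch of both routes (the overlap value is illustrative):

    PC pc;
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCASM);CHKERRQ(ierr);
    ierr = PCASMSetOverlap(pc,2);CHKERRQ(ierr);  /* wider overlap softens the interface effect */

or equivalently on the command line: -pc_type asm -pc_asm_overlap 2 -sub_pc_type ilu. No AO or IS objects need to be created for this; PETSc builds the overlapping subdomains from the matrix itself.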
I am not sure if Petsc in our implementation has all the info it needs to perform proper overlapping operations with ASM. Do we need to worry about setting up AO and IS contexts before attempting KSPsolve with ASM and ILU? Thanks a lot and have a great day, Isik petsc-newb -------- Original Message -------- Subject: Re: [petsc-users] Question on asm Date: Tue, 14 May 2013 13:03:38 -0500 From: Jed Brown To: Konstantinos Kontzialis , petsc-users at mcs.anl.gov Konstantinos Kontzialis writes: > Dear all, > > I am using asm as a preconditioner for solving the Navier Stokes > equations. When using ILU as the subdomain preconditioner, I notice > convergence problems in the partition boundaries. I.e., it converges slowly there? That is natural because the preconditioner is more accurate in the interior of subdomains. > This does not happen with the Jacobi preconditioner. Naturally, because Jacobi is identical regardless of the partition. > Do I need to provide any connectivity, node mapping, and/or overlap > information to Petsc before calling KSPsolve? No, the matrix is enough. From mpovolot at purdue.edu Tue May 14 14:44:38 2013 From: mpovolot at purdue.edu (Michael Povolotskyi) Date: Tue, 14 May 2013 15:44:38 -0400 Subject: [petsc-users] Release of petsc-3.4 In-Reply-To: <203D0565-C51A-4E8B-A5C4-F5038BBFDEFD@mcs.anl.gov> References: <203D0565-C51A-4E8B-A5C4-F5038BBFDEFD@mcs.anl.gov> Message-ID: <51929426.10808@purdue.edu> Thank you for the info. What is exactly "better encapsulation and better control of symbols"? Michael. On 05/14/2013 01:51 PM, Barry Smith wrote: > We are pleased to announce the release of PETSc version 3.4 at http://www.mcs.anl.gov/petsc > > The major changes and updates can be found at > > http://www.mcs.anl.gov/petsc/petsc/documentation/changes/34.html > > We recommend upgrading to PETSc 3.4 immediately. As always please report problems to > petsc-maint at mcs.anl.gov and ask questions at petsc-users at mcs.anl.gov > > Notable new features include a system for managing unstructured grids with PDE solvers in DMPlex. > Capability and performance improvements to the algebraic multigrid preconditioners PCGAMG, many > new nonlinear solvers in SNES, many improvements to the ODE solvers in TS including the new > TSEIMEX, and support for parallel dense linear algebra using MatElemental. The library also > has better encapsulation and better control of symbols. > > If you are using the threaded version of PETSc or PETSc on GPUs you should continue to work with > petsc-dev since this code is too much in flux to be contained in a release. 
> > This release includes contributions from > > Brad Aagaard > Shri Abhyankar > Mark Adams > Satish Balay > Blaise Bourdin > Jed Brown > Peter Brune > Emil Constantinescu > Lois Curfman McInnes > Lisandro Dalcin > Wenjun Deng > Sean Farley > John Fettig > Glenn Hammond > Shao-Ching Huang > Tobin Isaac > Chetan Jhurani > Dmitry Karpeev > Matthew Knepley > Michael Kraus > Patrick Lacasse > R?mi Lacroix > Michael Lange > Jungho Lee > Paul Mullowney > Adri?n N?meth > Jack Poulson > Jose Roman > Karl Rupp > Patrick Sanan > Barry Smith > Tim Tautges > Richard Tran Mills > Stefano Zampini > Hong Zhang (ANL/IIT) > Hong Zhang (Virginia Tech) > Xuan Zhou > > Thanks > > Barry > -- Michael Povolotskyi, PhD Research Assistant Professor Network for Computational Nanotechnology 207 S Martin Jischke Drive Purdue University, DLR, room 441-10 West Lafayette, Indiana 47907 phone: +1-765-494-9396 fax: +1-765-496-6026 From knepley at gmail.com Tue May 14 15:05:37 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 14 May 2013 15:05:37 -0500 Subject: [petsc-users] FW: [Fwd: Re: Question on asm] In-Reply-To: References: Message-ID: On Tue, May 14, 2013 at 2:35 PM, Isik Ali Ozcer wrote: > Hi All, > > Regarding ILU convergence at subdomain boundaries, I need to add to my > colleague's message that we followed a very basic tutorial that shows > matrix and rhs assembly, preconditioner and solver setup, and finally a > KSPsolve. We did not prepare any application ordering information or index > sets. I am not sure if Petsc in our implementation has all the info it > needs to perform proper overlapping operations with ASM. Do we need to > worry about setting up AO and IS contexts before attempting KSPsolve with > ASM and ILU? > No the matrix is enough. Matt > Thanks a lot and have a great day, > > Isik > petsc-newb > > > > -------- Original Message -------- > Subject: Re: [petsc-users] Question on asm > Date: Tue, 14 May 2013 13:03:38 -0500 > From: Jed Brown > To: Konstantinos Kontzialis , > petsc-users at mcs.anl.gov > > > > Konstantinos Kontzialis writes: > > > Dear all, > > > > I am using asm as a preconditioner for solving the Navier Stokes > > equations. When using ILU as the subdomain preconditioner, I notice > > convergence problems in the partition boundaries. > > I.e., it converges slowly there? That is natural because the > preconditioner is more accurate in the interior of subdomains. > > > This does not happen with the Jacobi preconditioner. > > Naturally, because Jacobi is identical regardless of the partition. > > > Do I need to provide any connectivity, node mapping, and/or overlap > > information to Petsc before calling KSPsolve? > > No, the matrix is enough. > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue May 14 15:25:54 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 14 May 2013 15:25:54 -0500 (CDT) Subject: [petsc-users] petsc 3.4: In-Reply-To: References: Message-ID: On Tue, 14 May 2013, Matteo Parsani wrote: > Dear PETSc developers and users, > I have just updated petsc from 3.3-p7 to petsc 3.4. 
and during the > installation testing I get the following message: > > pmatteo at parsani-lan:~/research/lib_src/petsc$ make PETSC_DIR=/scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petscnew test > Running test examples to verify correct installation > Using PETSC_DIR=/scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petscnew and PETSC_ARCH=arch-linux2-c-debug > C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process > C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes > egrep: /scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petscnew/arch-linux2-c-debug/include/petscconf.h: No such file or directory Perhaps you are having file system problems? > Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process > Completed test examples > > The tests pass successfully but .... Ok - then the library is ok and usable. > > Attached the log file. > > Moreover, when I run my Fortran 90 code as usual it seems the libpetsc.so cannot be opened (./NSE: error while loading shared libraries: libpetsc.so: cannot open shared object file: No such file or directory) > > PATH and LD_LIBRARY_PATH are set exactly as for petsc 3.3-p7 through my .bashrc file and libpetsc.so is in the right location. > > If I switch to 3.3-p7 (just by pointing to the other directory where petsc3.3-p7 is installed) it works fine. Perhaps you can use PETSc makefiles - so you don't have to rely on LD_LIBRARY_PATH? [or use -Wl,-rpath,/scratch/home0/pmatteo/research/lib_src/petsc/arch-linux2-c-debug/lib] You can always do 'ldd NSE' to see if the required shared libraries are found or not. Satish > > Any idea? > > > Thanks, > > >
From balay at mcs.anl.gov Tue May 14 15:32:42 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 14 May 2013 15:32:42 -0500 (CDT) Subject: [petsc-users] Release of petsc-3.4 In-Reply-To: <51929426.10808@purdue.edu> References: <203D0565-C51A-4E8B-A5C4-F5038BBFDEFD@mcs.anl.gov> <51929426.10808@purdue.edu> Message-ID: I think it's a reference to the following in the changes file. >>>>>>> The configure options --with-c-support and --with-c++-support have been removed. A PETSc library built using C or C++ can be called from either C or C++. The primary functional reason to use --with-clanguage=C++ is to use std::complex data types. Other users will likely prefer --with-clanguage=C (the default) because it compiles somewhat faster. The --with-c-support option is no longer needed because it is now the default behavior when using --with-clanguage=c++. <<<<<<< And the usage of __attribute__((visibility ("default"))) vs __attribute__((visibility ("hidden"))) for public vs private PETSc functions. http://gcc.gnu.org/wiki/Visibility Satish On Tue, 14 May 2013, Michael Povolotskyi wrote: > Thank you for the info. > What is exactly "better encapsulation and better control of symbols"? > Michael. > > > On 05/14/2013 01:51 PM, Barry Smith wrote: > > We are pleased to announce the release of PETSc version 3.4 at > > http://www.mcs.anl.gov/petsc > > > > The major changes and updates can be found at > > > > http://www.mcs.anl.gov/petsc/petsc/documentation/changes/34.html > > > > We recommend upgrading to PETSc 3.4 immediately. As always please report > > problems to > > petsc-maint at mcs.anl.gov and ask questions at petsc-users at mcs.anl.gov > > > > Notable new features include a system for managing unstructured grids > > with PDE solvers in DMPlex.
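To illustrate Satish's visibility point above, the scheme from the linked GCC wiki looks like the following (the macro names here are my own shorthand, not necessarily PETSc's exact ones):

    #if defined(__GNUC__) && __GNUC__ >= 4
    #  define MYLIB_PUBLIC   __attribute__((visibility("default")))
    #  define MYLIB_INTERNAL __attribute__((visibility("hidden")))
    #else
    #  define MYLIB_PUBLIC
    #  define MYLIB_INTERNAL
    #endif

    MYLIB_PUBLIC   int mylib_solve(void);   /* part of the exported API */
    MYLIB_INTERNAL int mylib_helper(void);  /* not visible outside the .so */

Tagging private functions as hidden is what keeps internal symbols from leaking out of libpetsc.so.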
> > Capability and performance improvements to the algebraic multigrid > > preconditioners PCGAMG, many > > new nonlinear solvers in SNES, many improvements to the ODE solvers in TS > > including the new > > TSEIMEX, and support for parallel dense linear algebra using MatElemental. > > The library also > > has better encapsulation and better control of symbols. > > > > If you are using the threaded version of PETSc or PETSc on GPUs you > > should continue to work with > > petsc-dev since this code is too much in flux to be contained in a release. > > > > This release includes contributions from > > > > Brad Aagaard > > Shri Abhyankar > > Mark Adams > > Satish Balay > > Blaise Bourdin > > Jed Brown > > Peter Brune > > Emil Constantinescu > > Lois Curfman McInnes > > Lisandro Dalcin > > Wenjun Deng > > Sean Farley > > John Fettig > > Glenn Hammond > > Shao-Ching Huang > > Tobin Isaac > > Chetan Jhurani > > Dmitry Karpeev > > Matthew Knepley > > Michael Kraus > > Patrick Lacasse > > R?mi Lacroix > > Michael Lange > > Jungho Lee > > Paul Mullowney > > Adri?n N?meth > > Jack Poulson > > Jose Roman > > Karl Rupp > > Patrick Sanan > > Barry Smith > > Tim Tautges > > Richard Tran Mills > > Stefano Zampini > > Hong Zhang (ANL/IIT) > > Hong Zhang (Virginia Tech) > > Xuan Zhou > > > > Thanks > > > > Barry > > > > > From jedbrown at mcs.anl.gov Tue May 14 15:34:12 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 14 May 2013 15:34:12 -0500 Subject: [petsc-users] Release of petsc-3.4 In-Reply-To: <51929426.10808@purdue.edu> References: <203D0565-C51A-4E8B-A5C4-F5038BBFDEFD@mcs.anl.gov> <51929426.10808@purdue.edu> Message-ID: <87ppwtjo6z.fsf@mcs.anl.gov> Michael Povolotskyi writes: > Thank you for the info. > What is exactly "better encapsulation and better control of symbols"? Headers are more precise (leading to fewer, faster recompiles) and fewer internal details leak out. For example, the contents of PetscObject and Vec are no longer exposed (to be accessed indirectly via inlines or macros, for example) which allows us to upgrade that structure without breaking binary compatibility. One practical consequence of this is that you can use LD_PRELOAD to run optimized code against a debugging PETSc without recompiling. That is, compile your application (and intermediate libraries, if applicable) against an optimized PETSC_ARCH. Then make a debugging PETSC_ARCH that is otherwise identical: $ cd $PETSC_DIR $ $PETSC_ARCH/conf/reconfigure*.py --with-debugging=1 PETSC_ARCH=${PETSC_ARCH}-dbg $ make PETSC_ARCH=${PETSC_ARCH}-dbg Now, without rebuilding anything, this gives you the usual debugging diagnostics (input validation, stack debugging, etc.): $ cd ~/your-project $ LD_PRELOAD=$PETSC_DIR/${PETSC_ARCH}-dbg/lib/libpetsc.so yourapp -options From a.vergottis at gmail.com Tue May 14 16:06:24 2013 From: a.vergottis at gmail.com (Anthony Vergottis) Date: Tue, 14 May 2013 22:06:24 +0100 Subject: [petsc-users] Matrices - Arrays Message-ID: To whom it may concern, I am very new to Petsc and I would very much appreciate some assistance. What would be the best way to create an array of matrix objects? The number of matrix objects required would be determined during run time. For example, I am looking into the FEM and I would like to create a matrix for each element (I know how to set up the matrix objects). Therefore, if I am using a mesh with 100 element I would like to determine during runtime that I will need to create 100 "Mat A[100]" etc, something along those lines. Hope I explained it well. 
Sorry if this is a trivial question but your help would go a long way. Thanks, keep up the great job guys!! Regards, Anthony Vergottis -------------- next part -------------- An HTML attachment was scrubbed... URL: From a.vergottis at gmail.com Tue May 14 16:12:42 2013 From: a.vergottis at gmail.com (Anthony Vergottis) Date: Tue, 14 May 2013 22:12:42 +0100 Subject: [petsc-users] Matrices - Arrays In-Reply-To: References: Message-ID: P.S I am writing in C/C++ On 14 May 2013 22:06, Anthony Vergottis wrote: > To whom it may concern, > > I am very new to Petsc and I would very much appreciate some assistance. > > What would be the best way to create an array of matrix objects? The > number of matrix objects required would be determined during run time. > > For example, I am looking into the FEM and I would like to create a matrix > for each element (I know how to set up the matrix objects). Therefore, if I > am using a mesh with 100 element I would like to determine during runtime > that I will need to create 100 "Mat A[100]" etc, something along those > lines. Hope I explained it well. > > Sorry if this is a trivial question but your help would go a long way. > > Thanks, keep up the great job guys!! > > Regards, > Anthony Vergottis > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue May 14 16:15:15 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 14 May 2013 16:15:15 -0500 Subject: [petsc-users] Matrices - Arrays In-Reply-To: References: Message-ID: On Tue, May 14, 2013 at 4:06 PM, Anthony Vergottis wrote: > To whom it may concern, > > I am very new to Petsc and I would very much appreciate some assistance. > > What would be the best way to create an array of matrix objects? The > number of matrix objects required would be determined during run time. > > For example, I am looking into the FEM and I would like to create a matrix > for each element (I know how to set up the matrix objects). Therefore, if I > am using a mesh with 100 element I would like to determine during runtime > that I will need to create 100 "Mat A[100]" etc, something along those > lines. Hope I explained it well. > > Sorry if this is a trivial question but your help would go a long way. > I don't think you want to do this. Generally, element matrices can be raw arrays. If you really want an array of Mat objects, declare it Mat *mats; Allocate it ierr = PetscMalloc(numMats * sizeof(Mat), &mats);CHKERRQ(ierr); and call MatCreate(), etc. for each entry. Matt > Thanks, keep up the great job guys!! > > Regards, > Anthony Vergottis > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From suifengls at gmail.com Tue May 14 16:31:40 2013 From: suifengls at gmail.com (Longxiang Chen) Date: Tue, 14 May 2013 14:31:40 -0700 Subject: [petsc-users] Variable Block Row format in PETSc Message-ID: To whom it may concern, I use parmetis to partition a mesh for a sparse matrix. Then I distribute the data to the appropriate processors according to the result of partition. The sparse matrix is stored in Variable Block Row(VBR) format. After the distribution, I want to call PETSc KSP solver to solve Ax = b. I tried to convert VBR to AIJ or CSR format, but the data would be re-distributed. The ideal method is to keep the distribution result from parmetis. 
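Filling in Matt's sketch from the Matrices - Arrays thread above, a complete fragment might read as follows (the 3x3 dense element matrix is an assumption, e.g. for a scalar field on a linear triangle):

    Mat      *mats;
    PetscInt  e, numMats = 100;                 /* known only at runtime */
    ierr = PetscMalloc(numMats * sizeof(Mat), &mats);CHKERRQ(ierr);
    for (e = 0; e < numMats; e++) {
      ierr = MatCreateSeqDense(PETSC_COMM_SELF, 3, 3, PETSC_NULL, &mats[e]);CHKERRQ(ierr);
    }
    /* ... fill each mats[e] with MatSetValues() and assemble ... */
    for (e = 0; e < numMats; e++) {
      ierr = MatDestroy(&mats[e]);CHKERRQ(ierr);
    }
    ierr = PetscFree(mats);CHKERRQ(ierr);

Though, as Matt says, plain C arrays (or a small dense-matrix library) are usually the better fit for element matrices.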
For example, after parmetis, processor 0 has 0, 1, 4, and processor 1 has 2, 3, 5. I wish the PETSc would not change this distribution and solve Ax = b. Are there any approaches to call KSP solver in VBR format from PETSc? Or any suggestions for solving Ax = b? Thanks in advance. Regards, Longxiang Chen -------------- next part -------------- An HTML attachment was scrubbed... URL: From suifengls at gmail.com Tue May 14 16:31:40 2013 From: suifengls at gmail.com (Longxiang Chen) Date: Tue, 14 May 2013 14:31:40 -0700 Subject: [petsc-users] Variable Block Row format in PETSc Message-ID: To whom it may concern, I use parmetis to partition a mesh for a sparse matrix. Then I distribute the data to the appropriate processors according to the result of partition. The sparse matrix is stored in Variable Block Row(VBR) format. After the distribution, I want to call PETSc KSP solver to solve Ax = b. I tried to convert VBR to AIJ or CSR format, but the data would be re-distributed. The ideal method is to keep the distribution result from parmetis. For example, after parmetis, processor 0 has 0, 1, 4, and processor 1 has 2, 3, 5. I wish the PETSc would not change this distribution and solve Ax = b. Are there any approaches to call KSP solver in VBR format from PETSc? Or any suggestions for solving Ax = b? Thanks in advance. Regards, Longxiang Chen -------------- next part -------------- An HTML attachment was scrubbed... URL: From psanan at cms.caltech.edu Tue May 14 16:41:05 2013 From: psanan at cms.caltech.edu (Patrick Sanan) Date: Tue, 14 May 2013 14:41:05 -0700 Subject: [petsc-users] Matrices - Arrays In-Reply-To: References: Message-ID: <3D6D9685-FFFE-4E7D-BE73-7302EEC1A330@cms.caltech.edu> On May 14, 2013, at 2:15 PM, Matthew Knepley wrote: > On Tue, May 14, 2013 at 4:06 PM, Anthony Vergottis wrote: > To whom it may concern, > > I am very new to Petsc and I would very much appreciate some assistance. > > What would be the best way to create an array of matrix objects? The number of matrix objects required would be determined during run time. > > For example, I am looking into the FEM and I would like to create a matrix for each element (I know how to set up the matrix objects). Therefore, if I am using a mesh with 100 element I would like to determine during runtime that I will need to create 100 "Mat A[100]" etc, something along those lines. Hope I explained it well. > > Sorry if this is a trivial question but your help would go a long way. > > I don't think you want to do this. Generally, element matrices can be raw arrays. If you really want > an array of Mat objects, declare it > > Mat *mats; > > Allocate it > > ierr = PetscMalloc(numMats * sizeof(Mat), &mats);CHKERRQ(ierr); > > and call MatCreate(), etc. for each entry. Another alternative is to use a lightweight vector/matrix library for small local matrices. Since you're using C++, an easy-to-use option might be Eigen. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Tue May 14 16:51:07 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 14 May 2013 16:51:07 -0500 Subject: [petsc-users] Variable Block Row format in PETSc In-Reply-To: References: Message-ID: What kind of VBR matrix? What are you partitioning using parmetis? A mesh? The blocks of the matrix? How do you create the entries in the matrix? On May 14, 2013 4:36 PM, "Longxiang Chen" wrote: > To whom it may concern, > > I use parmetis to partition a mesh for a sparse matrix. 
> Then I distribute the data to the appropriate processors according to the > result of partition. > > The sparse matrix is stored in Variable Block Row(VBR) format. > After the distribution, I want to call PETSc KSP solver to solve Ax = b. > I tried to convert VBR to AIJ or CSR format, but the data would be > re-distributed. > > The ideal method is to keep the distribution result from parmetis. > For example, after parmetis, processor 0 has 0, 1, 4, and processor 1 has > 2, 3, 5. I wish the PETSc would not change this distribution and solve Ax > = b. > > Are there any approaches to call KSP solver in VBR format from PETSc? > Or any suggestions for solving Ax = b? > > Thanks in advance. > > Regards, > Longxiang Chen > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From suifengls at gmail.com Tue May 14 17:02:18 2013 From: suifengls at gmail.com (Longxiang Chen) Date: Tue, 14 May 2013 15:02:18 -0700 Subject: [petsc-users] Variable Block Row format in PETSc In-Reply-To: References: Message-ID: VBR like in this link, use 6 arrays to represent a matrix. http://docs.oracle.com/cd/E19061-01/hpc.cluster5/817-0086-10/prog-sparse-support.html Each row is a vertex in the graph, , and use parmetis to partition the graph to minimize the number of cuts between different processors. (reduce communication when calculate Matrix-Vector) The matrix is calculated from Jacobian and construct the A and b from the result of Jacobian (in VBR). Best regards, Longxiang Chen Do something every day that gets you closer to being done. -------------------------------------------------------------- 465 Winston Chung Hall Computer Science Engineering University of California, Riverside On Tue, May 14, 2013 at 2:51 PM, Jed Brown wrote: > What kind of VBR matrix? What are you partitioning using parmetis? A mesh? > The blocks of the matrix? How do you create the entries in the matrix? > On May 14, 2013 4:36 PM, "Longxiang Chen" wrote: > >> To whom it may concern, >> >> I use parmetis to partition a mesh for a sparse matrix. >> Then I distribute the data to the appropriate processors according to >> the result of partition. >> >> The sparse matrix is stored in Variable Block Row(VBR) format. >> After the distribution, I want to call PETSc KSP solver to solve Ax = b. >> I tried to convert VBR to AIJ or CSR format, but the data would be >> re-distributed. >> >> The ideal method is to keep the distribution result from parmetis. >> For example, after parmetis, processor 0 has 0, 1, 4, and processor 1 >> has 2, 3, 5. I wish the PETSc would not change this distribution and >> solve Ax = b. >> >> Are there any approaches to call KSP solver in VBR format from PETSc? >> Or any suggestions for solving Ax = b? >> >> Thanks in advance. >> >> Regards, >> Longxiang Chen >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue May 14 17:11:48 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 14 May 2013 17:11:48 -0500 (CDT) Subject: [petsc-users] Variable Block Row format in PETSc In-Reply-To: References: Message-ID: AIJ matrix format internally supports VBR listed below [called inodes in PETSc] So I'm not sure what problem you are having. Satish On Tue, 14 May 2013, Longxiang Chen wrote: > VBR like in this link, use 6 arrays to represent a matrix. 
> http://docs.oracle.com/cd/E19061-01/hpc.cluster5/817-0086-10/prog-sparse-support.html

From jedbrown at mcs.anl.gov Tue May 14 17:15:43 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 14 May 2013 17:15:43 -0500
Subject: [petsc-users] Variable Block Row format in PETSc

It's not the same data layout, but I recommend that you just use AIJ and let it optimize internally. Partition the graph first, then allocate the matrix, then compute the Jacobian entries and use MatSetValues.

On May 14, 2013 5:11 PM, "Satish Balay" wrote:

> The AIJ matrix format internally supports the VBR layout listed below
> [called inodes in PETSc], so I'm not sure what problem you are having.
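[Editorial sketch] A sketch of what Jed describes, under stated assumptions: the rows have already been renumbered so that each process owns a contiguous range (see the renumbering discussion below), and per-row preallocation counts have been computed from the mesh. All names here (nLocal, d_nnz, o_nnz) are illustrative, not from the thread.

#include <petscmat.h>

/* Sketch only: nLocal is the number of (renumbered) rows this process owns
   after partitioning; d_nnz/o_nnz hold per-row preallocation counts for the
   diagonal and off-diagonal blocks. Error codes are ignored for brevity. */
Mat CreatePartitionedAIJ(MPI_Comm comm, PetscInt nLocal, PetscInt N,
                         const PetscInt *d_nnz, const PetscInt *o_nnz)
{
  Mat A;
  MatCreate(comm, &A);
  MatSetSizes(A, nLocal, nLocal, N, N);   /* local row ownership follows the partition */
  MatSetType(A, MATMPIAIJ);
  MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
  return A;
}

/* Later, for each locally owned row i (in global, renumbered indices):
     MatSetValues(A, 1, &i, ncols, cols, vals, INSERT_VALUES);
   followed by MatAssemblyBegin/End(A, MAT_FINAL_ASSEMBLY). */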
From suifengls at gmail.com Tue May 14 17:20:47 2013
From: suifengls at gmail.com (Longxiang Chen)
Date: Tue, 14 May 2013 15:20:47 -0700
Subject: [petsc-users] Variable Block Row format in PETSc

My problem is:

I would like to keep the distribution that results from parmetis. For example, after parmetis, processor 0 has rows 0, 1, 4 and processor 1 has rows 2, 3, 5; they are not in contiguous order. If I set up an AIJ matrix in PETSc, then rows 0, 1, 2 would be on processor 0 and rows 3, 4, 5 on processor 1, because PETSc requires each process to own a contiguous range of rows.

Hi, Jed,

If I partition the graph first and then allocate the matrix, will PETSc distribute the matrix according to the partitioning result, or will it allocate by itself (ignoring the earlier partition)?

Thanks,

Best regards,
Longxiang Chen

On Tue, May 14, 2013 at 3:11 PM, Satish Balay wrote:

> The AIJ matrix format internally supports the VBR layout listed below
> [called inodes in PETSc], so I'm not sure what problem you are having.
From jedbrown at mcs.anl.gov Tue May 14 17:23:09 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Tue, 14 May 2013 17:23:09 -0500
Subject: [petsc-users] Variable Block Row format in PETSc

Just renumber your nodes. Store the permutation so that you can interpret the result in terms of the old numbering.

On May 14, 2013 5:21 PM, "Longxiang Chen" wrote:

> I would like to keep the distribution that results from parmetis. For
> example, after parmetis, processor 0 has rows 0, 1, 4 and processor 1 has
> rows 2, 3, 5; they are not in contiguous order. [...]
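[Editorial sketch] One way to store the permutation Jed mentions is PETSc's AO (application ordering) object. A hedged sketch with made-up names:

#include <petscao.h>

/* Sketch: napp is the number of local rows; myapp[] holds the application
   (original) numbering of the rows this process owns after partitioning.
   Passing NULL for the PETSc ordering lets AOCreateBasic number the rows
   contiguously by rank, which is exactly the renumbering needed here. */
PetscErrorCode BuildOrdering(MPI_Comm comm, PetscInt napp, const PetscInt *myapp, AO *ao)
{
  PetscErrorCode ierr;
  ierr = AOCreateBasic(comm, napp, myapp, NULL, ao);CHKERRQ(ierr);
  return 0;
}

/* Convert a list of original indices in place to the new PETSc numbering
   before calling MatSetValues():
     ierr = AOApplicationToPetsc(ao, n, indices);CHKERRQ(ierr);
   AOPetscToApplication() maps back, for interpreting the solution in the
   old numbering. */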
From balay at mcs.anl.gov Tue May 14 17:25:56 2013
From: balay at mcs.anl.gov (Satish Balay)
Date: Tue, 14 May 2013 17:25:56 -0500 (CDT)
Subject: [petsc-users] Variable Block Row format in PETSc

Sure - the internal representation in the link below is different - but if the AIJ matrix assembled is as shown below, then the inode code will detect it and use it.

Satish

>>>>>>>>>>>>>>>>>>>>
   0  1    2  3  4    5     6  7   8
   +------+---------+----+-------+
0  | 1  2 |         |  3 |       |
1  | 4  5 |         |  6 |       |
   +------+---------+----+-------+
2  |      | 7  8  9 | 10 |       |
   +------+---------+----+-------+
3  |      |         | 11 | 12 13 |
4  |      |         | 14 | 15 16 |
5  |      |         | 17 | 18 19 |
   +------+---------+----+-------+
6
<<<<<<<<<<<

On Tue, 14 May 2013, Jed Brown wrote:

> It's not the same data layout, but I recommend that you just use AIJ and
> let it optimize internally. Partition the graph first, then allocate the
> matrix, then compute the Jacobian entries and use MatSetValues.
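[Editorial sketch] To make the inode point concrete, here is an illustrative assembly (not from the thread) in which groups of rows share a nonzero pattern, which is what the SeqAIJ inode code looks for. The pattern and values below are made up; running with -info should report whether inode routines are used, and -mat_no_inode disables them.

#include <petscmat.h>

/* Rows 0-1 share one nonzero pattern and rows 3-5 share another,
   mimicking variable-sized blocks; the inode code groups such rows
   automatically after assembly. */
PetscErrorCode AssembleBlockyAIJ(Mat *Aout)
{
  PetscErrorCode ierr;
  Mat            A;
  PetscInt       i;
  PetscInt       cols01[]  = {0, 1, 6};   /* pattern shared by rows 0 and 1 */
  PetscInt       cols2[]   = {2, 3, 4};   /* a lone row with its own pattern */
  PetscInt       cols345[] = {5, 6, 7};   /* pattern shared by rows 3..5 */
  PetscScalar    v[] = {1.0, 2.0, 3.0};   /* dummy values */

  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, 6, 8, 4, NULL, &A);CHKERRQ(ierr);
  for (i = 0; i < 2; i++) { ierr = MatSetValues(A, 1, &i, 3, cols01, v, INSERT_VALUES);CHKERRQ(ierr); }
  i = 2;
  ierr = MatSetValues(A, 1, &i, 3, cols2, v, INSERT_VALUES);CHKERRQ(ierr);
  for (i = 3; i < 6; i++) { ierr = MatSetValues(A, 1, &i, 3, cols345, v, INSERT_VALUES);CHKERRQ(ierr); }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  *Aout = A;
  return 0;
}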
From parsani.matteo at gmail.com Wed May 15 07:09:28 2013
From: parsani.matteo at gmail.com (Matteo Parsani)
Date: Wed, 15 May 2013 08:09:28 -0400
Subject: [petsc-users] petsc 3.4:

Hello Satish,
the problem with the PATH and LD_LIBRARY_PATH has been fixed. The administrator of our systems was working on the OS to install new libraries and my .bashrc was not loaded properly. So now everything is fine.

Although I set --prefix to install PETSc in my code's dependencies, I noticed that I was still using PETSC_ARCH with both petsc 3.3-p7 and petsc 3.4 (something like: make PETSC_DIR=/scratch/home0/pmatteo/research/lib_src/petsc-3.3-p7 PETSC_ARCH=arch-linux2-c-debug install). That was my fault, sorry.

However, with 3.3-p7, even when I used PETSC_ARCH=arch-linux2-c-debug I did not get

egrep: /scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petsc/arch-linux2-c-debug/include/petscconf.h: No such file or directory

whereas with petsc 3.4 I did get it, because of course under the petsc installation directory there is no petsc/arch-linux2-c-debug; the headers are in petsc/include and the libraries in petsc/lib, as they should be.

Thus a couple of questions:

1- The installation of petsc 3.4 seems to be more careful, and it is able to detect that I am setting both --prefix and PETSC_ARCH. It does not say so explicitly, but it prints the "egrep message". Would it not be better to check whether the user is setting --prefix and change the installation instructions printed on screen? Also, the petsc installation documentation stresses that PETSC_ARCH must not be used with --prefix, but as it is now, even if the user sets --prefix, the instructions say:

make PETSC_DIR=/scratch/home0/pmatteo/research/lib_src/petsc-3.3-p7 PETSC_ARCH=arch-linux2-c-debug install

2- This is just a curiosity: why did petsc 3.3-p7 not give me back the "egrep message" even though I was setting PETSC_ARCH=arch-linux2-c-debug?

Thank you.

On Tue, May 14, 2013 at 4:25 PM, Satish Balay wrote:

> On Tue, 14 May 2013, Matteo Parsani wrote:
>
> > Dear PETSc developers and users,
> > I have just updated petsc from 3.3-p7 to petsc 3.4, and during the
> > installation testing I get the following message:
> >
> > pmatteo at parsani-lan:~/research/lib_src/petsc$ make PETSC_DIR=/scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petscnew test
> > Running test examples to verify correct installation
> > Using PETSC_DIR=/scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petscnew and PETSC_ARCH=arch-linux2-c-debug
> > C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
> > C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes
> > egrep: /scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petscnew/arch-linux2-c-debug/include/petscconf.h: No such file or directory
>
> Perhaps you are having file system problems?
> > Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process
> > Completed test examples
> >
> > The tests pass successfully but ....
>
> Ok - then the library is ok and useable.
>
> > Attached the log file.
> >
> > Moreover, when I run my Fortran 90 code as usual it seems libpetsc.so
> > cannot be opened (./NSE: error while loading shared libraries:
> > libpetsc.so: cannot open shared object file: No such file or directory)
> >
> > PATH and LD_LIBRARY_PATH are set exactly as for petsc 3.3-p7 through my
> > .bashrc file, and libpetsc.so is in the right location.
> >
> > If I switch to 3.3-p7 (just by pointing to the other directory where
> > petsc 3.3-p7 is installed) it works fine.
>
> Perhaps you can use PETSc makefiles - so you don't have to rely on
> LD_LIBRARY_PATH?
> [or use -Wl,-rpath,/scratch/home0/pmatteo/research/lib_src/petsc/arch-linux2-c-debug/lib]
>
> You can always do 'ldd NSE' to see if the required shared libraries are
> found or not
>
> Satish

--
Matteo

From parsani.matteo at gmail.com Wed May 15 07:33:59 2013
From: parsani.matteo at gmail.com (Matteo Parsani)
Date: Wed, 15 May 2013 08:33:59 -0400
Subject: [petsc-users] petsc 3.4:

Just an update. I have installed petsc 3.4 without setting PETSC_ARCH, and to my surprise I still get the message

egrep: /scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petsc/arch-linux2-c-debug/include/petscconf.h: No such file or directory

Thus I tried

export PETSC_ARCH=

and then reran make PETSC_DIR=/scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petsc, but I still get the same message.

Lisandro suggested running the test with

make PETSC_DIR=/scratch/home0/pmatteo/research/workspace/codes/ssdc/deps/petsc PETSC_ARCH= test

and this works without printing any message. Thus we think that maybe the test_build target, for the case of testing a prefix install, has to be reviewed. Is that right?

Thank you.

On Wed, May 15, 2013 at 8:09 AM, Matteo Parsani wrote:

> Hello Satish,
> the problem with the PATH and LD_LIBRARY_PATH has been fixed. [...]
From parsani.matteo at gmail.com Wed May 15 07:54:40 2013
From: parsani.matteo at gmail.com (Matteo Parsani)
Date: Wed, 15 May 2013 08:54:40 -0400
Subject: [petsc-users] petsc 3.4:

Hello Satish,
Lisandro has found that in the makefile we have

include ././${PETSC_ARCH}/conf/petscvariables

However, if one uses --prefix, PETSC_ARCH must not be used.

This is a very minor issue, but I think it would be nice to have it fixed.

Thank you in advance.

Best Regards

On Wed, May 15, 2013 at 8:33 AM, Matteo Parsani wrote:

> Just an update.
> I have installed petsc 3.4 without setting PETSC_ARCH, and to my surprise
> I still get the message [...]
--
Matteo

From balay at mcs.anl.gov Wed May 15 08:54:38 2013
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 15 May 2013 08:54:38 -0500 (CDT)
Subject: [petsc-users] petsc 3.4:

On Wed, 15 May 2013, Matteo Parsani wrote:

> Hello Satish,
> Lisandro has found that in the makefile we have
>
> include ././${PETSC_ARCH}/conf/petscvariables

We try to support too many ways of installing petsc [with prefix/without prefix, with a default PETSC_ARCH, without PETSC_DIR set, etc..]. And we need this line for users who forget to set PETSC_ARCH in a non-prefix build [to pick up a default PETSC_ARCH].

> However, if one uses --prefix, PETSC_ARCH must not be used.

yes.

> This is a very minor issue, but I think it would be nice to have it fixed.

If one uses configure with prefix:

>>>>>
balay at petsc^/sandbox/balay/petsc-3.4.0 $ ./configure --prefix=/sandbox/balay/petsc-prefix
<<<<

you get the message:

>>>>>>>>
xxx=========================================================================xxx
 Configure stage complete.
 Now build PETSc libraries with (cmake build):
   make PETSC_DIR=/sandbox/balay/petsc-3.4.0 PETSC_ARCH=arch-linux2-c-debug all
 or (experimental with python):
   PETSC_DIR=/sandbox/balay/petsc-3.4.0 PETSC_ARCH=arch-linux2-c-debug ./config/builder.py
xxx=========================================================================xxx
<<<<<<<<<

Now if you run make as indicated above, you get the message:

>>>>>>>>
=========================================
Now to install the libraries do:
make PETSC_DIR=/sandbox/balay/petsc-3.4.0 PETSC_ARCH=arch-linux2-c-debug install
=========================================
<<<<<<<

And if that 'make install' command is run, the message printed is:

====================================
Install complete. It is useable with PETSC_DIR=/sandbox/balay/petsc-prefix [and no more PETSC_ARCH].
Now to check if the libraries are working do (in current directory):
make PETSC_DIR=/sandbox/balay/petsc-prefix test
====================================

Perhaps this message is ambiguous?

Satish
From dalcinl at gmail.com Wed May 15 09:00:06 2013
From: dalcinl at gmail.com (Lisandro Dalcin)
Date: Wed, 15 May 2013 17:00:06 +0300
Subject: [petsc-users] petsc 3.4:

On 15 May 2013 16:54, Satish Balay wrote:

> we try to support too many ways of installing petsc [with prefix/
> without prefix, with a default PETSC_ARCH, without PETSC_DIR set etc..]
> And we need this line for users who forget to set PETSC_ARCH in a
> non-prefix build. [to pick up a default PETSC_ARCH]

Satish, I think all that is needed is to fix the test_build target a little. Basically, test whether the file under PETSC_ARCH exists; otherwise use PETSC_DIR/include/petscconf.h. Perhaps I'm missing something, but this should be close to working.

--
Lisandro Dalcin
---------------
CIMEC (INTEC/CONICET-UNL)
Predio CONICET-Santa Fe
Colectora RN 168 Km 472, Paraje El Pozo
3000 Santa Fe, Argentina
Tel: +54-342-4511594 (ext 1011)
Tel/Fax: +54-342-4511169

From balay at mcs.anl.gov Wed May 15 09:12:25 2013
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 15 May 2013 09:12:25 -0500 (CDT)
Subject: [petsc-users] petsc 3.4:

On Wed, 15 May 2013, Lisandro Dalcin wrote:

> Satish, I think all that is needed is to fix the test_build target a
> little. Basically, test whether the file under PETSC_ARCH exists;
> otherwise use PETSC_DIR/include/petscconf.h.

We've avoided using gnu make extensions - so there is no usage of 'if' directives in petsc makefiles [we've used the include directive as an alternative]. But there is a plan to switch over to gnumake - in which case all of this code can perhaps be reworked.

And I'm able to reproduce the problem. The issue is not the above line - but the following [from the 'test' target]:

>>>>>>
	@if [ "${FC}" != "" ]; then \
	   egrep "^#define PETSC_USE_FORTRAN_DATATYPES 1" ${PETSC_DIR}/${PETSC_ARCH}/include/petscconf.h | tee .ftn-dtype.log > /dev/null; \
	   if test -s .ftn-dtype.log; then F90TEST="testex5f90t"; else F90TEST="testex5f"; fi; ${RM} .ftn-dtype.log; \
	   cd src/snes/examples/tutorials >/dev/null; ${OMAKE} PETSC_ARCH=${PETSC_ARCH} PETSC_DIR=${PETSC_DIR} $${F90TEST}; \
	fi;
<<<<<<

balay at petsc^/sandbox/balay/petsc-3.4.0 $ make PETSC_ARCH=arch-linux2-c-debug PETSC_DIR=/sandbox/balay/petsc-prefix test
Running test examples to verify correct installation
Using PETSC_DIR=/sandbox/balay/petsc-prefix and PETSC_ARCH=arch-linux2-c-debug
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes
egrep: /sandbox/balay/petsc-prefix/arch-linux2-c-debug/include/petscconf.h: No such file or directory
Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process
Completed test examples
balay at petsc^/sandbox/balay/petsc-3.4.0 $

Perhaps a fix is possible.
[by somehow figuring out that this is a prefix build - and flagging an error if PETSC_ARCH is detected - in the 'test' target]

Satish

From dalcinl at gmail.com Wed May 15 09:16:10 2013
From: dalcinl at gmail.com (Lisandro Dalcin)
Date: Wed, 15 May 2013 17:16:10 +0300
Subject: [petsc-users] petsc 3.4:

On 15 May 2013 17:12, Satish Balay wrote:

> We've avoided using gnu make extensions - so there is no usage of 'if'
> directives in petsc makefiles [we've used the include directive as an
> alternative]

I was not clear enough. I'm talking about a shell "if", something like

if [ -f ${PETSC_DIR}/${PETSC_ARCH}/include/petscconf.h ]; then \
  egrep ...
else
  egrep ...
fi

--
Lisandro Dalcin
---------------
CIMEC (INTEC/CONICET-UNL)
Predio CONICET-Santa Fe
Colectora RN 168 Km 472, Paraje El Pozo
3000 Santa Fe, Argentina
Tel: +54-342-4511594 (ext 1011)
Tel/Fax: +54-342-4511169

From balay at mcs.anl.gov Wed May 15 09:32:34 2013
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 15 May 2013 09:32:34 -0500 (CDT)
Subject: [petsc-users] petsc 3.4:

On Wed, 15 May 2013, Lisandro Dalcin wrote:

> I was not clear enough. I'm talking about a shell "if", something like
>
> if [ -f ${PETSC_DIR}/${PETSC_ARCH}/include/petscconf.h ]; then \
>   egrep ...
> else
>   egrep ...
> fi

Sure - I was hoping for a: 'if [ "${PREFIX}" = "" ]' - but currently configure doesn't appear to set such a thing. We could add that - or just use the if statement on petscconf.h as you suggest.
Satish

-----

diff --git a/makefile b/makefile
index 372e593..9bc5cfb 100644
--- a/makefile
+++ b/makefile
@@ -138,7 +138,11 @@ test_build:
 	@cd src/snes/examples/tutorials >/dev/null; ${OMAKE} PETSC_ARCH=${PETSC_ARCH} PETSC_DIR=${PETSC_DIR} clean
 	@cd src/snes/examples/tutorials >/dev/null; ${OMAKE} PETSC_ARCH=${PETSC_ARCH} PETSC_DIR=${PETSC_DIR} testex19
 	@if [ "${FC}" != "" ]; then \
-	   egrep "^#define PETSC_USE_FORTRAN_DATATYPES 1" ${PETSC_DIR}/${PETSC_ARCH}/include/petscconf.h | tee .ftn-dtype.log > /dev/null; \
+	   if [ -f ${PETSC_DIR}/${PETSC_ARCH}/include/petscconf.h ]; then \
+	     egrep "^#define PETSC_USE_FORTRAN_DATATYPES 1" ${PETSC_DIR}/${PETSC_ARCH}/include/petscconf.h | tee .ftn-dtype.log > /dev/null; \
+	   else \
+	     egrep "^#define PETSC_USE_FORTRAN_DATATYPES 1" ${PETSC_DIR}/include/petscconf.h | tee .ftn-dtype.log > /dev/null; \
+	   fi; \
 	   if test -s .ftn-dtype.log; then F90TEST="testex5f90t"; else F90TEST="testex5f"; fi; ${RM} .ftn-dtype.log; \
 	   cd src/snes/examples/tutorials >/dev/null; ${OMAKE} PETSC_ARCH=${PETSC_ARCH} PETSC_DIR=${PETSC_DIR} $${F90TEST}; \
 	fi;

From stali at geology.wisc.edu Wed May 15 09:39:52 2013
From: stali at geology.wisc.edu (Tabrez Ali)
Date: Wed, 15 May 2013 09:39:52 -0500
Subject: [petsc-users] VecStrideScatter question
Message-ID: <51939E38.6020302@geology.wisc.edu>

Hello

I have two parallel vectors (same layout) of different lengths, e.g.,

V1=[u1 v1 w1 u2 v2 w2 u3 v3 w3]' and
V2=[x1 x2 x3]'

and I wish to add them in a way such that

V3=[
u1
v1
w1+x1
u2
v2
w2+x2
u3
v3
w3+x3
]

Is VecStrideScatter appropriate for this? I tried

call VecStrideScatter(V2,3,V1,Add_Values,ierr)

but it seems to fail.

Thanks in advance
Tabrez

--
No one trusts a model except the one who wrote it; Everyone trusts an observation except the one who made it- Harlow Shapley

From jedbrown at mcs.anl.gov Wed May 15 09:46:56 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Wed, 15 May 2013 09:46:56 -0500
Subject: [petsc-users] VecStrideScatter question
In-Reply-To: <51939E38.6020302@geology.wisc.edu>
References: <51939E38.6020302@geology.wisc.edu>
Message-ID: <87fvxo1esf.fsf@mcs.anl.gov>

Tabrez Ali writes:

> Is VecStrideScatter appropriate for this? I tried
>
> call VecStrideScatter(V2,3,V1,Add_Values,ierr)
>
> but it seems to fail.

"seems to fail" is not helpful, but in any case, the indexing starts at 0 (so pass 2 instead of 3).

From parsani.matteo at gmail.com Wed May 15 10:03:18 2013
From: parsani.matteo at gmail.com (Matteo Parsani)
Date: Wed, 15 May 2013 11:03:18 -0400
Subject: [petsc-users] petsc 3.4:

Thanks Satish and Lisandro. It is not a problem at all, because PETSc is usable, but such an if would avoid that message.

Regarding the compilation, installation, and test instructions printed on screen, maybe there too it would be nice to have some if statements.

Thank you again.
On Wed, May 15, 2013 at 10:32 AM, Satish Balay wrote:

> Sure - I was hoping for a: 'if [ "${PREFIX}" = "" ]' - but currently
> configure doesn't appear to set such a thing. We could add that - or just
> use the if statement on petscconf.h as you suggest. [...]

--
Matteo

From pengxwang at hotmail.com Wed May 15 10:19:21 2013
From: pengxwang at hotmail.com (Roc Wang)
Date: Wed, 15 May 2013 10:19:21 -0500
Subject: [petsc-users] local Vec to global Vec and global 3-D array

Hello,

1. I am trying to save the solution of a PDE over a 3-D domain. I used a DM to manage the matrix and vector. The solution vector was converted to local 3-D arrays on each process successfully, but there were errors when the functions DMLocalToGlobalBegin() and DMLocalToGlobalEnd() were called.

I built the code by following the example in /petsc-3.3-p6/src/dm/examples/tutorials/ex3.c.
The code for this is as follows:

/* Get the solution vec */
ierr = KSPSolve(ksp,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr);
ierr = KSPGetSolution(ksp,&x);CHKERRQ(ierr);

/* local 3d arrays */
ierr = DMDAVecGetArray(da, x, &localArray3d);CHKERRQ(ierr);

/* global 3d array */
ierr = DMCreateGlobalVector(da,&gsol3d);CHKERRQ(ierr);
ierr = DMLocalToGlobalBegin(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr);
ierr = DMLocalToGlobalEnd(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr);

2. I am trying to visualize the solution in the 3-D domain using software such as Tecplot. To my understanding, I need to output the solution in the format of a 3-D array. The vectors of the KSP solution in PETSc are local vectors on each process, so they have to be assembled into a global vector and then converted to a global 3-D array. I am not sure whether my approach is a good way to do this, and whether there are functions in PETSc to output the global solution directly?

Part of the error information when DMLocalToGlobalBegin() and DMLocalToGlobalEnd() were called:

[2]PETSC ERROR: --------------------- Error Message ------------------------------------
[2]PETSC ERROR: Nonconforming object sizes!
[2]PETSC ERROR: Vector wrong size 30 for scatter 60 (scatter forward and vector from != ctx from size)!
[3]PETSC ERROR: --------------------- Error Message ------------------------------------
[3]PETSC ERROR: Nonconforming object sizes!
[3]PETSC ERROR: Vector wrong size 20 for scatter 45 (scatter forward and vector from != ctx from size)!
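[Editorial sketch] For reference, a minimal sketch of the usual DMDA workflow this code seems to be aiming for (Barry's reply further down explains why the snippet above fails). It assumes a scalar 3-D da attached to the ksp with KSPSetDM(); the function name and output file are illustrative.

#include <petscdmda.h>
#include <petscksp.h>

/* The KSP solution is a *global* vector of this da, so it can be viewed
   directly; a local (ghosted) vector is only needed for stencil-style
   array access. */
PetscErrorCode SaveSolution(DM da, KSP ksp)
{
  PetscErrorCode ierr;
  Vec            x, xlocal;
  PetscScalar    ***a;
  PetscViewer    viewer;

  ierr = KSPGetSolution(ksp, &x);CHKERRQ(ierr);

  /* Ghosted copy for local 3-D array access, if needed */
  ierr = DMGetLocalVector(da, &xlocal);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(da, x, INSERT_VALUES, xlocal);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(da, x, INSERT_VALUES, xlocal);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, xlocal, &a);CHKERRQ(ierr);
  /* ... use a[k][j][i] over the local (ghosted) index range ... */
  ierr = DMDAVecRestoreArray(da, xlocal, &a);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(da, &xlocal);CHKERRQ(ierr);

  /* Write the global vector; VecView on DMDA vectors uses natural ordering */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "solution.bin", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = VecView(x, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  return 0;
}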
-- No one trusts a model except the one who wrote it; Everyone trusts an observation except the one who made it- Harlow Shapley From balay at mcs.anl.gov Wed May 15 10:29:38 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 15 May 2013 10:29:38 -0500 (CDT) Subject: [petsc-users] petsc 3.4: In-Reply-To: References: Message-ID: On Wed, 15 May 2013, Matteo Parsani wrote: > Thanks Satish and Lisandro. It not a problem at all because PETSc is usable > but such an if would avoid that message. > > Regarding the compilation, installation and test instruction printed a > screen, maybe also there it would be nice to have some if statements. Could you elaborate what you mean by this? What instructions would you like to see? thanks, Satish From parsani.matteo at gmail.com Wed May 15 10:34:00 2013 From: parsani.matteo at gmail.com (Matteo Parsani) Date: Wed, 15 May 2013 11:34:00 -0400 Subject: [petsc-users] petsc 3.4: In-Reply-To: References: Message-ID: I mean if one is installing with --prefix, then PETSC_ARCH is not needed right? Thus getting at screen this type of instructions make PETSC_DIR=/scratch/home0/pmatteo/research/lib_src/petsc-3.3- p7 PETSC_ARCH=arch-linux2-c-debug install can be maybe misleading (I am referring to PETSC_ARCH=arch-linux2-c-debug) On Wed, May 15, 2013 at 11:29 AM, Satish Balay wrote: > On Wed, 15 May 2013, Matteo Parsani wrote: > > > Thanks Satish and Lisandro. It not a problem at all because PETSc is > usable > > but such an if would avoid that message. > > > > Regarding the compilation, installation and test instruction printed a > > screen, maybe also there it would be nice to have some if statements. > > Could you elaborate what you mean by this? What instructions would you > like to see? > > thanks, > Satish > > -- Matteo -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Wed May 15 10:41:14 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 15 May 2013 10:41:14 -0500 (CDT) Subject: [petsc-users] petsc 3.4: In-Reply-To: References: Message-ID: On Wed, 15 May 2013, Matteo Parsani wrote: > I mean if one is installing with --prefix, then PETSC_ARCH is not needed > right? > > Thus getting at screen this type of instructions > > make PETSC_DIR=/scratch/home0/pmatteo/research/lib_src/petsc-3.3- > p7 PETSC_ARCH=arch-linux2-c-debug install > > can be maybe misleading (I am referring to PETSC_ARCH=arch-linux2-c-debug) PETSC_ARCH is used for the build process - until 'make install' - but not after that [i.e once petsc is installed - user codes don't use PETSC_ARCH]. Hence the instructions printed are as such. Satish From parsani.matteo at gmail.com Wed May 15 10:50:50 2013 From: parsani.matteo at gmail.com (Matteo Parsani) Date: Wed, 15 May 2013 11:50:50 -0400 Subject: [petsc-users] petsc 3.4: In-Reply-To: References: Message-ID: Okay thank you. On Wed, May 15, 2013 at 11:41 AM, Satish Balay wrote: > On Wed, 15 May 2013, Matteo Parsani wrote: > > > I mean if one is installing with --prefix, then PETSC_ARCH is not needed > > right? > > > > Thus getting at screen this type of instructions > > > > make PETSC_DIR=/scratch/home0/pmatteo/research/lib_src/petsc-3.3- > > p7 PETSC_ARCH=arch-linux2-c-debug install > > > > can be maybe misleading (I am referring to > PETSC_ARCH=arch-linux2-c-debug) > > PETSC_ARCH is used for the build process - until 'make install' - but > not after that [i.e once petsc is installed - user codes don't use > PETSC_ARCH]. Hence the instructions printed are as such. 
> > Satish > > > -- Matteo -------------- next part -------------- An HTML attachment was scrubbed... URL: From jefonseca at gmail.com Wed May 15 11:14:26 2013 From: jefonseca at gmail.com (Jim Fonseca) Date: Wed, 15 May 2013 12:14:26 -0400 Subject: [petsc-users] libMesh and SLEPc for PETSc 3.4.0 Message-ID: Hi, What versions (newest release, dev?) of libMesh and SLEPc are compatible with PETSc 3.4.0? Thanks Jim -- Jim Fonseca, PhD Research Scientist Network for Computational Nanotechnology Purdue University 765-496-6495 www.jimfonseca.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Wed May 15 11:35:08 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 15 May 2013 11:35:08 -0500 Subject: [petsc-users] VecStrideScatter question In-Reply-To: <5193A918.6010003@geology.wisc.edu> References: <51939E38.6020302@geology.wisc.edu> <87fvxo1esf.fsf@mcs.anl.gov> <5193A918.6010003@geology.wisc.edu> Message-ID: <010894A8-82D1-4765-AE47-419019E0F3DD@mcs.anl.gov> On May 15, 2013, at 10:26 AM, Tabrez Ali wrote: > The error message (with stride of 2) is > > [1]PETSC ERROR: --------------------- Error Message ------------------------------------ > [1]PETSC ERROR: Argument out of range! > [1]PETSC ERROR: Start of stride subvector (2) is too large for stride > Have you set the vector blocksize (1) correctly with VecSetBlockSize()?! > [1]PETSC ERROR: ------------------------------------------------------------------------ > > Do I have to use VecSetBlockSize as done in http://www.mcs.anl.gov/petsc/petsc-current/src/vec/vec/examples/tutorials/ex12.c.html > > Actually my V1 (12 entries) and V2 (3 entries) are like > > V1=[u1 v1 w1 u2 v2 w2 u3 v3 w3 a b c]' and > > V2=[ x1 x2 x3 ]' what I eventually want is > > V1=[u1 v1 w1+x1 u2 v2 w2+x2 u3 v3 w3+x3 a b c] > > Would the [a b c] at the end of V1 cause problems? Yes, you'd need to have an extra entry at the end of V2 that contained a zero. Barry > > Tabrez > > > On 05/15/2013 09:46 AM, Jed Brown wrote: >> Tabrez Ali writes: >> >>> Hello >>> >>> I have two parallel vectors (same layout) of different lengths, e.g., >>> >>> V1=[u1 v1 w1 u2 v2 w2 u3 v3 w3]' and >>> V2=[x1 x2 x3]' >>> >>> and I wish to add them in a way such that >>> >>> V3=[ >>> u1 >>> v1 >>> w1+x1 >>> u2 >>> v2 >>> w2+x2 >>> u3 >>> v3 >>> w3+x3 >>> ] >>> >>> Is VecStrideScatter appropriate for this? I tried >>> >>> call VecStrideScatter(V2,3,V1,Add_Values,ierr) >>> >>> but it seems to fail. >> "seems to fail" is not helpful, but in any case, the indexing starts at >> 0 (so pass 2 instead of 3). > > > -- > No one trusts a model except the one who wrote it; Everyone trusts an observation except the one who made it- Harlow Shapley > From bsmith at mcs.anl.gov Wed May 15 11:40:03 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 15 May 2013 11:40:03 -0500 Subject: [petsc-users] local Vec to global Vec and global 3-D array In-Reply-To: References: Message-ID: <166403D2-9130-4429-87EA-E845F4291961@mcs.anl.gov> On May 15, 2013, at 10:19 AM, Roc Wang wrote: > Hello, > > 1. I am trying to save the solution of a PDE for a 3-D geometry domain. I used DM to manage the matrix and vector. > The solution vector was converted to local 3D arrays on each process successfully. But there was errors when the functions DMLocalToGlobalBegin() and DMLocalToGlobalEnd() were called. > > I built the code by following the example in /petsc-3.3-p6/src/dm/examples/tutorials/ex3.c. 
The codes for this are as followings : > > /*Get the solution vec */ > ierr = KSPSolve(ksp,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr); > ierr = KSPGetSolution(ksp,&x);CHKERRQ(ierr); > > /* local 3d arrays */ > ierr = DMDAVecGetArray(da, x, &localArray3d ); CHKERRQ(ierr); > > /* global 3d array */ > ierr = DMCreateGlobalVector(da,&gsol3d);CHKERRQ(ierr); > ierr = DMLocalToGlobalBegin(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); > ierr = DMLocalToGlobalEnd(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); The x here must not have been obtained from this da. You need to have obtained the x somewhere before with DMCreateLocalVector(da,&x) or with a VecDuplicate from such a vector. Likely the vector put into the KSP is not from this da. > > > 2. I am trying to visualize the solution in a 3-D domain by using some software such as Tecplot. To my understand, I need to output the solution in the format of 3-D array. The vectors of KSP solution in PETSc are local vectors on each process. So they have to be assembled to a global vector and then converted to a 3-D global array. I am not sure if my approach is a good way and if there is some functions in PETSc to output the global solutions directly? Yes. VecView() on DMDA vectors automatically reorders the entries on the file to use the natural ordering. There are variety of possible viewers you can use including binary, ASCII (no good for anything but tiny problems), VTK, HDF5 Barry > > Part of the error information when DMLocalToGlobalBegin() and DMLocalToGlobalEnd() were called > > [0]PETSC ERROR: [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message ------------------------------------ > [3]PETSC ERROR: --------------------- Error Message ------------------------------------ > [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ > Nonconforming object sizes! > Nonconforming object sizes! > [2]PETSC ERROR: [3]PETSC ERROR: Vector wrong size 30 for scatter 60 (scatter forward and vector from != ctx from size)! > Vector wrong size 20 for scatter 45 (scatter forward and vector from != ctx from size)! From jedbrown at mcs.anl.gov Wed May 15 11:52:29 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 15 May 2013 11:52:29 -0500 Subject: [petsc-users] libMesh and SLEPc for PETSc 3.4.0 In-Reply-To: References: Message-ID: <874ne418z6.fsf@mcs.anl.gov> Jim Fonseca writes: > Hi, > What versions (newest release, dev?) of libMesh and SLEPc are compatible > with PETSc 3.4.0? libMesh 'master' (as of just now) and slepc-dev work with petsc-3.4.0. From jedbrown at mcs.anl.gov Wed May 15 11:59:28 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 15 May 2013 11:59:28 -0500 Subject: [petsc-users] VecStrideScatter question In-Reply-To: <5193A918.6010003@geology.wisc.edu> References: <51939E38.6020302@geology.wisc.edu> <87fvxo1esf.fsf@mcs.anl.gov> <5193A918.6010003@geology.wisc.edu> Message-ID: <87y5bgyya7.fsf@mcs.anl.gov> Tabrez Ali writes: > The error message (with stride of 2) is > > [1]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [1]PETSC ERROR: Argument out of range! > [1]PETSC ERROR: Start of stride subvector (2) is too large for stride > Have you set the vector blocksize (1) correctly with VecSetBlockSize()?! 
> [1]PETSC ERROR: > ------------------------------------------------------------------------ > > Do I have to use VecSetBlockSize as done in > http://www.mcs.anl.gov/petsc/petsc-current/src/vec/vec/examples/tutorials/ex12.c.html Yes, you must set the block size to use the VecStride functions. > Actually my V1 (12 entries) and V2 (3 entries) are like > > V1=[u1 v1 w1 u2 v2 w2 u3 v3 w3 a b c]' and > > V2=[ x1 x2 x3 ]' what I eventually want is > > V1=[u1 v1 w1+x1 u2 v2 w2+x2 u3 v3 w3+x3 a b c] > > Would the [a b c] at the end of V1 cause problems? Yeah, the dimensions have to match exactly. Just pad it however you need. From bsmith at mcs.anl.gov Wed May 15 14:09:42 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 15 May 2013 14:09:42 -0500 Subject: [petsc-users] local Vec to global Vec and global 3-D array In-Reply-To: References: , <166403D2-9130-4429-87EA-E845F4291961@mcs.anl.gov> Message-ID: On May 15, 2013, at 12:41 PM, Roc Wang wrote: > Thanks, please take a look my further questions. > > > Subject: Re: [petsc-users] local Vec to global Vec and global 3-D array > > From: bsmith at mcs.anl.gov > > Date: Wed, 15 May 2013 11:40:03 -0500 > > CC: petsc-users at mcs.anl.gov > > To: pengxwang at hotmail.com > > > > > > On May 15, 2013, at 10:19 AM, Roc Wang wrote: > > > > > Hello, > > > > > > 1. I am trying to save the solution of a PDE for a 3-D geometry domain. I used DM to manage the matrix and vector. > > > The solution vector was converted to local 3D arrays on each process successfully. But there was errors when the functions DMLocalToGlobalBegin() and DMLocalToGlobalEnd() were called. > > > > > > I built the code by following the example in /petsc-3.3-p6/src/dm/examples/tutorials/ex3.c. The codes for this are as followings : > > > > > > /*Get the solution vec */ > > > ierr = KSPSolve(ksp,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr); > > > ierr = KSPGetSolution(ksp,&x);CHKERRQ(ierr); > > > > > > /* local 3d arrays */ > > > ierr = DMDAVecGetArray(da, x, &localArray3d ); CHKERRQ(ierr); > > > > > > /* global 3d array */ > > > ierr = DMCreateGlobalVector(da,&gsol3d);CHKERRQ(ierr); > > > ierr = DMLocalToGlobalBegin(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); > > > ierr = DMLocalToGlobalEnd(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); > > > > The x here must not have been obtained from this da. You need to have obtained the x somewhere before with DMCreateLocalVector(da,&x) or with a VecDuplicate from such a vector. Likely the vector put into the KSP is not from this da. > > !*************************************** > The procedure of creating x is like this: > > call KSPCreate() to build ksp; > call DMDACreate3d() to build da; > ierr = KSPSetDM(ksp,da); // associate da with ksp; > > Then after ksp was solved: > > ierr = KSPSolve(ksp,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr); > ierr = KSPGetSolution(ksp,&x);CHKERRQ(ierr); > ierr = DMDAVecGetArray(da, x, &localArray3d ); CHKERRQ(ierr); > > Here, da was associated with the ksp by calling ierr = KSPSetDM(ksp,da). > The x here was obtained by calling the KSPGetSolution(ksp,&x), before this I didn't call DMCreateLocalVector(da,&x). It seems Vec x is associated with da since there is no errors when DMDAVecGetArray(da, x, &localArray3d ) was called. The x from KSPGetSolution() IS a global vector, it is not a global vector. You cannot do a local to global from the x global vector. > Do you mean I should call DMCreateLocalVector(da,&x) explicitly before ierr = KSPGetSolution(ksp,&x)? > !*************************************** > > > > > > > > > > 2. 
I am trying to visualize the solution in a 3-D domain with software such as Tecplot. To my understanding, I need to output the solution in the format of a 3-D array. The vectors of the KSP solution in PETSc are local vectors on each process, so they have to be assembled into a global vector and then converted to a 3-D global array. I am not sure whether my approach is a good way and whether there are functions in PETSc to output the global solution directly.
> > > > Yes. VecView() on DMDA vectors automatically reorders the entries on the file to use the natural ordering. There are a variety of possible viewers you can use including binary, ASCII (no good for anything but tiny problems), VTK, HDF5
> > !***************************************
> VecView only outputs the vector itself. How do I output the indices of the nodes in 3 dimensions (i,j,k) for each element of the vector, such as:
>
> i j k x
> 0 0 0 1.0
> 0 0 1 2.0
> ...
> m n p 10.0
> !***************************************

If you want some ASCII format like that, then write a STAND ALONE sequential program that reads from the binary file with VecLoad() and then outputs the format you want. Never, never, ever try to do a parallel output of data in this kind of format; it will be slow and hard to write, and there is no reason to write it.
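A minimal sketch of such a post-processing program (untested; assumes the solution was saved with VecView() through a binary viewer, that you run this on ONE process, and that M, N, P below are placeholders for your global grid dimensions):

#include <petscvec.h>
#include <stdio.h>

int main(int argc,char **argv)
{
  Vec                x;
  PetscViewer        viewer;
  const PetscScalar *a;
  PetscInt           i,j,k,M = 10,N = 10,P = 10;  /* placeholders: your global sizes */

  PetscInitialize(&argc,&argv,0,0);
  /* "sol.bin" is a made-up name; use whatever you passed to PetscViewerBinaryOpen() */
  PetscViewerBinaryOpen(PETSC_COMM_SELF,"sol.bin",FILE_MODE_READ,&viewer);
  VecCreate(PETSC_COMM_SELF,&x);
  VecLoad(x,viewer);
  VecGetArrayRead(x,&a);
  /* VecView() wrote the DMDA vector in natural ordering: i fastest, then j, then k */
  for (k=0; k<P; k++)
    for (j=0; j<N; j++)
      for (i=0; i<M; i++)
        printf("%d %d %d %g\n",(int)i,(int)j,(int)k,(double)PetscRealPart(a[(k*N+j)*M+i]));
  VecRestoreArrayRead(x,&a);
  VecDestroy(&x);
  PetscViewerDestroy(&viewer);
  PetscFinalize();
  return 0;
}

(Error checking with ierr/CHKERRQ dropped to keep the sketch short.)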
The codes for this are as followings : > > > > > > > > /*Get the solution vec */ > > > > ierr = KSPSolve(ksp,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr); > > > > ierr = KSPGetSolution(ksp,&x);CHKERRQ(ierr); > > > > > > > > /* local 3d arrays */ > > > > ierr = DMDAVecGetArray(da, x, &localArray3d ); CHKERRQ(ierr); > > > > > > > > /* global 3d array */ > > > > ierr = DMCreateGlobalVector(da,&gsol3d);CHKERRQ(ierr); > > > > ierr = DMLocalToGlobalBegin(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); > > > > ierr = DMLocalToGlobalEnd(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); > > > > > > The x here must not have been obtained from this da. You need to have obtained the x somewhere before with DMCreateLocalVector(da,&x) or with a VecDuplicate from such a vector. Likely the vector put into the KSP is not from this da. > > > > !*************************************** > > The procedure of creating x is like this: > > > > call KSPCreate() to build ksp; > > call DMDACreate3d() to build da; > > ierr = KSPSetDM(ksp,da); // associate da with ksp; > > > > Then after ksp was solved: > > > > ierr = KSPSolve(ksp,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr); > > ierr = KSPGetSolution(ksp,&x);CHKERRQ(ierr); > > ierr = DMDAVecGetArray(da, x, &localArray3d ); CHKERRQ(ierr); > > > > Here, da was associated with the ksp by calling ierr = KSPSetDM(ksp,da). > > The x here was obtained by calling the KSPGetSolution(ksp,&x), before this I didn't call DMCreateLocalVector(da,&x). It seems Vec x is associated with da since there is no errors when DMDAVecGetArray(da, x, &localArray3d ) was called. > > The x from KSPGetSolution() IS a global vector, it is not a global vector. You cannot do a local to global from the x global vector. !******************************* The x from KSPGetSolution() IS a global vector, it is not a "local" vector? (It should be a typo :) ) So Can I convert the global x to a 3d array directly? Thank. > > > Do you mean I should call DMCreateLocalVector(da,&x) explicitly before ierr = KSPGetSolution(ksp,&x)? > > !*************************************** > > > > > > > > > > > > > > 2. I am trying to visualize the solution in a 3-D domain by using some software such as Tecplot. To my understand, I need to output the solution in the format of 3-D array. The vectors of KSP solution in PETSc are local vectors on each process. So they have to be assembled to a global vector and then converted to a 3-D global array. I am not sure if my approach is a good way and if there is some functions in PETSc to output the global solutions directly? > > > > > > Yes. VecView() on DMDA vectors automatically reorders the entries on the file to use the natural ordering. There are variety of possible viewers you can use including binary, ASCII (no good for anything but tiny problems), VTK, HDF5 > > > > !*************************************** > > VecView only outputs the vector itself. How to output the indexes of nodes in 3 dimemsions (i,j,k) for each element of the vector, such as: > > > > i j k x > > 0 0 0 1.0 > > 0 0 1 2.0 > > ... > > m n p 10.0 > > !*************************************** > > If you want some ASCII format like that, then write a STAND ALONE sequential program that reads from the binary file with VecLoad() and then outputs the format you want. Never Never ever try to do a parallel output of data in this kind of format, it will be slow and hard to write and there is no reason to write it. 
> > Barry > > > > > > > Barry > > > > > > > > > > > Part of the error information when DMLocalToGlobalBegin() and DMLocalToGlobalEnd() were called > > > > > > > > [0]PETSC ERROR: [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message ------------------------------------ > > > > [3]PETSC ERROR: --------------------- Error Message ------------------------------------ > > > > [2]PETSC ERROR: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ > > > > Nonconforming object sizes! > > > > Nonconforming object sizes! > > > > [2]PETSC ERROR: [3]PETSC ERROR: Vector wrong size 30 for scatter 60 (scatter forward and vector from != ctx from size)! > > > > Vector wrong size 20 for scatter 45 (scatter forward and vector from != ctx from size)! > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Wed May 15 15:30:07 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 15 May 2013 15:30:07 -0500 Subject: [petsc-users] local Vec to global Vec and global 3-D array In-Reply-To: References: , <166403D2-9130-4429-87EA-E845F4291961@mcs.anl.gov> , Message-ID: <81952131-8FFC-490A-A2AD-CF2E32964764@mcs.anl.gov> On May 15, 2013, at 2:25 PM, Roc Wang wrote: > Thanks, I will write the stand alone code for it. One more question about the global and local vector below. > > > > > > ierr = DMCreateGlobalVector(da,&gsol3d);CHKERRQ(ierr); > > > > > ierr = DMLocalToGlobalBegin(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); > > > > > ierr = DMLocalToGlobalEnd(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); > > > > In the above you are trying to scatter a local vector x to a global vector gsol3d. But x is a global vector. > > !******************************* > The x from KSPGetSolution() IS a global vector, it is not a "local" vector? (It should be a typo :) ) It is a global vector. > So Can I convert the global x to a 3d array directly? Thank. You can convert either a global vector or a local vector to 3d array access BUT the converted global vector has NO ghost values accessible while the converted local vector has ghost points available (after the call to DMGlobalToLocalBegin/End()). > From mike.hui.zhang at hotmail.com Wed May 15 16:32:12 2013 From: mike.hui.zhang at hotmail.com (Hui Zhang) Date: Wed, 15 May 2013 23:32:12 +0200 Subject: [petsc-users] KSPPlugin Message-ID: Hello, I just saw KSPPlugin on bitbucket. Very nice feature. When will it be available to us? Thanks! From jedbrown at mcs.anl.gov Wed May 15 16:36:54 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 15 May 2013 16:36:54 -0500 Subject: [petsc-users] KSPPlugin In-Reply-To: References: Message-ID: <87r4h8x6vd.fsf@mcs.anl.gov> Hui Zhang writes: > Hello, > > I just saw KSPPlugin on bitbucket. Very nice feature. When will it be > available to us? I'm changing it to be easier/more transparent to use, but also less clear what is a plugin. There is a balance between having precise control over the scope of a plugin and having the plugin Just Work without needing to explicitly activate it. If you have comments about your preferences, I'd like to hear them. From ztdepyahoo at 163.com Thu May 16 10:01:40 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Thu, 16 May 2013 23:01:40 +0800 (CST) Subject: [petsc-users] why the user manual does not include the instruction for DMMesh and adda. how to lean it. Message-ID: <77ebb1c7.1297f.13eaddc5c3a.Coremail.ztdepyahoo@163.com> -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jedbrown at mcs.anl.gov Thu May 16 10:09:30 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 16 May 2013 10:09:30 -0500 Subject: [petsc-users] why the user manual does not include the instruction for DMMesh and adda. how to lean it. In-Reply-To: <77ebb1c7.1297f.13eaddc5c3a.Coremail.ztdepyahoo@163.com> References: <77ebb1c7.1297f.13eaddc5c3a.Coremail.ztdepyahoo@163.com> Message-ID: <87bo8bvu51.fsf@mcs.anl.gov> DMMesh is deprecated and will be removed in the next release. Please use its replacement, DMPlex, which is documented. ADDA is pretty much unsupported. (The code was a student project, and generally not up to the standards of the rest of the library.) Can you describe the problem that you are considering using it for? From bsmith at mcs.anl.gov Thu May 16 11:33:45 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 16 May 2013 11:33:45 -0500 Subject: [petsc-users] local Vec to global Vec and global 3-D array In-Reply-To: References: , <166403D2-9130-4429-87EA-E845F4291961@mcs.anl.gov> , , <81952131-8FFC-490A-A2AD-CF2E32964764@mcs.anl.gov> Message-ID: All processes have a "subblock" of the entire array; with the local Vec the subblock has the ghost values, with the global Vec it does not have the ghost points. No process has the entire array. BUT the indexing INTO the array on each process is using the GLOBAL I,J,K indices so in some sense all processes have the entire array "virtually" but they can ONLY access their chunk, if they try to access outside of their chunk it will fail. Barry On May 16, 2013, at 10:34 AM, Roc Wang wrote: > Thanks. One more question. Sorry, but I am really confused here. > >> Subject: Re: [petsc-users] local Vec to global Vec and global 3-D array >> From: bsmith at mcs.anl.gov >> Date: Wed, 15 May 2013 15:30:07 -0500 >> CC: petsc-users at mcs.anl.gov >> To: pengxwang at hotmail.com >> >> >> On May 15, 2013, at 2:25 PM, Roc Wang wrote: >> >>> Thanks, I will write the stand alone code for it. One more question about the global and local vector below. >>> >>>>>>> ierr = DMCreateGlobalVector(da,&gsol3d);CHKERRQ(ierr); >>>>>>> ierr = DMLocalToGlobalBegin(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); >>>>>>> ierr = DMLocalToGlobalEnd(da,x,INSERT_VALUES,gsol3d);CHKERRQ(ierr); >>>>>> >> >> In the above you are trying to scatter a local vector x to a global vector gsol3d. But x is a global vector. >> >>> >>> !******************************* >>> The x from KSPGetSolution() IS a global vector, it is not a "local" vector? (It should be a typo :) ) >> >> It is a global vector. >> >>> So Can I convert the global x to a 3d array directly? Thank. >> >> You can convert either a global vector or a local vector to 3d array access BUT the converted global vector has NO ghost values accessible while the converted local vector has ghost points available (after the call to DMGlobalToLocalBegin/End()). >> > if x is a global vector with da, then after the following procedure, > > PetscScalar ***globalArray3d; > ierr = DMDAVecGetArray(da, x, &globalArray3d ); CHKERRQ(ierr); > > All processes have the same globalArray3d with size of global or local? For example, if M,N,P is the global dimension in each direction of the 3-d array, then the size of globalArray3d is [0,M-1], [0, N-1] and [0,P-1] in each process? 
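To make that concrete: each process gets only its owned corner range from DMDAGetCorners(), and loops over that range with global indices (untested sketch, following the standard DMDA examples):

PetscScalar ***a;
PetscInt    i,j,k,xs,ys,zs,xm,ym,zm;

ierr = DMDAGetCorners(da,&xs,&ys,&zs,&xm,&ym,&zm);CHKERRQ(ierr);
ierr = DMDAVecGetArray(da,x,&a);CHKERRQ(ierr);
for (k=zs; k<zs+zm; k++)
  for (j=ys; j<ys+ym; j++)
    for (i=xs; i<xs+xm; i++)
      a[k][j][i] = 1.0;   /* GLOBAL indices, but only over the locally owned chunk */
ierr = DMDAVecRestoreArray(da,x,&a);CHKERRQ(ierr);

So the valid range of a on each process is [xs,xs+xm) x [ys,ys+ym) x [zs,zs+zm) (plus ghost points when x is a local vector), not the full [0,M) x [0,N) x [0,P).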
From shchen at www.phys.lsu.edu Thu May 16 11:52:31 2013
From: shchen at www.phys.lsu.edu (Shaohao Chen)
Date: Thu, 16 May 2013 11:52:31 -0500
Subject: [petsc-users] errors when repeatedly using MatSetValues and MatAssembly in a loop
Message-ID: <20130516164756.M88599@physics.lsu.edu>

Dear writers and users of PETSc,

I received the errors attached below. It seems that I over-allocated memory, so the system killed my job. I repeatedly used MatSetValues and MatAssembly in a big loop, and the error message appears during this loop. The structure of my code is as follows. Could you please give me some hints on where I could be over-allocating memory? Thanks!

Structure of my code:

Mat A;
MatCreate(A);
MatSetSizes(A);
MatSetUp(A);
MatSetValues(A); // set initial values
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
MatSetOption(A,MAT_NEW_NONZERO_LOCATIONS,PETSC_FALSE); // fix nonzero structure for all uses below
------ begin loop -----
...
MatSetValues(A); // update values of some parts of the matrix
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); // the error message appears here, after tens of steps of the loop
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
KSPSolve(A,...);
...
------ end loop -----

==== attached errors =====

[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: --------------------------------------------------------------------- --- [0]PETSC ERROR: /home/shaohao/program-basis/ais/ais-tdse on a arch-linux2-c-debug named qb002 by shaohao Thu May 16 10:55:09 2013 [0]PETSC ERROR: Libraries linked from /usr/local/packages/petsc/3.4.0/intel-11.1-mvapich- 1.1/lib [0]PETSC ERROR: Configure run at Tue May 14 14:20:15 2013 [0]PETSC ERROR: Configure options --prefix=/usr/local/packages/petsc/3.4.0/intel-11.1- mvapich-1.1 --with-mpi=1 --with-mpi-compilers=1 --with-c-support=1 --with-fortran=1 -- with-c++-support=1 --with-lapack-lib=/usr/local/packages/lapack/3.4.2/intel- 11.1/lib/liblapack.a --with-blas-lib=/usr/local/packages/lapack/3.4.2/intel-11.1/lib/libblas.a -- with-expat=1 --with-expat-dir=/usr [0]PETSC ERROR: --------------------------------------------------------------------- --- [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [0] [MPI Abort by user] Aborting Program! Abort signaled by rank 0: MPI Abort by user Aborting program ! Exit code -3 signaled from qb002 Killing remote processes...MPI process terminated unexpectedly DONE -- Shaohao Chen Department of Physics & Astronomy, Louisiana State University, Baton Rouge, LA From jedbrown at mcs.anl.gov Thu May 16 12:01:25 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 16 May 2013 12:01:25 -0500 Subject: [petsc-users] errors when repeatedly using MatSetValues and MatAssembly in a loop In-Reply-To: <20130516164756.M88599@physics.lsu.edu> References: <20130516164756.M88599@physics.lsu.edu> Message-ID: <87obcavoyi.fsf@mcs.anl.gov> Shaohao Chen writes: > Dear writers and users of PETSc, > > I received errors as attached. It seems that I over allocated memory, so that the system killed my > job. I repeatedly used MatSetValues and MatAssembly in a big loop. This error massage appears > during this loop. The structure of my codes is as following. Could you please give me some hints > where I could over allocate memory? Thanks! > > Structure of my codes: > Mat A; > MatCreate(A); > MatSetSizes(A); > MatSetUp(A); > MatSetValues(A); // set initio values > MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); > MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); > MatSetOption(A,MAT_NEW_NONZERO_LOCATIONS,PETSC_FALSE); // fix nonzero structure for all > uses below > ------ begin loop ----- > ... > MatSetValues(A); // update values of some parts of the matrix > MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); // the error massage appears here, after tens of > steps of the loop. > MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); > > KSPSolve(A,...); > . > ------ end loop ----- > > > ==== attached errors ===== > > [0]PETSC ERROR: --------------------------------------------------------------------- > --- > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access > out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC > ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption > errors > [0]PETSC ERROR: likely location of problem given in stack below > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > [0]PETSC ERROR: INSTEAD the line number of the start of the function > [0]PETSC ERROR: is given. 
> [0]PETSC ERROR: [0] MatStashScatterGetMesg_Private line 617 src/mat/utils/matstash.c > [0]PETSC ERROR: [0] MatAssemblyEnd_MPIAIJ line 673 src/mat/impls/aij/mpi/mpiaij.c > [0]PETSC ERROR: [0] MatAssemblyEnd line 4930 src/mat/interface/matrix.c I suspect other memory corruption. Please try valgrind. If you don't figure it out, send a reduced test case so that we can reproduce. > [0]PETSC ERROR: --------------------- Error Message ----------------------------------- > - > [0]PETSC ERROR: Signal received! > [0]PETSC ERROR: --------------------------------------------------------------------- > --- > [0]PETSC ERROR: Petsc Release Version 3.4.0, May, 13, 2013 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: --------------------------------------------------------------------- > --- > [0]PETSC ERROR: /home/shaohao/program-basis/ais/ais-tdse on a arch-linux2-c-debug named > qb002 by shaohao Thu May 16 10:55:09 2013 > [0]PETSC ERROR: Libraries linked from /usr/local/packages/petsc/3.4.0/intel-11.1-mvapich- > 1.1/lib > [0]PETSC ERROR: Configure run at Tue May 14 14:20:15 2013 > [0]PETSC ERROR: Configure options --prefix=/usr/local/packages/petsc/3.4.0/intel-11.1- > mvapich-1.1 --with-mpi=1 --with-mpi-compilers=1 --with-c-support=1 --with-fortran=1 -- > with-c++-support=1 --with-lapack-lib=/usr/local/packages/lapack/3.4.2/intel- > 11.1/lib/liblapack.a --with-blas-lib=/usr/local/packages/lapack/3.4.2/intel-11.1/lib/libblas.a -- > with-expat=1 --with-expat-dir=/usr > [0]PETSC ERROR: --------------------------------------------------------------------- > --- > [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file > [0] [MPI Abort by user] Aborting Program! > Abort signaled by rank 0: MPI Abort by user Aborting program ! > Exit code -3 signaled from qb002 > Killing remote processes...MPI process terminated unexpectedly > DONE > > > -- > Shaohao Chen > Department of Physics & Astronomy, > Louisiana State University, > Baton Rouge, LA From ztdepyahoo at 163.com Thu May 16 21:20:15 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Fri, 17 May 2013 10:20:15 +0800 (CST) Subject: [petsc-users] how to set the output format of vecview Message-ID: <555cde3a.441a.13eb049a106.Coremail.ztdepyahoo@163.com> i want to set the output of VecView(x,PETSC_VIEWER_STDOUT_WORLD) in a scientific format. how to get it? -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Thu May 16 21:35:19 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 16 May 2013 21:35:19 -0500 Subject: [petsc-users] how to set the output format of vecview In-Reply-To: <555cde3a.441a.13eb049a106.Coremail.ztdepyahoo@163.com> References: <555cde3a.441a.13eb049a106.Coremail.ztdepyahoo@163.com> Message-ID: <87bo8as594.fsf@mcs.anl.gov> ??? writes: > i want to set the output of VecView(x,PETSC_VIEWER_STDOUT_WORLD) in a > scientific format. how to get it? You can set your own viewer. The ASCII format should only be used for human inspection, not for communicating with another program. Use a binary viewer if you need to be more precise. 
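A minimal sketch of that (untested; the file name is made up):

PetscViewer viewer;
ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"x.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
ierr = VecView(x,viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

The file can be read back with VecLoad() or with the readers shipped in bin/matlab.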
From ztdepyahoo at 163.com Thu May 16 21:45:49 2013
From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=)
Date: Fri, 17 May 2013 10:45:49 +0800 (CST)
Subject: Re: [petsc-users] how to set the output format of vecview
In-Reply-To: <87bo8as594.fsf@mcs.anl.gov>
References: <555cde3a.441a.13eb049a106.Coremail.ztdepyahoo@163.com> <87bo8as594.fsf@mcs.anl.gov>
Message-ID: <11554cb4.5153.13eb0610814.Coremail.ztdepyahoo@163.com>

Thank you very much! Could you please tell me which examples demonstrate the binary viewer?

At 2013-05-17 10:35:19, "Jed Brown" wrote:
>丁老师 writes:
>
>> i want to set the output of VecView(x,PETSC_VIEWER_STDOUT_WORLD) in a
>> scientific format. how to get it?
>
>You can set your own viewer. The ASCII format should only be used for
>human inspection, not for communicating with another program. Use a
>binary viewer if you need to be more precise.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jedbrown at mcs.anl.gov Thu May 16 21:57:18 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Thu, 16 May 2013 21:57:18 -0500
Subject: Re: [petsc-users] how to set the output format of vecview
In-Reply-To: <11554cb4.5153.13eb0610814.Coremail.ztdepyahoo@163.com>
References: <555cde3a.441a.13eb049a106.Coremail.ztdepyahoo@163.com> <87bo8as594.fsf@mcs.anl.gov> <11554cb4.5153.13eb0610814.Coremail.ztdepyahoo@163.com>
Message-ID: <878v3es48h.fsf@mcs.anl.gov>

丁老师 writes:
> Thank you very much! Could you please tell me which examples demonstrate the binary viewer?

http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerBinaryOpen.html

Or automatic, e.g., src/ksp/ksp/examples/tutorials/ex2.c (or any other example that solves something)

./ex2 -ksp_view_mat binary:my-file-name -ksp_view_rhs binary:my-file-name::append

Then load it up:

$ octave -q
octave:1> [A,b] = PetscBinaryRead('my-file-name');
octave:2> [size(A); size(b)]
ans =
   56   56
   56    1

From choi240 at purdue.edu Fri May 17 06:38:25 2013
From: choi240 at purdue.edu (Joon hee Choi)
Date: Fri, 17 May 2013 07:38:25 -0400 (EDT)
Subject: [petsc-users] 3-dimension to matrix
In-Reply-To: <1798107332.132436.1368787768610.JavaMail.root@mailhub028.itcs.purdue.edu>
Message-ID: <354560210.132456.1368790705307.JavaMail.root@mailhub028.itcs.purdue.edu>

Hi all,

I am trying to set up a matrix with 3d coordinates (x,y,z) and value. If X=max(x), Y=max(y) and Z=max(z), then the matrix is as follows: A = [ A1 A2 ...
AZ ] (each A* is X-by-Y block matrix and A > consists of Z block matrices) Are these decoupled diagonal blocks or a sequence of blocks in the row. That is, is the full size of the matrix (X, Y*Z) or (X*Z, Y*Z). From the code below, it looks like you intend for it to be the former. What parallel distribution do you intend for the input vector (of dimension Y*Z) and the output vector (of dimension X)? > When I tried to set up the matrix using MatSetValues with each > coordinates and value, I could get the matrix, but it took 10 > minutes. This is almost certainly due to not preallocating. > So I tried to set up the matrix again using > MatSetLocalToGlobalMapping(Mat x, ISLocalToGlobalMapping rmapping, > ISLocalToGlobalMapping cmapping), but ISLocalToGloalMapping was minus > because my dataset is too big (X:26M, Y:26M, Z:48M, nonzero:144M) and > I got errors. Is there the way to set up the matrix quickly in petsc? > If so, please let me know the way. And my code using > MatSetLocalToGlobalMapping following: > > > ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, &zr, PETSC_OWN_POINTER, &irow); This has only one entry. > ISLocalToGlobalMappingUnBlock(irow, X, &orow); How did you set the block size of the matrix? > > for (z=0; z { > ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, &z, PETSC_OWN_POINTER, &icol); This mapping also has only one entry. It is not intended for you to change the local-to-global mapping on a matrix. > MatSetLocalToGlobalMappingBlock(A, irow, icol); > ISLocalToGlobalMappingUnBlock(icol, Y, &ocol); > MatSetLocalToGlobalMapping(A, orow, ocol); > > while(!mapX[z].empty()) > { > x = mapX[z].front(); > y = mapY[z].front(); So instead of changing it, just add the offset to y here. > val = mapVal[z].front(); > MatSetValuesLocal(A, 1, &x, 1, &y, &val, INSERT_VALUES); > mapX[z].pop_front(); > mapY[z].pop_front(); > mapVal[z].pop_front(); > } > } > > Thank you, > > Joon From choi240 at purdue.edu Fri May 17 14:18:59 2013 From: choi240 at purdue.edu (Joon hee Choi) Date: Fri, 17 May 2013 15:18:59 -0400 (EDT) Subject: [petsc-users] 3-dimension to matrix In-Reply-To: <8761yhss4x.fsf@mcs.anl.gov> Message-ID: <1074323751.133085.1368818339224.JavaMail.root@mailhub028.itcs.purdue.edu> Hi Jed Brown, Thank you for your fast reply. Your last comments looks like my first code. The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as matrix type and (X, Y) as block size. Also, I implemented SeqAIJpreallocating with nnz. Nevertheless, it was very slow. The Matrix-Set-Up part of the first code is as follows: sort(tups.begin(), tups.end()); MatCreate(PETSC_COMM_SELF, &A); MatSetType(A, MATSEQAIJ); MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z); MatSetBlockSizes(A, X, Y); MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz); sz = tups.size(); for (i=0; i(tups[i]); y = std::tr1::get<2>(tups[i]) + std::tr1::get<1>(tups[i])*Y; val = std::tr1::get<3>(tups[i]); MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES); } MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); I used tuple (x, z, y, value) and vector of c++. I didn't get any errors from this code. However, it took about 9 minutes in this part. Please let me know what to change my code. Thank you. Joon ----- Original Message ----- From: "Jed Brown" To: "Joon hee Choi" , petsc-users at mcs.anl.gov Sent: Friday, May 17, 2013 8:33:18 AM Subject: Re: [petsc-users] 3-dimension to matrix Joon hee Choi writes: > Hi all, > > I am trying to set up a matrix with 3d coordinates(x,y,z) and > value. 
If X=max(x), Y=max(y) and Z=max(z), then the matrix is as > follows: A = [ A1 A2 ... AZ ] (each A* is X-by-Y block matrix and A > consists of Z block matrices) Are these decoupled diagonal blocks or a sequence of blocks in the row. That is, is the full size of the matrix (X, Y*Z) or (X*Z, Y*Z). From the code below, it looks like you intend for it to be the former. What parallel distribution do you intend for the input vector (of dimension Y*Z) and the output vector (of dimension X)? > When I tried to set up the matrix using MatSetValues with each > coordinates and value, I could get the matrix, but it took 10 > minutes. This is almost certainly due to not preallocating. > So I tried to set up the matrix again using > MatSetLocalToGlobalMapping(Mat x, ISLocalToGlobalMapping rmapping, > ISLocalToGlobalMapping cmapping), but ISLocalToGloalMapping was minus > because my dataset is too big (X:26M, Y:26M, Z:48M, nonzero:144M) and > I got errors. Is there the way to set up the matrix quickly in petsc? > If so, please let me know the way. And my code using > MatSetLocalToGlobalMapping following: > > > ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, &zr, PETSC_OWN_POINTER, &irow); This has only one entry. > ISLocalToGlobalMappingUnBlock(irow, X, &orow); How did you set the block size of the matrix? > > for (z=0; z { > ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, &z, PETSC_OWN_POINTER, &icol); This mapping also has only one entry. It is not intended for you to change the local-to-global mapping on a matrix. > MatSetLocalToGlobalMappingBlock(A, irow, icol); > ISLocalToGlobalMappingUnBlock(icol, Y, &ocol); > MatSetLocalToGlobalMapping(A, orow, ocol); > > while(!mapX[z].empty()) > { > x = mapX[z].front(); > y = mapY[z].front(); So instead of changing it, just add the offset to y here. > val = mapVal[z].front(); > MatSetValuesLocal(A, 1, &x, 1, &y, &val, INSERT_VALUES); > mapX[z].pop_front(); > mapY[z].pop_front(); > mapVal[z].pop_front(); > } > } > > Thank you, > > Joon From jedbrown at mcs.anl.gov Fri May 17 14:24:09 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 14:24:09 -0500 Subject: [petsc-users] 3-dimension to matrix In-Reply-To: <1074323751.133085.1368818339224.JavaMail.root@mailhub028.itcs.purdue.edu> References: <1074323751.133085.1368818339224.JavaMail.root@mailhub028.itcs.purdue.edu> Message-ID: <87zjvtpfza.fsf@mcs.anl.gov> Joon hee Choi writes: > Thank you for your fast reply. Your last comments looks like my first > code. The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as > matrix type and (X, Y) as block size. Also, I implemented > SeqAIJpreallocating with nnz. Nevertheless, it was very slow. The > Matrix-Set-Up part of the first code is as follows: > > sort(tups.begin(), tups.end()); > MatCreate(PETSC_COMM_SELF, &A); > MatSetType(A, MATSEQAIJ); > MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z); > MatSetBlockSizes(A, X, Y); > MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz); > > sz = tups.size(); > for (i=0; i x = std::tr1::get<0>(tups[i]); > y = std::tr1::get<2>(tups[i]) + std::tr1::get<1>(tups[i])*Y; > val = std::tr1::get<3>(tups[i]); > MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES); > } > MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); > MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); > > > I used tuple (x, z, y, value) and vector of c++. I didn't get any > errors from this code. However, it took about 9 minutes in this > part. What version of PETSc? Your preallocation was almost certainly not sufficient. 
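The usual fix is to count the entries per row in one pass over the data and hand the exact counts to the matrix. An untested sketch, reusing the tuple layout from your code (std::tr1::get<0> is the 0-based row index):

PetscInt *nnz;
ierr = PetscMalloc(X*sizeof(PetscInt),&nnz);CHKERRQ(ierr);
ierr = PetscMemzero(nnz,X*sizeof(PetscInt));CHKERRQ(ierr);        /* byte count */
for (i=0; i<sz; i++) nnz[std::tr1::get<0>(tups[i])]++;            /* one pass over the data */
ierr = MatCreateSeqAIJ(PETSC_COMM_SELF,X,Y*Z,0,nnz,&A);CHKERRQ(ierr);
ierr = MatSetOption(A,MAT_NEW_NONZERO_ALLOCATION_ERR,PETSC_TRUE);CHKERRQ(ierr); /* fail loudly instead of mallocing */
ierr = PetscFree(nnz);CHKERRQ(ierr);                              /* the counts are copied at creation */

then the MatSetValues()/MatAssembly loop exactly as before.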
From mrosso at uci.edu Fri May 17 14:41:17 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 17 May 2013 12:41:17 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid Message-ID: <519687DD.4050209@uci.edu> Hi, I am successfully using PETSc (v3.3) in parallel to solve the Poisson equation in 3D . The discretization is done by using finite difference on a uniform structured grid. So far I used the conjugate gradient method, but I would like to give a try to multigrid. The documentation describes multigrid as a preconditioner only, thus I would like to know if it is possible to use multigrid as a solver and, if so, if you could give my some tips to start. I am using the DMDA context to define matrix and vectors and my code is written Fortran. Thank you, Michele -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 17 14:45:31 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 14:45:31 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <519687DD.4050209@uci.edu> References: <519687DD.4050209@uci.edu> Message-ID: <87r4h5pezo.fsf@mcs.anl.gov> Michele Rosso writes: > Hi, > > I am successfully using PETSc (v3.3) in parallel to solve the Poisson > equation in 3D . Please upgrade to petsc-3.4 when you get a chance. > The discretization is done by using finite difference on a uniform > structured grid. So far I used the conjugate gradient method, but I > would like to give a try to multigrid. The documentation describes > multigrid as a preconditioner only, thus I would like to know if it is > possible to use multigrid as a solver -ksp_type richardson will not accelerate your multigrid. Krylov with preconditioning is almost never slower, and a lot more robust. > and, if so, if you could give my some tips to start. -pc_type mg -pc_mg_levels 3 -pc_type gamg -pc_type ml -pc_type hypre From choi240 at purdue.edu Fri May 17 15:40:57 2013 From: choi240 at purdue.edu (Joon hee Choi) Date: Fri, 17 May 2013 16:40:57 -0400 (EDT) Subject: [petsc-users] 3-dimension to matrix In-Reply-To: <87zjvtpfza.fsf@mcs.anl.gov> Message-ID: <307977034.133238.1368823257261.JavaMail.root@mailhub028.itcs.purdue.edu> My petsc version is 3.3.5. And I made nnz using the number of each row. When I wrote the matrix using MatSetValues, I wrote from low row and column to high row and column. However, when I changed the order writing the matrix, it took much more time (up to 3hrs). So I am concerned that the slowness is because the big size of matrix (2.6*10^7, 1.248*10^15) is related to reading from or writing to the cache, ram, or hard. Anyway, the following code is the part that I read the data from a file and set up tuples and nnz: FILE *fp = fopen("data.txt", "r"); while (fscanf(fp, "%d %d %d %d", &x, &y, &z, &v) == 4) { tups.push_back(std::tr1::make_tuple (x-1, z-1, y-1, v)); nzrow[i-1] += 1; if (x > X) X = x; if (y > Y) Y = y; if (z > Z) Z = z; } fclose(fp); PetscMalloc(X*sizeof(PetscInt), &nnz); memset(nnz, 0, X); for (itnz=nzrow.begin(); itnz!=nzrow.end(); ++itnz) { nnz[itnz->first] = itnz->second; } sort(tups.begin(), tups.end()); If my code is wrong, then please let me know. Thank you, Joon ----- Original Message ----- From: "Jed Brown" To: "Joon hee Choi" Cc: petsc-users at mcs.anl.gov Sent: Friday, May 17, 2013 3:24:09 PM Subject: Re: [petsc-users] 3-dimension to matrix Joon hee Choi writes: > Thank you for your fast reply. Your last comments looks like my first > code. 
The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as > matrix type and (X, Y) as block size. Also, I implemented > SeqAIJpreallocating with nnz. Nevertheless, it was very slow. The > Matrix-Set-Up part of the first code is as follows: > > sort(tups.begin(), tups.end()); > MatCreate(PETSC_COMM_SELF, &A); > MatSetType(A, MATSEQAIJ); > MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z); > MatSetBlockSizes(A, X, Y); > MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz); > > sz = tups.size(); > for (i=0; i x = std::tr1::get<0>(tups[i]); > y = std::tr1::get<2>(tups[i]) + std::tr1::get<1>(tups[i])*Y; > val = std::tr1::get<3>(tups[i]); > MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES); > } > MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); > MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); > > > I used tuple (x, z, y, value) and vector of c++. I didn't get any > errors from this code. However, it took about 9 minutes in this > part. What version of PETSc? Your preallocation was almost certainly not sufficient. From jedbrown at mcs.anl.gov Fri May 17 15:48:28 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 15:48:28 -0500 Subject: [petsc-users] 3-dimension to matrix In-Reply-To: <307977034.133238.1368823257261.JavaMail.root@mailhub028.itcs.purdue.edu> References: <307977034.133238.1368823257261.JavaMail.root@mailhub028.itcs.purdue.edu> Message-ID: <87d2sppc2r.fsf@mcs.anl.gov> Joon hee Choi writes: > My petsc version is 3.3.5. > And I made nnz using the number of each row. When I wrote the matrix > using MatSetValues, I wrote from low row and column to high row and > column. However, when I changed the order writing the matrix, it took > much more time (up to 3hrs). So I am concerned that the slowness is > because the big size of matrix (2.6*10^7, 1.248*10^15) is related to > reading from or writing to the cache, ram, or hard. What are you going to do with a matrix of that dimension? You can't apply it to a vector because you can't store the vector. How many entries does it have? > Anyway, the following code is the part that I read the data from a > file and set up tuples and nnz: > > FILE *fp = fopen("data.txt", "r"); > while (fscanf(fp, "%d %d %d %d", &x, &y, &z, &v) == 4) > { > tups.push_back(std::tr1::make_tuple (x-1, z-1, y-1, v)); > nzrow[i-1] += 1; > if (x > X) X = x; > if (y > Y) Y = y; > if (z > Z) Z = z; > } > fclose(fp); > PetscMalloc(X*sizeof(PetscInt), &nnz); > memset(nnz, 0, X); > for (itnz=nzrow.begin(); itnz!=nzrow.end(); ++itnz) { > nnz[itnz->first] = itnz->second; What is this trying to do? nnz will just be the column of the last point to be processed... > } > sort(tups.begin(), tups.end()); > > If my code is wrong, then please let me know. > > Thank you, > Joon > > ----- Original Message ----- > From: "Jed Brown" > To: "Joon hee Choi" > Cc: petsc-users at mcs.anl.gov > Sent: Friday, May 17, 2013 3:24:09 PM > Subject: Re: [petsc-users] 3-dimension to matrix > > Joon hee Choi writes: > >> Thank you for your fast reply. Your last comments looks like my first >> code. The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as >> matrix type and (X, Y) as block size. Also, I implemented >> SeqAIJpreallocating with nnz. Nevertheless, it was very slow. 
The >> Matrix-Set-Up part of the first code is as follows: >> >> sort(tups.begin(), tups.end()); >> MatCreate(PETSC_COMM_SELF, &A); >> MatSetType(A, MATSEQAIJ); >> MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z); >> MatSetBlockSizes(A, X, Y); >> MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz); >> >> sz = tups.size(); >> for (i=0; i> x = std::tr1::get<0>(tups[i]); >> y = std::tr1::get<2>(tups[i]) + std::tr1::get<1>(tups[i])*Y; >> val = std::tr1::get<3>(tups[i]); >> MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES); >> } >> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); >> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); >> >> >> I used tuple (x, z, y, value) and vector of c++. I didn't get any >> errors from this code. However, it took about 9 minutes in this >> part. > > What version of PETSc? Your preallocation was almost certainly not > sufficient. From knepley at gmail.com Fri May 17 15:50:41 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 17 May 2013 15:50:41 -0500 Subject: [petsc-users] 3-dimension to matrix In-Reply-To: <307977034.133238.1368823257261.JavaMail.root@mailhub028.itcs.purdue.edu> References: <87zjvtpfza.fsf@mcs.anl.gov> <307977034.133238.1368823257261.JavaMail.root@mailhub028.itcs.purdue.edu> Message-ID: On Fri, May 17, 2013 at 3:40 PM, Joon hee Choi wrote: > My petsc version is 3.3.5. > And I made nnz using the number of each row. When I wrote the matrix using > MatSetValues, I wrote from low row and column to high row and column. > However, when I changed the order writing the matrix, it took much more > time (up to 3hrs). So I am concerned that the slowness is because the big > size of matrix (2.6*10^7, 1.248*10^15) is related to reading from or > writing to the cache, ram, or hard. > No, it is bad preallocation. This should be simple to fix. First, turn on errors MatSetOption(MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_TRUE) and second run with -info. Thanks, Matt > Anyway, the following code is the part that I read the data from a file > and set up tuples and nnz: > > FILE *fp = fopen("data.txt", "r"); > while (fscanf(fp, "%d %d %d %d", &x, &y, &z, &v) == 4) > { > tups.push_back(std::tr1::make_tuple (x-1, z-1, y-1, v)); > nzrow[i-1] += 1; > if (x > X) X = x; > if (y > Y) Y = y; > if (z > Z) Z = z; > } > fclose(fp); > PetscMalloc(X*sizeof(PetscInt), &nnz); > memset(nnz, 0, X); > for (itnz=nzrow.begin(); itnz!=nzrow.end(); ++itnz) { > nnz[itnz->first] = itnz->second; > } > sort(tups.begin(), tups.end()); > > If my code is wrong, then please let me know. > > Thank you, > Joon > > ----- Original Message ----- > From: "Jed Brown" > To: "Joon hee Choi" > Cc: petsc-users at mcs.anl.gov > Sent: Friday, May 17, 2013 3:24:09 PM > Subject: Re: [petsc-users] 3-dimension to matrix > > Joon hee Choi writes: > > > Thank you for your fast reply. Your last comments looks like my first > > code. The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as > > matrix type and (X, Y) as block size. Also, I implemented > > SeqAIJpreallocating with nnz. Nevertheless, it was very slow. 
The > > Matrix-Set-Up part of the first code is as follows: > > > > sort(tups.begin(), tups.end()); > > MatCreate(PETSC_COMM_SELF, &A); > > MatSetType(A, MATSEQAIJ); > > MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z); > > MatSetBlockSizes(A, X, Y); > > MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz); > > > > sz = tups.size(); > > for (i=0; i > x = std::tr1::get<0>(tups[i]); > > y = std::tr1::get<2>(tups[i]) + > std::tr1::get<1>(tups[i])*Y; > > val = std::tr1::get<3>(tups[i]); > > MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES); > > } > > MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); > > MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); > > > > > > I used tuple (x, z, y, value) and vector of c++. I didn't get any > > errors from this code. However, it took about 9 minutes in this > > part. > > What version of PETSc? Your preallocation was almost certainly not > sufficient. > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From peter.lichtner at gmail.com Fri May 17 15:59:13 2013 From: peter.lichtner at gmail.com (Peter Lichtner) Date: Fri, 17 May 2013 14:59:13 -0600 Subject: [petsc-users] 3-dimension to matrix In-Reply-To: References: <87zjvtpfza.fsf@mcs.anl.gov> <307977034.133238.1368823257261.JavaMail.root@mailhub028.itcs.purdue.edu> Message-ID: <28B72B78-A3A5-4634-9269-9C3099F55FE3@gmail.com> I found a similar problem solving Laplace's equation: using MatCreate took forever, whereas using MatCreateAIJ instead the time for MatSetValues was essentially negligible. I set up MatCreate with a call to MatSetSizes. ...Peter On May 17, 2013, at 2:50 PM, Matthew Knepley wrote: > On Fri, May 17, 2013 at 3:40 PM, Joon hee Choi wrote: > My petsc version is 3.3.5. > And I made nnz using the number of each row. When I wrote the matrix using MatSetValues, I wrote from low row and column to high row and column. However, when I changed the order writing the matrix, it took much more time (up to 3hrs). So I am concerned that the slowness is because the big size of matrix (2.6*10^7, 1.248*10^15) is related to reading from or writing to the cache, ram, or hard. > > No, it is bad preallocation. This should be simple to fix. First, turn on errors > > MatSetOption(MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_TRUE) > > and second run with -info. > > Thanks, > > Matt > > Anyway, the following code is the part that I read the data from a file and set up tuples and nnz: > > FILE *fp = fopen("data.txt", "r"); > while (fscanf(fp, "%d %d %d %d", &x, &y, &z, &v) == 4) > { > tups.push_back(std::tr1::make_tuple (x-1, z-1, y-1, v)); > nzrow[i-1] += 1; > if (x > X) X = x; > if (y > Y) Y = y; > if (z > Z) Z = z; > } > fclose(fp); > PetscMalloc(X*sizeof(PetscInt), &nnz); > memset(nnz, 0, X); > for (itnz=nzrow.begin(); itnz!=nzrow.end(); ++itnz) { > nnz[itnz->first] = itnz->second; > } > sort(tups.begin(), tups.end()); > > If my code is wrong, then please let me know. > > Thank you, > Joon > > ----- Original Message ----- > From: "Jed Brown" > To: "Joon hee Choi" > Cc: petsc-users at mcs.anl.gov > Sent: Friday, May 17, 2013 3:24:09 PM > Subject: Re: [petsc-users] 3-dimension to matrix > > Joon hee Choi writes: > > > Thank you for your fast reply. Your last comments looks like my first > > code. The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as > > matrix type and (X, Y) as block size. 
> > Also, I implemented
> > SeqAIJ preallocation with nnz. Nevertheless, it was very slow. The
> > Matrix-Set-Up part of the first code is as follows:
> >
> > sort(tups.begin(), tups.end());
> > MatCreate(PETSC_COMM_SELF, &A);
> > MatSetType(A, MATSEQAIJ);
> > MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z);
> > MatSetBlockSizes(A, X, Y);
> > MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz);
> >
> > sz = tups.size();
> > for (i=0; i<sz; i++) {
> >     x = std::tr1::get<0>(tups[i]);
> >     y = std::tr1::get<2>(tups[i]) + std::tr1::get<1>(tups[i])*Y;
> >     val = std::tr1::get<3>(tups[i]);
> >     MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES);
> > }
> > MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> > MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
> >
> > I used tuple (x, z, y, value) and vector of C++. I didn't get any
> > errors from this code. However, it took about 9 minutes in this
> > part.
>
> What version of PETSc? Your preallocation was almost certainly not
> sufficient.

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

________________
Peter Lichtner
Santa Fe, NM 87507
(505) 692-4029 (c)
OFM Research/LANL Guest Scientist
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jedbrown at mcs.anl.gov Fri May 17 16:05:12 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Fri, 17 May 2013 16:05:12 -0500
Subject: [petsc-users] 3-dimension to matrix
In-Reply-To: <28B72B78-A3A5-4634-9269-9C3099F55FE3@gmail.com>
References: <87zjvtpfza.fsf@mcs.anl.gov> <307977034.133238.1368823257261.JavaMail.root@mailhub028.itcs.purdue.edu> <28B72B78-A3A5-4634-9269-9C3099F55FE3@gmail.com>
Message-ID: <871u95pbav.fsf@mcs.anl.gov>

Peter Lichtner writes:

> I found a similar problem solving Laplace's equation: using MatCreate
> took forever, whereas using MatCreateAIJ instead the time for
> MatSetValues was essentially negligible. I set up MatCreate with a
> call to MatSetSizes.

This is almost certainly caused by bad preallocation information, or from that information not being used. The complete implementation of MatCreateAIJ is:

PetscErrorCode MatCreateAIJ(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt o_nnz[],Mat *A)
{
  PetscErrorCode ierr;
  PetscMPIInt    size;

  PetscFunctionBegin;
  ierr = MatCreate(comm,A);CHKERRQ(ierr);
  ierr = MatSetSizes(*A,m,n,M,N);CHKERRQ(ierr);
  ierr = MPI_Comm_size(comm,&size);CHKERRQ(ierr);
  if (size > 1) {
    ierr = MatSetType(*A,MATMPIAIJ);CHKERRQ(ierr);
    ierr = MatMPIAIJSetPreallocation(*A,d_nz,d_nnz,o_nz,o_nnz);CHKERRQ(ierr);
  } else {
    ierr = MatSetType(*A,MATSEQAIJ);CHKERRQ(ierr);
    ierr = MatSeqAIJSetPreallocation(*A,d_nz,d_nnz);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

From mrosso at uci.edu Fri May 17 16:11:12 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Fri, 17 May 2013 14:11:12 -0700
Subject: [petsc-users] Solving Poisson equation with multigrid
In-Reply-To: <87r4h5pezo.fsf@mcs.anl.gov>
References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov>
Message-ID: <51969CF0.4030200@uci.edu>

Thank you.
I will ask to update to 3.4 but I am working on a supercomputer thus I do not have control on the installed software.
So you suggest to use conjugate gradient + multigrid as preconditioner, correct?
If so, I retain

-ksp_type cg

instead of

-ksp_type richardson

correct?
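For reference, the combination being settled on here, a Krylov method outside with multigrid as the preconditioner, can be written as an options file in the same style as the ones later in this thread. This is only a sketch; the level count of 3 is the example value from this exchange, not a recommendation.

# CG with multigrid as the preconditioner
-ksp_type cg
-pc_type mg
-pc_mg_levels 3
# algebraic alternatives that need no grid hierarchy:
# -pc_type gamg
# -pc_type ml
# -pc_type hypre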
Michele On 05/17/2013 12:45 PM, Jed Brown wrote: > Michele Rosso writes: > >> Hi, >> >> I am successfully using PETSc (v3.3) in parallel to solve the Poisson >> equation in 3D . > Please upgrade to petsc-3.4 when you get a chance. > >> The discretization is done by using finite difference on a uniform >> structured grid. So far I used the conjugate gradient method, but I >> would like to give a try to multigrid. The documentation describes >> multigrid as a preconditioner only, thus I would like to know if it is >> possible to use multigrid as a solver > -ksp_type richardson > > will not accelerate your multigrid. Krylov with preconditioning is > almost never slower, and a lot more robust. > >> and, if so, if you could give my some tips to start. > -pc_type mg -pc_mg_levels 3 > > -pc_type gamg > > -pc_type ml > > -pc_type hypre > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 17 16:16:06 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 16:16:06 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <51969CF0.4030200@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> Message-ID: <87sj1lnw89.fsf@mcs.anl.gov> Michele Rosso writes: > Thank you. > I will ask to update to 3.4 but I am working on a supercomputer thus I > do not have control on the installed software. You can install in your $HOME. > So you suggest to use conjugate gradient + multigrid as preconditioner, > correct? Yes > If so, I retain > > -ksp_type cg > > instead of > > -ksp_type richardson > > correct? Yes From mrosso at uci.edu Fri May 17 17:13:44 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 17 May 2013 15:13:44 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <87sj1lnw89.fsf@mcs.anl.gov> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> Message-ID: <5196AB98.2050800@uci.edu> Thank you. I tried to run with -pc_type mg -pc_mg_levels 3 but I got the following error: [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Object is in wrong state! [0]PETSC ERROR: Must call PCMGSetInterpolation() or PCMGSetInterpolation()! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./hit on a named nid06898 by Unknown Fri May 17 17:11:09 2013 [0]PETSC ERROR: Libraries linked from [0]PETSC ERROR: Configure run at [0]PETSC ERROR: Configure options [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: PCMGGetInterpolation() line 182 in src/ksp/pc/impls/mg/mgfunc.c [0]PETSC ERROR: PCSetUp_MG() line 638 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c On 05/17/2013 02:16 PM, Jed Brown wrote: > Michele Rosso writes: > >> Thank you. 
>> I will ask to update to 3.4 but I am working on a supercomputer thus I >> do not have control on the installed software. > You can install in your $HOME. > >> So you suggest to use conjugate gradient + multigrid as preconditioner, >> correct? > Yes > >> If so, I retain >> >> -ksp_type cg >> >> instead of >> >> -ksp_type richardson >> >> correct? > Yes > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 17 17:16:35 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 17:16:35 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <5196AB98.2050800@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> Message-ID: <87a9ntntfg.fsf@mcs.anl.gov> Michele Rosso writes: > Thank you. > I tried to run with > > -pc_type mg -pc_mg_levels 3 Call KSPSetDM(). Look at src/ksp/ksp/examples/tutorials/ex45.c or src/snes/examples/tutorials/ex5.c From mrosso at uci.edu Fri May 17 17:50:09 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 17 May 2013 15:50:09 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <87a9ntntfg.fsf@mcs.anl.gov> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> Message-ID: <5196B421.8070302@uci.edu> I added KSPSetDM() and now the error I receive is: [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: No support for this operation for this object type! [0]PETSC ERROR: For coloring efficiency ensure number of grid points in X is divisible by 2*stencil_width + 1 ! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./hit on a named nid25119 by Unknown Fri May 17 17:41:05 2013 [0]PETSC ERROR: Libraries linked from [0]PETSC ERROR: Configure run at [0]PETSC ERROR: Configure options [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: DMCreateColoring_DA_3d_MPIAIJ() line 288 in src/dm/impls/da/fdda.c [0]PETSC ERROR: DMCreateColoring_DA() line 172 in src/dm/impls/da/fdda.c [0]PETSC ERROR: DMCreateColoring() line 709 in src/dm/interface/dm.c [0]PETSC ERROR: DMComputeJacobian() line 2206 in src/dm/interface/dm.c [0]PETSC ERROR: KSPSetUp() line 228 in src/ksp/ksp/interface/itfunc.c My 3D grid is composed by 256^3 nodes, I am using 4 processors and the the DMDA is initialized as: call DMDACreate3d( PETSC_COMM_WORLD , DMDA_BOUNDARY_PERIODIC , DMDA_BOUNDARY_PERIODIC, & & DMDA_BOUNDARY_PERIODIC , DMDA_STENCIL_STAR, 256 , 256 , 256 , 2 , 2 , 1, 1 , 1 , & & 128 , 128 , 256, da , ierr) Thank you very much for your help. Michele On 05/17/2013 03:16 PM, Jed Brown wrote: > Michele Rosso writes: > >> Thank you. >> I tried to run with >> >> -pc_type mg -pc_mg_levels 3 > Call KSPSetDM(). 
Look at src/ksp/ksp/examples/tutorials/ex45.c or > src/snes/examples/tutorials/ex5.c > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 17 17:53:49 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 17:53:49 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <5196B421.8070302@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> Message-ID: <8738tlnrpe.fsf@mcs.anl.gov> Michele Rosso writes: > I added KSPSetDM() and now the error I receive is: > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: No support for this operation for this object type! > [0]PETSC ERROR: For coloring efficiency ensure number of grid points in > X is divisible > by 2*stencil_width + 1 > ! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 > 11:26:24 CDT 2012 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./hit on a named nid25119 by Unknown Fri May 17 > 17:41:05 2013 > [0]PETSC ERROR: Libraries linked from > [0]PETSC ERROR: Configure run at > [0]PETSC ERROR: Configure options > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: DMCreateColoring_DA_3d_MPIAIJ() line 288 in > src/dm/impls/da/fdda.c > [0]PETSC ERROR: DMCreateColoring_DA() line 172 in src/dm/impls/da/fdda.c > [0]PETSC ERROR: DMCreateColoring() line 709 in src/dm/interface/dm.c > [0]PETSC ERROR: DMComputeJacobian() line 2206 in src/dm/interface/dm.c > [0]PETSC ERROR: KSPSetUp() line 228 in src/ksp/ksp/interface/itfunc.c > > > My 3D grid is composed by 256^3 nodes, I am using 4 processors and the > the DMDA is initialized as: Yes, with a vertex-centered discretization, use 257^3 instead. From mrosso at uci.edu Fri May 17 18:06:46 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 17 May 2013 16:06:46 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <8738tlnrpe.fsf@mcs.anl.gov> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> Message-ID: <5196B806.5020605@uci.edu> So should I always use an odd number of grid points? There is no way around this? Michele On 05/17/2013 03:53 PM, Jed Brown wrote: > Michele Rosso writes: > >> I added KSPSetDM() and now the error I receive is: >> >> [0]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: No support for this operation for this object type! >> [0]PETSC ERROR: For coloring efficiency ensure number of grid points in >> X is divisible >> by 2*stencil_width + 1 >> ! 
>> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 >> 11:26:24 CDT 2012 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./hit on a named nid25119 by Unknown Fri May 17 >> 17:41:05 2013 >> [0]PETSC ERROR: Libraries linked from >> [0]PETSC ERROR: Configure run at >> [0]PETSC ERROR: Configure options >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: DMCreateColoring_DA_3d_MPIAIJ() line 288 in >> src/dm/impls/da/fdda.c >> [0]PETSC ERROR: DMCreateColoring_DA() line 172 in src/dm/impls/da/fdda.c >> [0]PETSC ERROR: DMCreateColoring() line 709 in src/dm/interface/dm.c >> [0]PETSC ERROR: DMComputeJacobian() line 2206 in src/dm/interface/dm.c >> [0]PETSC ERROR: KSPSetUp() line 228 in src/ksp/ksp/interface/itfunc.c >> >> >> My 3D grid is composed by 256^3 nodes, I am using 4 processors and the >> the DMDA is initialized as: > Yes, with a vertex-centered discretization, use 257^3 instead. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 17 18:25:57 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 18:25:57 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <5196B806.5020605@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> Message-ID: <87ppwpmbne.fsf@mcs.anl.gov> Michele Rosso writes: > So should I always use an odd number of grid points? > There is no way around this? If you want to use regular geometric coarsening, then yes. That *is* regular node-centered coarsening. Just consider the base case of one element: o ------- o Split that in two: o -- o -- o Look, an odd number of vertices, and as we keep refining, it will stay odd. You can use AMG or write your own interpolation if you want irregular coarsening. From jedbrown at mcs.anl.gov Fri May 17 19:10:35 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 19:10:35 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <5196C5BE.7060601@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> Message-ID: <87mwrtm9l0.fsf@mcs.anl.gov> Please always use "reply-all" so that your messages go to the list. This is standard mailing list etiquette. It is important to preserve threading for people who find this discussion later and so that we do not waste our time re-answering the same questions that have already been answered in private side-conversations. You'll likely get an answer faster that way too. 
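As a quick check of the vertex arithmetic behind Jed's o -- o -- o picture above, assuming stencil width 1: uniform refinement maps n vertices to 2n-1, so coarsening needs (n-1)/2 + 1 to be a whole number at every level, which sizes of the form 2^k + 1 (such as 257) satisfy and 256 does not. A throwaway sketch in C:

#include <stdio.h>

/* Print the vertex counts a geometric MG hierarchy sees for a
   vertex-centered 1D grid; try n = 256 to watch it stop immediately. */
int main(void)
{
  int n = 257;
  while (n > 3 && (n - 1) % 2 == 0) {
    printf("%d -> ", n);
    n = (n - 1)/2 + 1;   /* halve the elements, keep the shared vertex */
  }
  printf("%d\n", n);
  return 0;
}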
Michele Rosso writes: > If you are referring to -pc_type gamg, I tried it, but I got the same > error message > (For coloring efficiency ensure number of grid points in X is divisible > by 2*stencil_width + 1) The option could not have been used. Always send the ENTIRE error message. Is the code calling KSPSetFromOptions()? Run with -options_left to see if any options did not get used. > On 05/17/2013 04:49 PM, Jed Brown wrote: >> >> Read my first message >> >> On May 17, 2013 6:47 PM, "Michele Rosso" > > wrote: >> >> Ok, I will give a try to AMG then. What is it exactly? >> Thank you! >> >> On 05/17/2013 04:25 PM, Jed Brown wrote: >>> Michele Rosso writes: >>> >>>> So should I always use an odd number of grid points? >>>> There is no way around this? >>> If you want to use regular geometric coarsening, then yes. That *is* >>> regular node-centered coarsening. Just consider the base case of one >>> element: >>> >>> >>> o ------- o >>> >>> Split that in two: >>> >>> o -- o -- o >>> >>> Look, an odd number of vertices, and as we keep refining, it will stay >>> odd. >>> >>> You can use AMG or write your own interpolation if you want irregular coarsening. >>> >> From mrosso at uci.edu Fri May 17 19:24:29 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 17 May 2013 17:24:29 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <87mwrtm9l0.fsf@mcs.anl.gov> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> Message-ID: <5196CA3D.3070001@uci.edu> I run with -pc_type gamg -options_left and I get the error: [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: No support for this operation for this object type! [0]PETSC ERROR: For coloring efficiency ensure number of grid points in X is divisible by 2*stencil_width + 1 ! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./hit on a named nid22318 by Unknown Fri May 17 19:21:25 2013 [0]PETSC ERROR: Libraries linked from [0]PETSC ERROR: Configure run at [0]PETSC ERROR: Configure options [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: DMCreateColoring_DA_3d_MPIAIJ() line 288 in src/dm/impls/da/fdda.c [0]PETSC ERROR: DMCreateColoring_DA() line 172 in src/dm/impls/da/fdda.c [0]PETSC ERROR: DMCreateColoring() line 709 in src/dm/interface/dm.c [0]PETSC ERROR: DMComputeJacobian() line 2206 in src/dm/interface/dm.c [0]PETSC ERROR: KSPSetUp() line 228 in src/ksp/ksp/interface/itfunc.c The code is calling KSPSetFromOptions() On 05/17/2013 05:10 PM, Jed Brown wrote: > Please always use "reply-all" so that your messages go to the list. > This is standard mailing list etiquette. 
It is important to preserve > threading for people who find this discussion later and so that we do > not waste our time re-answering the same questions that have already > been answered in private side-conversations. You'll likely get an > answer faster that way too. > > Michele Rosso writes: > >> If you are referring to -pc_type gamg, I tried it, but I got the same >> error message >> (For coloring efficiency ensure number of grid points in X is divisible >> by 2*stencil_width + 1) > The option could not have been used. Always send the ENTIRE error > message. Is the code calling KSPSetFromOptions()? Run with > -options_left to see if any options did not get used. > >> On 05/17/2013 04:49 PM, Jed Brown wrote: >>> Read my first message >>> >>> On May 17, 2013 6:47 PM, "Michele Rosso" >> > wrote: >>> >>> Ok, I will give a try to AMG then. What is it exactly? >>> Thank you! >>> >>> On 05/17/2013 04:25 PM, Jed Brown wrote: >>>> Michele Rosso writes: >>>> >>>>> So should I always use an odd number of grid points? >>>>> There is no way around this? >>>> If you want to use regular geometric coarsening, then yes. That *is* >>>> regular node-centered coarsening. Just consider the base case of one >>>> element: >>>> >>>> >>>> o ------- o >>>> >>>> Split that in two: >>>> >>>> o -- o -- o >>>> >>>> Look, an odd number of vertices, and as we keep refining, it will stay >>>> odd. >>>> >>>> You can use AMG or write your own interpolation if you want irregular coarsening. >>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 17 19:33:31 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 19:33:31 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <5196CA3D.3070001@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> Message-ID: <87k3mxm8is.fsf@mcs.anl.gov> Michele Rosso writes: > I run with > > -pc_type gamg -options_left > > and I get the error: > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: No support for this operation for this object type! > [0]PETSC ERROR: For coloring efficiency ensure number of grid points in > X is divisible > by 2*stencil_width + 1 > ! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 > 11:26:24 CDT 2012 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./hit on a named nid22318 by Unknown Fri May 17 > 19:21:25 2013 > [0]PETSC ERROR: Libraries linked from > [0]PETSC ERROR: Configure run at > [0]PETSC ERROR: Configure options > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: DMCreateColoring_DA_3d_MPIAIJ() line 288 in > src/dm/impls/da/fdda.c You didn't provide a Jacobian. 
A finite difference Jacobian is mostly used for initial development. For more general problems, either assemble the Jacobian or choose a compatible grid. Note: this message is a consequence of using periodic boundary conditions. Someone cut a corner a long time ago when implementing coloring for periodic and nobody has needed this enough to remove the assumption. A dimension of 255 would also be divisible by 3, if it helps you. if (bx == DMDA_BOUNDARY_PERIODIC && (m % col)) SETERRQ(PetscObjectComm((PetscObject)da),PETSC_ERR_SUP,"For coloring efficiency ensure number of grid points in X is divisible\n\ by 2*stencil_width + 1\n"); > [0]PETSC ERROR: DMCreateColoring_DA() line 172 in src/dm/impls/da/fdda.c > [0]PETSC ERROR: DMCreateColoring() line 709 in src/dm/interface/dm.c > [0]PETSC ERROR: DMComputeJacobian() line 2206 in src/dm/interface/dm.c > [0]PETSC ERROR: KSPSetUp() line 228 in src/ksp/ksp/interface/itfunc.c Looks like you cut the error message off short. > The code is calling > > KSPSetFromOptions() > > > On 05/17/2013 05:10 PM, Jed Brown wrote: >> Please always use "reply-all" so that your messages go to the list. >> This is standard mailing list etiquette. It is important to preserve >> threading for people who find this discussion later and so that we do >> not waste our time re-answering the same questions that have already >> been answered in private side-conversations. You'll likely get an >> answer faster that way too. >> >> Michele Rosso writes: >> >>> If you are referring to -pc_type gamg, I tried it, but I got the same >>> error message >>> (For coloring efficiency ensure number of grid points in X is divisible >>> by 2*stencil_width + 1) >> The option could not have been used. Always send the ENTIRE error >> message. Is the code calling KSPSetFromOptions()? Run with >> -options_left to see if any options did not get used. >> >>> On 05/17/2013 04:49 PM, Jed Brown wrote: >>>> Read my first message >>>> >>>> On May 17, 2013 6:47 PM, "Michele Rosso" >>> > wrote: >>>> >>>> Ok, I will give a try to AMG then. What is it exactly? >>>> Thank you! >>>> >>>> On 05/17/2013 04:25 PM, Jed Brown wrote: >>>>> Michele Rosso writes: >>>>> >>>>>> So should I always use an odd number of grid points? >>>>>> There is no way around this? >>>>> If you want to use regular geometric coarsening, then yes. That *is* >>>>> regular node-centered coarsening. Just consider the base case of one >>>>> element: >>>>> >>>>> >>>>> o ------- o >>>>> >>>>> Split that in two: >>>>> >>>>> o -- o -- o >>>>> >>>>> Look, an odd number of vertices, and as we keep refining, it will stay >>>>> odd. >>>>> >>>>> You can use AMG or write your own interpolation if you want irregular coarsening. >>>>> From mrosso at uci.edu Fri May 17 19:45:46 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 17 May 2013 17:45:46 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <87k3mxm8is.fsf@mcs.anl.gov> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> Message-ID: <5196CF3A.3030000@uci.edu> I noticed that the problem appears even if I use CG with the default preconditioner: commenting KSPSetDM() solves the problem. 
So basically without a proper grid (it seems no grid with an even numbers of nodes qualifies) and with my own system matrix, I cannot use any type of multigrid pre-conditioner? On 05/17/2013 05:33 PM, Jed Brown wrote: > Michele Rosso writes: > >> I run with >> >> -pc_type gamg -options_left >> >> and I get the error: >> >> [0]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: No support for this operation for this object type! >> [0]PETSC ERROR: For coloring efficiency ensure number of grid points in >> X is divisible >> by 2*stencil_width + 1 >> ! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 >> 11:26:24 CDT 2012 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./hit on a named nid22318 by Unknown Fri May 17 >> 19:21:25 2013 >> [0]PETSC ERROR: Libraries linked from >> [0]PETSC ERROR: Configure run at >> [0]PETSC ERROR: Configure options >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: DMCreateColoring_DA_3d_MPIAIJ() line 288 in >> src/dm/impls/da/fdda.c > You didn't provide a Jacobian. A finite difference Jacobian is mostly > used for initial development. For more general problems, either > assemble the Jacobian or choose a compatible grid. > > Note: this message is a consequence of using periodic boundary > conditions. Someone cut a corner a long time ago when implementing > coloring for periodic and nobody has needed this enough to remove the > assumption. A dimension of 255 would also be divisible by 3, if it > helps you. > > if (bx == DMDA_BOUNDARY_PERIODIC && (m % col)) SETERRQ(PetscObjectComm((PetscObject)da),PETSC_ERR_SUP,"For coloring efficiency ensure number of grid points in X is divisible\n\ > by 2*stencil_width + 1\n"); > >> [0]PETSC ERROR: DMCreateColoring_DA() line 172 in src/dm/impls/da/fdda.c >> [0]PETSC ERROR: DMCreateColoring() line 709 in src/dm/interface/dm.c >> [0]PETSC ERROR: DMComputeJacobian() line 2206 in src/dm/interface/dm.c >> [0]PETSC ERROR: KSPSetUp() line 228 in src/ksp/ksp/interface/itfunc.c > Looks like you cut the error message off short. > >> The code is calling >> >> KSPSetFromOptions() >> >> >> On 05/17/2013 05:10 PM, Jed Brown wrote: >>> Please always use "reply-all" so that your messages go to the list. >>> This is standard mailing list etiquette. It is important to preserve >>> threading for people who find this discussion later and so that we do >>> not waste our time re-answering the same questions that have already >>> been answered in private side-conversations. You'll likely get an >>> answer faster that way too. >>> >>> Michele Rosso writes: >>> >>>> If you are referring to -pc_type gamg, I tried it, but I got the same >>>> error message >>>> (For coloring efficiency ensure number of grid points in X is divisible >>>> by 2*stencil_width + 1) >>> The option could not have been used. Always send the ENTIRE error >>> message. Is the code calling KSPSetFromOptions()? Run with >>> -options_left to see if any options did not get used. 
>>> >>>> On 05/17/2013 04:49 PM, Jed Brown wrote: >>>>> Read my first message >>>>> >>>>> On May 17, 2013 6:47 PM, "Michele Rosso" >>>> > wrote: >>>>> >>>>> Ok, I will give a try to AMG then. What is it exactly? >>>>> Thank you! >>>>> >>>>> On 05/17/2013 04:25 PM, Jed Brown wrote: >>>>>> Michele Rosso writes: >>>>>> >>>>>>> So should I always use an odd number of grid points? >>>>>>> There is no way around this? >>>>>> If you want to use regular geometric coarsening, then yes. That *is* >>>>>> regular node-centered coarsening. Just consider the base case of one >>>>>> element: >>>>>> >>>>>> >>>>>> o ------- o >>>>>> >>>>>> Split that in two: >>>>>> >>>>>> o -- o -- o >>>>>> >>>>>> Look, an odd number of vertices, and as we keep refining, it will stay >>>>>> odd. >>>>>> >>>>>> You can use AMG or write your own interpolation if you want irregular coarsening. >>>>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 17 21:01:12 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 17 May 2013 21:01:12 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <5196CF3A.3030000@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> Message-ID: <87ehd5m4gn.fsf@mcs.anl.gov> Michele Rosso writes: > I noticed that the problem appears even if I use CG with the default > preconditioner: commenting KSPSetDM() solves the problem. Okay, this issue can't show up if you use SNES, but it's a consequence of making geometric multigrid work with a pure KSP interface. You can either use KSPSetComputeOperators() to put your assembly in a function (which will also be called on coarse levels if you use -pc_type mg without Galerkin coarse operators) or you can can provide the Jacobian using KSPSetOperators() as usual, but also call KSPSetDMActive() so that the DM is not used for computing/updating the Jacobian. The logic is cleaner in petsc-3.4 and I think it just does the right thing in your case. > So basically without a proper grid (it seems no grid with an even > numbers of nodes qualifies) and with my own system matrix, I cannot use > any type of multigrid > pre-conditioner? You can use all the AMG methods without setting a DM. From choi240 at purdue.edu Fri May 17 21:16:01 2013 From: choi240 at purdue.edu (Joon hee Choi) Date: Fri, 17 May 2013 22:16:01 -0400 (EDT) Subject: [petsc-users] 3-dimension to matrix In-Reply-To: <87d2sppc2r.fsf@mcs.anl.gov> Message-ID: <1680837486.133411.1368843361060.JavaMail.root@mailhub028.itcs.purdue.edu> Thank you for your reply. I will calculate the multiplication of block matrices. I made the vector with tuples from dataset and then set up the matrix by reading each tuple. And the matrix has 144*10^6 non-zero entries. Sincerely, Joon ----- Original Message ----- From: "Jed Brown" To: "Joon hee Choi" Cc: petsc-users at mcs.anl.gov Sent: Friday, May 17, 2013 4:48:28 PM Subject: Re: [petsc-users] 3-dimension to matrix Joon hee Choi writes: > My petsc version is 3.3.5. > And I made nnz using the number of each row. 
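Tying Jed's advice to Michele a few messages above back to code, a minimal sketch of the KSPSetDMActive() route: the user keeps assembling the operator, and the DMDA is handed to the KSP only so PCMG can build the grid hierarchy. The names da, A, b, and x are assumptions and error checking is elided; the KSPSetOperators() signature with the MatStructure flag is the petsc-3.3/3.4 one.

KSP ksp;

KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetDM(ksp, da);                                 /* gives PCMG the grid hierarchy...        */
KSPSetDMActive(ksp, PETSC_FALSE);                  /* ...but keeps the DM out of the Jacobian */
KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);  /* user-assembled matrix                   */
KSPSetFromOptions(ksp);                            /* e.g. -ksp_type cg -pc_type mg           */
KSPSolve(ksp, b, x);

The other route Jed mentions, KSPSetComputeOperators(), instead registers a callback that assembles the operator, so it can also be called on coarse levels.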
> When I wrote the matrix
> using MatSetValues, I wrote from low row and column to high row and
> column. However, when I changed the order of writing the matrix, it took
> much more time (up to 3 hrs). So I am concerned that the slowness is
> due to the sheer size of the matrix (2.6*10^7 by 1.248*10^15) and the
> cost of reading from or writing to cache, RAM, or hard disk.

What are you going to do with a matrix of that dimension? You can't apply it to a vector because you can't store the vector. How many entries does it have?

> Anyway, the following code is the part where I read the data from a
> file and set up tuples and nnz:
>
> FILE *fp = fopen("data.txt", "r");
> while (fscanf(fp, "%d %d %d %d", &x, &y, &z, &v) == 4)
> {
>     tups.push_back(std::tr1::make_tuple (x-1, z-1, y-1, v));
>     nzrow[i-1] += 1;
>     if (x > X) X = x;
>     if (y > Y) Y = y;
>     if (z > Z) Z = z;
> }
> fclose(fp);
> PetscMalloc(X*sizeof(PetscInt), &nnz);
> memset(nnz, 0, X);
> for (itnz=nzrow.begin(); itnz!=nzrow.end(); ++itnz) {
>     nnz[itnz->first] = itnz->second;

What is this trying to do? nnz will just be the column of the last point to be processed...

> }
> sort(tups.begin(), tups.end());
>
> If my code is wrong, then please let me know.
>
> Thank you,
> Joon
>
> ----- Original Message -----
> From: "Jed Brown"
> To: "Joon hee Choi"
> Cc: petsc-users at mcs.anl.gov
> Sent: Friday, May 17, 2013 3:24:09 PM
> Subject: Re: [petsc-users] 3-dimension to matrix
>
> Joon hee Choi writes:
>
>> Thank you for your fast reply. Your last comment looks like my first
>> code. The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as
>> matrix type and (X, Y) as block size. Also, I implemented
>> SeqAIJ preallocation with nnz. Nevertheless, it was very slow. The
>> Matrix-Set-Up part of the first code is as follows:
>>
>> sort(tups.begin(), tups.end());
>> MatCreate(PETSC_COMM_SELF, &A);
>> MatSetType(A, MATSEQAIJ);
>> MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z);
>> MatSetBlockSizes(A, X, Y);
>> MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz);
>>
>> sz = tups.size();
>> for (i=0; i<sz; i++) {
>>     x = std::tr1::get<0>(tups[i]);
>>     y = std::tr1::get<2>(tups[i]) + std::tr1::get<1>(tups[i])*Y;
>>     val = std::tr1::get<3>(tups[i]);
>>     MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES);
>> }
>> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>>
>> I used tuple (x, z, y, value) and vector of C++. I didn't get any
>> errors from this code. However, it took about 9 minutes in this
>> part.

> What version of PETSc? Your preallocation was almost certainly not
> sufficient.

From choi240 at purdue.edu Fri May 17 21:18:07 2013
From: choi240 at purdue.edu (Joon hee Choi)
Date: Fri, 17 May 2013 22:18:07 -0400 (EDT)
Subject: [petsc-users] 3-dimension to matrix
In-Reply-To: 
Message-ID: <2019368695.133413.1368843487672.JavaMail.root@mailhub028.itcs.purdue.edu>

Thank you for your reply. Okay. I will try to check the preallocation errors of my matrix.

Thanks,
Joon

----- Original Message -----
From: "Matthew Knepley"
To: "Joon hee Choi"
Cc: "Jed Brown" , petsc-users at mcs.anl.gov
Sent: Friday, May 17, 2013 4:50:41 PM
Subject: Re: [petsc-users] 3-dimension to matrix

On Fri, May 17, 2013 at 3:40 PM, Joon hee Choi < choi240 at purdue.edu > wrote:

My petsc version is 3.3.5.
And I made nnz using the number of each row. When I wrote the matrix using MatSetValues, I wrote from low row and column to high row and column. However, when I changed the order of writing the matrix, it took much more time (up to 3 hrs).
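On the two lines Jed is questioning above: in the read loop, the counter indexes nzrow[i-1], but i is never assigned inside that loop, so presumably the row index x-1 was intended; and memset(nnz, 0, X) clears X bytes rather than X PetscInts. A sketch of just those two corrected lines:

nzrow[x-1] += 1;                     /* count against the row that was just read */
memset(nnz, 0, X*sizeof(PetscInt));  /* zero all X counters, not just X bytes    */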
So I am concerned that the slowness is due to the sheer size of the matrix (2.6*10^7 by 1.248*10^15) and the cost of reading from or writing to cache, RAM, or hard disk.

No, it is bad preallocation. This should be simple to fix. First, turn on errors with

MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_TRUE)

and second run with -info.

Thanks,

Matt

Anyway, the following code is the part where I read the data from a file and set up tuples and nnz:

FILE *fp = fopen("data.txt", "r");
while (fscanf(fp, "%d %d %d %d", &x, &y, &z, &v) == 4)
{
    tups.push_back(std::tr1::make_tuple (x-1, z-1, y-1, v));
    nzrow[i-1] += 1;
    if (x > X) X = x;
    if (y > Y) Y = y;
    if (z > Z) Z = z;
}
fclose(fp);
PetscMalloc(X*sizeof(PetscInt), &nnz);
memset(nnz, 0, X);
for (itnz=nzrow.begin(); itnz!=nzrow.end(); ++itnz) {
    nnz[itnz->first] = itnz->second;
}
sort(tups.begin(), tups.end());

If my code is wrong, then please let me know.

Thank you,
Joon

----- Original Message -----
From: "Jed Brown" < jedbrown at mcs.anl.gov >
To: "Joon hee Choi" < choi240 at purdue.edu >
Cc: petsc-users at mcs.anl.gov
Sent: Friday, May 17, 2013 3:24:09 PM
Subject: Re: [petsc-users] 3-dimension to matrix

Joon hee Choi < choi240 at purdue.edu > writes:

> Thank you for your fast reply. Your last comment looks like my first
> code. The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as
> matrix type and (X, Y) as block size. Also, I implemented
> SeqAIJ preallocation with nnz. Nevertheless, it was very slow. The
> Matrix-Set-Up part of the first code is as follows:
>
> sort(tups.begin(), tups.end());
> MatCreate(PETSC_COMM_SELF, &A);
> MatSetType(A, MATSEQAIJ);
> MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z);
> MatSetBlockSizes(A, X, Y);
> MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz);
>
> sz = tups.size();
> for (i=0; i<sz; i++) {
>     x = std::tr1::get<0>(tups[i]);
>     y = std::tr1::get<2>(tups[i]) + std::tr1::get<1>(tups[i])*Y;
>     val = std::tr1::get<3>(tups[i]);
>     MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES);
> }
> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>
> I used tuple (x, z, y, value) and vector of C++. I didn't get any
> errors from this code. However, it took about 9 minutes in this
> part.

What version of PETSc? Your preallocation was almost certainly not sufficient.

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

From choi240 at purdue.edu Fri May 17 21:25:22 2013
From: choi240 at purdue.edu (Joon hee Choi)
Date: Fri, 17 May 2013 22:25:22 -0400 (EDT)
Subject: [petsc-users] 3-dimension to matrix
In-Reply-To: <28B72B78-A3A5-4634-9269-9C3099F55FE3@gmail.com>
Message-ID: <1838161655.133415.1368843922501.JavaMail.root@mailhub028.itcs.purdue.edu>

Thank you so much for your reply. I think I have to use MatCreateAIJ like you did, because you experienced the same problem. I'll try it now.

Sincerely,
Joon

----- Original Message -----
From: "Peter Lichtner"
To: petsc-users at mcs.anl.gov
Sent: Friday, May 17, 2013 4:59:13 PM
Subject: Re: [petsc-users] 3-dimension to matrix

I found a similar problem solving Laplace's equation: using MatCreate took forever, whereas using MatCreateAIJ instead the time for MatSetValues was essentially negligible. I set up MatCreate with a call to MatSetSizes. ...Peter

On May 17, 2013, at 2:50 PM, Matthew Knepley < knepley at gmail.com > wrote:

On Fri, May 17, 2013 at 3:40 PM, Joon hee Choi < choi240 at purdue.edu > wrote:

My petsc version is 3.3.5.
And I made nnz using the number of each row. When I wrote the matrix using MatSetValues, I wrote from low row and column to high row and column. However, when I changed the order of writing the matrix, it took much more time (up to 3 hrs). So I am concerned that the slowness is due to the sheer size of the matrix (2.6*10^7 by 1.248*10^15) and the cost of reading from or writing to cache, RAM, or hard disk.

No, it is bad preallocation. This should be simple to fix. First, turn on errors with

MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_TRUE)

and second run with -info.

Thanks,

Matt

Anyway, the following code is the part where I read the data from a file and set up tuples and nnz:

FILE *fp = fopen("data.txt", "r");
while (fscanf(fp, "%d %d %d %d", &x, &y, &z, &v) == 4)
{
    tups.push_back(std::tr1::make_tuple (x-1, z-1, y-1, v));
    nzrow[i-1] += 1;
    if (x > X) X = x;
    if (y > Y) Y = y;
    if (z > Z) Z = z;
}
fclose(fp);
PetscMalloc(X*sizeof(PetscInt), &nnz);
memset(nnz, 0, X);
for (itnz=nzrow.begin(); itnz!=nzrow.end(); ++itnz) {
    nnz[itnz->first] = itnz->second;
}
sort(tups.begin(), tups.end());

If my code is wrong, then please let me know.

Thank you,
Joon

----- Original Message -----
From: "Jed Brown" < jedbrown at mcs.anl.gov >
To: "Joon hee Choi" < choi240 at purdue.edu >
Cc: petsc-users at mcs.anl.gov
Sent: Friday, May 17, 2013 3:24:09 PM
Subject: Re: [petsc-users] 3-dimension to matrix

Joon hee Choi < choi240 at purdue.edu > writes:

> Thank you for your fast reply. Your last comment looks like my first
> code. The full size of the matrix is (X, Y*Z). Also, I used SEQAIJ as
> matrix type and (X, Y) as block size. Also, I implemented
> SeqAIJ preallocation with nnz. Nevertheless, it was very slow. The
> Matrix-Set-Up part of the first code is as follows:
>
> sort(tups.begin(), tups.end());
> MatCreate(PETSC_COMM_SELF, &A);
> MatSetType(A, MATSEQAIJ);
> MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, X, Y*Z);
> MatSetBlockSizes(A, X, Y);
> MatSeqAIJSetPreallocation(A, PETSC_DEFAULT, nnz);
>
> sz = tups.size();
> for (i=0; i<sz; i++) {
>     x = std::tr1::get<0>(tups[i]);
>     y = std::tr1::get<2>(tups[i]) + std::tr1::get<1>(tups[i])*Y;
>     val = std::tr1::get<3>(tups[i]);
>     MatSetValues(A, 1, &x, 1, &y, &val, INSERT_VALUES);
> }
> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
>
> I used tuple (x, z, y, value) and vector of C++. I didn't get any
> errors from this code. However, it took about 9 minutes in this
> part.

What version of PETSc? Your preallocation was almost certainly not sufficient.

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

________________
Peter Lichtner
Santa Fe, NM 87507
(505) 692-4029 (c)
OFM Research/LANL Guest Scientist

From choi240 at purdue.edu Fri May 17 21:28:59 2013
From: choi240 at purdue.edu (Joon hee Choi)
Date: Fri, 17 May 2013 22:28:59 -0400 (EDT)
Subject: [petsc-users] 3-dimension to matrix
In-Reply-To: <871u95pbav.fsf@mcs.anl.gov>
Message-ID: <486406331.133423.1368844139657.JavaMail.root@mailhub028.itcs.purdue.edu>

Thank you for your reply. Yes, I think so now. If I still get slow results in spite of applying MatCreateAIJ, I will reply again. Thank you very much again.
Sincerely, Joon ----- Original Message ----- From: "Jed Brown" To: "Peter Lichtner" , petsc-users at mcs.anl.gov Sent: Friday, May 17, 2013 5:05:12 PM Subject: Re: [petsc-users] 3-dimension to matrix Peter Lichtner writes: > I found a similar problem solving Laplace's equation: using MatCreate > took forever, whereas using MatCreateAIJ instead the time for > MatSetValues was essentially negligible. I set up MatCreate with a > call to MatSetSizes. This is almost certainly caused by bad preallocation information, or from that information not being used. The complete implementation of MatCreateAIJ is: PetscErrorCode MatCreateAIJ(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt N,PetscInt d_nz,const PetscInt d_nnz[],PetscInt o_nz,const PetscInt o_nnz[],Mat *A) { PetscErrorCode ierr; PetscMPIInt size; PetscFunctionBegin; ierr = MatCreate(comm,A);CHKERRQ(ierr); ierr = MatSetSizes(*A,m,n,M,N);CHKERRQ(ierr); ierr = MPI_Comm_size(comm,&size);CHKERRQ(ierr); if (size > 1) { ierr = MatSetType(*A,MATMPIAIJ);CHKERRQ(ierr); ierr = MatMPIAIJSetPreallocation(*A,d_nz,d_nnz,o_nz,o_nnz);CHKERRQ(ierr); } else { ierr = MatSetType(*A,MATSEQAIJ);CHKERRQ(ierr); ierr = MatSeqAIJSetPreallocation(*A,d_nz,d_nnz);CHKERRQ(ierr); } PetscFunctionReturn(0); } From mrosso at uci.edu Fri May 17 21:35:08 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 17 May 2013 19:35:08 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <87ehd5m4gn.fsf@mcs.anl.gov> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> Message-ID: <5196E8DC.1010602@uci.edu> Thank you very much. I will try and let you know. Michele On 05/17/2013 07:01 PM, Jed Brown wrote: > Michele Rosso writes: > >> I noticed that the problem appears even if I use CG with the default >> preconditioner: commenting KSPSetDM() solves the problem. > Okay, this issue can't show up if you use SNES, but it's a consequence > of making geometric multigrid work with a pure KSP interface. You can > either use KSPSetComputeOperators() to put your assembly in a function > (which will also be called on coarse levels if you use -pc_type mg > without Galerkin coarse operators) or you can can provide the Jacobian > using KSPSetOperators() as usual, but also call KSPSetDMActive() so that > the DM is not used for computing/updating the Jacobian. > > The logic is cleaner in petsc-3.4 and I think it just does the right > thing in your case. > >> So basically without a proper grid (it seems no grid with an even >> numbers of nodes qualifies) and with my own system matrix, I cannot use >> any type of multigrid >> pre-conditioner? > You can use all the AMG methods without setting a DM. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaetank at gmail.com Sun May 19 22:11:26 2013 From: gaetank at gmail.com (Gaetan Kenway) Date: Sun, 19 May 2013 23:11:26 -0400 Subject: [petsc-users] h-FGMRES Message-ID: Hi Everyone I am trying to replicate the type of preconditioner described in "Hierarchical and Nested Krylov Methods for Extreme-Scale Computing". 
I have used the following options: (I'm using fortran so the following is my petsc_options file) # Matrix Options -matload_block_size 5 -mat_type mpibaij # KSP solver options -ksp_type gmres -ksp_max_it 1000 -ksp_gmres_restart 200 -ksp_monitor -ksp_view -ksp_pc_side right -ksp_rtol 1e-6 # Nested GMRES Options -pc_type bjacobi -pc_bjacobi_blocks 4 -sub_ksp_type gmres -sub_ksp_max_it 5 -sub_pc_type bjacobi -sub_sub_pc_type ilu -sub_sub_pc_factor_mat_ordering_type rcm -sub_sub_pc_factor_levels 1 The test is run on 64 processors and the total number of block jacobi blocks is 4 (less than nproc). The error I get is: [6]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [6]PETSC ERROR: INSTEAD the line number of the start of the function [6]PETSC ERROR: is given. [6]PETSC ERROR: [6] PCSetUp_BJacobi_Multiproc line 1269 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c [6]PETSC ERROR: [6] PCSetUp_BJacobi line 24 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c [6]PETSC ERROR: [6] PCSetUp line 810 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/interface/precon.c [6]PETSC ERROR: [6] KSPSetUp line 182 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c [6]PETSC ERROR: [6] KSPSolve line 351 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c [6]PETSC ERROR: --------------------- Error Message ------------------------------------ [6]PETSC ERROR: Signal received! [6]PETSC ERROR: ------------------------------------------------------------------------ [6]PETSC ERROR: Petsc Release Version 3.3.0, Patch 5, Sat Dec 1 15:10:41 CST 2012 [6]PETSC ERROR: See docs/changes/index.html for recent updates. [6]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [6]PETSC ERROR: ------------------------------------------------------------------------ [6]PETSC ERROR: ------------------------------------------------------------------------ [6]PETSC ERROR: ./main on a intel-rea named gpc-f109n001 by kenway Sun May 19 23:01:52 2013 [6]PETSC ERROR: Libraries linked from /home/j/jmartins/kenway/packages/petsc-3.3-p5/intel-real-debug/lib [6]PETSC ERROR: Configure run at Sun Jan 20 15:52:20 2013 [6]PETSC ERROR: Configure options --with-shared-libraries --download-superlu_dist=yes --download-parmetis=yes --download-metis=yes --with-fortran-interfaces=1 --with-debugging=yes --with-scalar-type=real -with-petsc-arch=intel-real-debug --with-blas-lapack-dir= --with-pic [6]PETSC ERROR: ------------------------------------------------------------------------ If the number of blocks is greater than or equal to the number of processors it runs fine. I'm using version 3.3-p5. The options as listed in the paper are: -flow_ksp_type fgmres -flow_ksp_pc_side right -flow_pc_type bjacobi -flow_pc_bjacobi_blocks ngp -flow_sub_ksp_type gmres -flow_sub_ksp_max_it 6 -flow_sub_pc_type bjacobi -flow_sub_sub_pc_type ilu Any suggestions would be greatly appreciated. Thank you, Gaetan Kenway -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun May 19 22:15:00 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 19 May 2013 22:15:00 -0500 Subject: [petsc-users] h-FGMRES In-Reply-To: References: Message-ID: You should be using PETSc version 3.4 which was recently released and is what the paper is based on. 
Barry On May 19, 2013, at 10:11 PM, Gaetan Kenway wrote: > Hi Everyone > > I am trying to replicate the type of preconditioner described in "Hierarchical and Nested Krylov Methods for Extreme-Scale Computing". > > I have used the following options: (I'm using fortran so the following is my petsc_options file) > > # Matrix Options > -matload_block_size 5 > -mat_type mpibaij > > # KSP solver options > -ksp_type gmres > -ksp_max_it 1000 > -ksp_gmres_restart 200 > -ksp_monitor > -ksp_view > -ksp_pc_side right > -ksp_rtol 1e-6 > > # Nested GMRES Options > -pc_type bjacobi > -pc_bjacobi_blocks 4 > -sub_ksp_type gmres > -sub_ksp_max_it 5 > -sub_pc_type bjacobi > -sub_sub_pc_type ilu > -sub_sub_pc_factor_mat_ordering_type rcm > -sub_sub_pc_factor_levels 1 > > The test is run on 64 processors and the total number of block jacobi blocks is 4 (less than nproc). The error I get is: > > [6]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > [6]PETSC ERROR: INSTEAD the line number of the start of the function > [6]PETSC ERROR: is given. > [6]PETSC ERROR: [6] PCSetUp_BJacobi_Multiproc line 1269 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c > [6]PETSC ERROR: [6] PCSetUp_BJacobi line 24 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/impls/bjacobi/bjacobi.c > [6]PETSC ERROR: [6] PCSetUp line 810 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/pc/interface/precon.c > [6]PETSC ERROR: [6] KSPSetUp line 182 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c > [6]PETSC ERROR: [6] KSPSolve line 351 /home/j/jmartins/kenway/packages/petsc-3.3-p5/src/ksp/ksp/interface/itfunc.c > [6]PETSC ERROR: --------------------- Error Message ------------------------------------ > [6]PETSC ERROR: Signal received! > [6]PETSC ERROR: ------------------------------------------------------------------------ > [6]PETSC ERROR: Petsc Release Version 3.3.0, Patch 5, Sat Dec 1 15:10:41 CST 2012 > [6]PETSC ERROR: See docs/changes/index.html for recent updates. > [6]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [6]PETSC ERROR: ------------------------------------------------------------------------ > [6]PETSC ERROR: ------------------------------------------------------------------------ > [6]PETSC ERROR: ./main on a intel-rea named gpc-f109n001 by kenway Sun May 19 23:01:52 2013 > [6]PETSC ERROR: Libraries linked from /home/j/jmartins/kenway/packages/petsc-3.3-p5/intel-real-debug/lib > [6]PETSC ERROR: Configure run at Sun Jan 20 15:52:20 2013 > [6]PETSC ERROR: Configure options --with-shared-libraries --download-superlu_dist=yes --download-parmetis=yes --download-metis=yes --with-fortran-interfaces=1 --with-debugging=yes --with-scalar-type=real -with-petsc-arch=intel-real-debug --with-blas-lapack-dir= --with-pic > [6]PETSC ERROR: ------------------------------------------------------------------------ > > If the number of blocks is greater than or equal to the number of processors it runs fine. I'm using version 3.3-p5. > > The options as listed in the paper are: > -flow_ksp_type fgmres -flow_ksp_pc_side right -flow_pc_type bjacobi -flow_pc_bjacobi_blocks ngp > -flow_sub_ksp_type gmres -flow_sub_ksp_max_it 6 -flow_sub_pc_type bjacobi > -flow_sub_sub_pc_type ilu > > Any suggestions would be greatly appreciated. 
> > Thank you,
> >
> > Gaetan Kenway

From hao.yu at peraglobal.com Mon May 20 04:08:05 2013
From: hao.yu at peraglobal.com (Yu Hao)
Date: Mon, 20 May 2013 17:08:05 +0800
Subject: [petsc-users] Re: Re: Re: Re: Fwd: PETsc problem
In-Reply-To: References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn>,
Message-ID: <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn>

I can use cl to compile files under Windows. But even if I include the path that cl is in in Cygwin's environment variables, it still shows that "win32fe cl" does not work. Attached are the configure.log and make.log.

Thanks!
Hao
________________________________
From: Matthew Knepley [knepley at gmail.com]
Sent: May 11, 2013 4:32
To: Yu Hao; petsc-users at mcs.anl.gov
Subject: Re: Re: [petsc-users] Re: Re: Fwd: PETsc problem

On Fri, May 10, 2013 at 12:11 PM, Yu Hao wrote:

What do you mean by compiling something with it? I don't know. Do you mean 'win32fe cl' is not installed correctly?

I mean, can you compile a file using cl from the command line?

   Matt

Hao
________________________________
From: Matthew Knepley [knepley at gmail.com]
Sent: May 10, 2013 19:55
To: Yu Hao
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Re: Re: Fwd: PETsc problem

On Fri, May 10, 2013 at 6:20 AM, Yu Hao wrote:

The configure.log and make.log are attached. It shows that 'win32fe cl' does not work. I don't know what the problem is.

Can you compile something with it? It looks like it is not installed correctly.

   Matt

Thanks!
Hao
________________________________________
From: Jed Brown [five9a2 at gmail.com] on behalf of Jed Brown [jedbrown at mcs.anl.gov]
Sent: May 10, 2013 15:16
To: Yu Hao; petsc-users
Subject: Re: [petsc-users] Re: Fwd: PETsc problem

Yu Hao writes:

> it shows that 'win32fe cl' does not work. I don't know what the problem is.

You couldn't have found a less helpful way to report this. Note the bold part:

http://www.mcs.anl.gov/petsc/documentation/bugreporting.html

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure.log
Type: application/octet-stream
Size: 49318 bytes
Desc: configure.log
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make.log Type: application/octet-stream Size: 82 bytes Desc: make.log URL: From gaetank at gmail.com Mon May 20 08:26:48 2013 From: gaetank at gmail.com (Gaetan Kenway) Date: Mon, 20 May 2013 09:26:48 -0400 Subject: [petsc-users] h-FGMRES In-Reply-To: References: Message-ID: Hi again I installed petsc 3.4.0 and I am still getting the following error when running with the following options (on 64 procs) # Matrix Options -matload_block_size 5 -mat_type mpibaij # KSP solver options -ksp_type fgmres -ksp_max_it 1000 -ksp_gmres_restart 200 -ksp_monitor -ksp_view -ksp_pc_side right -ksp_rtol 1e-6 # Nested GMRES Options -pc_type bjacobi -pc_bjacobi_blocks 4 -sub_ksp_type gmres -sub_ksp_max_it 5 -sub_pc_type bjacobi -sub_sub_pc_type ilu -sub_sub_pc_factor_mat_ordering_type rcm -sub_sub_pc_factor_levels 1 Any thoughts? Thank you, Gaetan [44]PETSC ERROR: ------------------------------------------------------------------------ [44]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [44]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [44]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [44]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [44]PETSC ERROR: likely location of problem given in stack below [44]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [44]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [44]PETSC ERROR: INSTEAD the line number of the start of the function [44]PETSC ERROR: is given. [44]PETSC ERROR: [44] PCSetUp_BJacobi_Multiproc line 1197 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/impls/bjacobi/bjacobi.c [44]PETSC ERROR: [44] PCSetUp_BJacobi line 24 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/impls/bjacobi/bjacobi.c [44]PETSC ERROR: [44] PCSetUp line 868 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/pc/interface/precon.c [44]PETSC ERROR: [44] KSPSetUp line 192 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/ksp/interface/itfunc.c [44]PETSC ERROR: [44] KSPSolve line 356 /home/j/jmartins/kenway/packages/petsc-3.4.0/src/ksp/ksp/interface/itfunc.c [43]PETSC ERROR: ------------------------------------------------------------------------ On Sun, May 19, 2013 at 11:15 PM, Barry Smith wrote: > > You should be using PETSc version 3.4 which was recently released and > is what the paper is based on. > > Barry > > On May 19, 2013, at 10:11 PM, Gaetan Kenway wrote: > > Hi Everyone > [...] -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaetank at gmail.com Mon May 20 08:51:24 2013 From: gaetank at gmail.com (Gaetan Kenway) Date: Mon, 20 May 2013 09:51:24 -0400 Subject: [petsc-users] h-FGMRES In-Reply-To: References: Message-ID: Hi again It runs if the mattype is mpiaij instead of mpibaij. I gather this is not implemented for the blocked matrix types?
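For anyone else hitting this, a minimal sketch of the conversion workaround (the variable names are illustrative, the ksp is assumed to be created elsewhere, and this assumes the petsc-3.4 four-argument KSPSetOperators):

Mat Abaij,Aaij;
KSP ksp;
PetscErrorCode ierr;
/* Abaij was loaded/assembled as mpibaij; convert a copy to mpiaij,
   which the multiproc bjacobi setup does handle, and solve with that */
ierr = MatConvert(Abaij,MATMPIAIJ,MAT_INITIAL_MATRIX,&Aaij);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp,Aaij,Aaij,SAME_NONZERO_PATTERN);CHKERRQ(ierr);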
Gaetan On Mon, May 20, 2013 at 9:26 AM, Gaetan Kenway wrote: > Hi again > > I installed petsc 3.4.0 and I am still getting the following error when > running with the following options (on 64 procs) > [...] -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Mon May 20 09:34:29 2013 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Mon, 20 May 2013 09:34:29 -0500 Subject: [petsc-users] h-FGMRES In-Reply-To: References: Message-ID: Gaetan : > > It runs if the mattype is mpiaij instead of mpibaij. I gather this is not > implemented for the blocked matrix types? It is not tested for mpibaij format yet. I'll check it. The paper uses mpiaij format.
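For completeness, a sketch of a load sequence where those options take effect (a minimal sketch only; "matrix.dat" and the viewer are illustrative):

Mat A;
PetscViewer fd;
PetscErrorCode ierr;
ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"matrix.dat",FILE_MODE_READ,&fd);CHKERRQ(ierr);
ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
ierr = MatSetFromOptions(A);CHKERRQ(ierr); /* picks up -mat_type mpiaij */
ierr = MatLoad(A,fd);CHKERRQ(ierr);        /* picks up -matload_block_size 5 */
ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);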
Hong > [...] -------------- next part -------------- An HTML attachment was scrubbed... URL: From member at linkedin.com Mon May 20 09:57:00 2013 From: member at linkedin.com (Jesse Lu) Date: Mon, 20 May 2013 14:57:00 +0000 (UTC) Subject: [petsc-users] Matt, please join my network on LinkedIn Message-ID: <1691689921.23703491.1369061820038.JavaMail.app@ela4-app0129.prod> LinkedIn ------------ Jesse Lu requested to add you as a connection on LinkedIn: ------------------------------------------ Matt, I'd like to add you to my professional network on LinkedIn.
- Jesse -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon May 20 10:23:18 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 20 May 2013 10:23:18 -0500 (CDT) Subject: [petsc-users] Re: Re: Re: Re: Fwd: PETsc problem In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> Message-ID: On Mon, 20 May 2013, Hao Yu wrote: > I can use cl to compile files under Windows. But even if I include the path that cl is in in the environment variables of Cygwin, it still shows that "Win32fe cl" does not work. Have you checked the Windows installation instructions? https://www.mcs.anl.gov/petsc/documentation/installation.html#windows You should start the 'compiler cmd' - and then run 'bash --login' inside it [and not 'set cygwin path to include cl'] Satish > The attached is the configure.log and make.log > Thanks! > > Hao > [...]
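For reference, a typical sequence is something like the following (a sketch only; the petsc path and the fortran setting are illustrative and depend on your installation). First start the compiler command prompt from Visual Studio, then:

bash --login
cd /cygdrive/c/petsc-3.3-p5
./configure --with-cc='win32fe cl' --with-cxx='win32fe cl' --with-fc=0 --download-f2cblaslapack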
From gaetank at gmail.com Mon May 20 10:28:17 2013 From: gaetank at gmail.com (Gaetan Kenway) Date: Mon, 20 May 2013 11:28:17 -0400 Subject: [petsc-users] h-FGMRES In-Reply-To: References: Message-ID: Thanks. On a related note, I tried using the ASM version of the same approach; that is -pc_type asm -pc_asm_blocks 4 with the remainder of the options the same. This gives a message that the number of blocks is less than the number of processors (sorry I don't have the exact message anymore). I get this error with both mpiaij and mpibaij types. Has this approach been implemented/do you think there would be any benefit from the approach? Thank you, Gaetan On Mon, May 20, 2013 at 10:34 AM, Hong Zhang wrote: > Gaetan : >> >> It runs if the mattype is mpiaij instead of mpibaij. I gather this is >> not implemented for the blocked matrix types? > > It is not tested for mpibaij format yet. I'll check it. > The paper uses mpiaij format. > > Hong > [...]
-------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon May 20 10:38:53 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 20 May 2013 10:38:53 -0500 (CDT) Subject: [petsc-users] Matt, please join my network on LinkedIn In-Reply-To: <1691689921.23703491.1369061820038.JavaMail.app@ela4-app0129.prod> References: <1691689921.23703491.1369061820038.JavaMail.app@ela4-app0129.prod> Message-ID: I have 'member at linkedin.com' listed in mailman's "discard_these_nonmembers (privacy)" option - but still this e-mail came through? Should we be removing e-mail ids from the subscriber list [from where these requests keep coming]? It looks more like spam-bots doing this. Satish ---------- Date: Mon, 20 May 2013 14:57:00 +0000 (UTC) From: Jesse Lu Reply-To: Jesse Lu To: Matt F. Subject: [petsc-users] Matt, please join my network on LinkedIn On Mon, 20 May 2013, Jesse Lu wrote: > Jesse Lu requested to add you as a connection on LinkedIn: > > > ------------------------------------------ > > Matt, > > I'd like to add you to my professional network on LinkedIn. > > - Jesse > From kaus at uni-mainz.de Mon May 20 10:52:47 2013 From: kaus at uni-mainz.de (Kaus, Boris) Date: Mon, 20 May 2013 15:52:47 +0000 Subject: [petsc-users] Recursive fieldsplit In-Reply-To: References: <4FEC7BCD.7030803@uni-mainz.de> Message-ID: <5976EEB8-DFCC-4260-B124-1F46420D00CB@uni-mainz.de> Hi, Example 42 (src/ksp/ksp/examples/tutorials) with recursive fieldsplit used to work in petsc 3.2 and petsc 3.3-p2 with the patch suggested below by Barry.
mpiexec -np 4 ./ex42 \ -stokes_ksp_type gcr \ -stokes_ksp_rtol 1.0e-6 \ -stokes_pc_type fieldsplit \ -stokes_pc_fieldsplit_type SCHUR \ -stokes_pc_fieldsplit_schur_factorization_type UPPER \ -stokes_fieldsplit_u_ksp_type fgmres \ -stokes_fieldsplit_u_ksp_rtol 1e-3 \ -stokes_fieldsplit_u_pc_type fieldsplit \ -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \ -stokes_fieldsplit_u_fieldsplit_pc_type ml \ -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \ -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \ -stokes_fieldsplit_p_ksp_type preonly \ -stokes_fieldsplit_p_pc_type jacobi \ -stokes_ksp_monitor_blocks \ -mx 16 \ -model 3 It no longer works in petsc 3.4.0. Is this something that can be fixed and potentially added to 3.4.1? Our production code relies on similar functionality. thanks a lot! Boris On Jun 28, 2012, at 8:55 PM, Barry Smith wrote: > Anton, > > This came about because we are now being much more pedantic about the blocksizes of PETSc objects and not allowing them to be causally changed when they shouldn't be. > > You can resolve this problem by editing the file src/ksp/pc/impls/fieldsplit/fieldsplit.c locate the function > > #undef __FUNCT__ > #define __FUNCT__ "PCApply_FieldSplit" > static PetscErrorCode PCApply_FieldSplit(PC pc,Vec x,Vec y) > { > PC_FieldSplit *jac = (PC_FieldSplit*)pc->data; > PetscErrorCode ierr; > PC_FieldSplitLink ilink = jac->head; > PetscInt cnt,bs; > > PetscFunctionBegin; > > and add the two lines right here > > x->map->bs = jac->bs; > y->map->bs = jac->bs; > > > then run make cmake in that directory. > > To resolve this permanently we will need to figure out how to insure those inner vectors are created with the correct block size. Are you willing to share your code with petsc-maint at mcs.anl.gov so that we can reproduce the problem and fix it properly for the long run? (The problem is in PETSc not in your code). > > Barry > > > > On Jun 28, 2012, at 10:44 AM, Anton Popov wrote: > >> Dear petsc team, >> >> I'm trying to use fieldsplit preconditioner for the velocity block in the Stokes system which is also preconditioned by >> fieldsplit (kind of recursive). >> >> Running example 42 from src/ksp/ksp/examples/tutorials with petsc-3.2, as follows: >> >> mpiexec -np 4 ./ex42 \ >> -stokes_ksp_type gcr \ >> -stokes_ksp_rtol 1.0e-6 \ >> -stokes_pc_type fieldsplit \ >> -stokes_pc_fieldsplit_type SCHUR \ >> -stokes_pc_fieldsplit_schur_factorization_type UPPER \ >> -stokes_fieldsplit_u_ksp_type fgmres \ >> -stokes_fieldsplit_u_ksp_rtol 1e-3 \ >> -stokes_fieldsplit_u_pc_type fieldsplit \ >> -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \ >> -stokes_fieldsplit_u_fieldsplit_pc_type ml \ >> -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \ >> -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \ >> -stokes_fieldsplit_p_ksp_type preonly \ >> -stokes_fieldsplit_p_pc_type jacobi \ >> -stokes_ksp_monitor_blocks \ >> -mx 16 \ >> -model 3 >> >> gives nicely looking output. >> >> But! 
Repeating the same exercise with petsc-3.3, like this (actually, there is only one difference: factorization -> fact): >> >> mpiexec -np 4 ./ex42 \ >> -stokes_ksp_type gcr \ >> -stokes_ksp_rtol 1.0e-6 \ >> -stokes_pc_type fieldsplit \ >> -stokes_pc_fieldsplit_type SCHUR \ >> -stokes-pc_fieldsplit_schur_fact_type UPPER \ >> -stokes_fieldsplit_u_ksp_type fgmres \ >> -stokes_fieldsplit_u_ksp_rtol 1e-3 \ >> -stokes_fieldsplit_u_pc_type fieldsplit \ >> -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \ >> -stokes_fieldsplit_u_fieldsplit_pc_type ml \ >> -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \ >> -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \ >> -stokes_fieldsplit_p_ksp_type preonly \ >> -stokes_fieldsplit_p_pc_type jacobi \ >> -stokes_ksp_monitor_blocks \ >> -mx 16 \ >> -model 3 >> >> curses me hardly by claiming: >> >> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >> [0]PETSC ERROR: Object is in wrong state! >> [0]PETSC ERROR: Blocksize of x vector 1 does not match fieldsplit blocksize 3! >> [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 0, Tue Jun 5 14:20:42 CDT 2012 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./ex42 on a int32-deb named mac11-005.geo.uni-mainz.de by anton Thu Jun 28 17:06:53 2012 >> [0]PETSC ERROR: Libraries linked from /Users/anton/LIB/petsc-3.3-p0/int32-debug/lib >> [0]PETSC ERROR: Configure run at Tue Jun 12 15:32:21 2012 >> [0]PETSC ERROR: Configure options PETSC_DIR=/Users/anton/LIB/petsc-3.3-p0 PETSC_ARCH=int32-debug --download-f-blas-lapack=1 --with-debugging=1 --COPTFLAGS="-g -O0" --FOPTFLAGS="-g -O0" --CXXOPTFLAGS="-g -O0" --with-c++-support=1 --with-fortran=1 --with-fortran-kernels=1 --with-large-file-io=1 --with-mpi-compilers=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-ml=1 --download-hypre=1 --download-blacs=1 --download-scalapack=1 --download-metis=1 --download-parmetis=1 --download-mumps=1 --download-superlu_dist=1 >> [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0]PETSC ERROR: PCApply_FieldSplit() line 726 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/impls/fieldsplit/fieldsplit.c >> [0]PETSC ERROR: PCApply() line 384 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPFGMRESCycle() line 169 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gmres/fgmres/fgmres.c >> [0]PETSC ERROR: KSPSolve_FGMRES() line 294 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gmres/fgmres/fgmres.c >> [0]PETSC ERROR: KSPSolve() line 446 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: PCApply_FieldSplit_Schur() line 693 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/impls/fieldsplit/fieldsplit.c >> [0]PETSC ERROR: PCApply() line 384 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSolve_GCR_cycle() line 47 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gcr/gcr.c >> [0]PETSC ERROR: KSPSolve_GCR() line 117 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gcr/gcr.c >> [0]PETSC ERROR: KSPSolve() line 446 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: 
solve_stokes_3d_coupled() line 2045 in src/ksp/ksp/examples/tutorials/ex42.c >> [0]PETSC ERROR: main() line 2106 in src/ksp/ksp/examples/tutorials/ex42.c >> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >> >> A similar error appeared in our code after upgrading to petsc-3.3, and we're using similar functionality and options as I posted above. >> >> Please explain this issue. Any advice on how to get rid of the error is also appreciated. >> >> Thanks a lot >> >> Anton > ----------------------------------------------------------------------------- Boris J.P. Kaus Institute of Geosciences, Geocycles Research Center & Center for Computational Sciences. University of Mainz, Mainz, Germany Office: 00-285 Tel: +49.6131.392.4527 http://www.geophysik.uni-mainz.de ----------------------------------------------------------------------------- From hzhang at mcs.anl.gov Mon May 20 11:08:35 2013 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Mon, 20 May 2013 11:08:35 -0500 Subject: [petsc-users] h-FGMRES In-Reply-To: References: Message-ID: Gaetan: > > On a related note, I tried using the ASM version of the same approach; > that is -pc_type asm -pc_asm_blocks 4 with the remainder of the options > the same. This gives a message that the number of blocks is less than the > number of processors (sorry I don't have the exact message anymore). I get > this error with both mpiaij and mpibaij types. > Thanks for reporting it. The '_Multiproc' version of asm is not implemented yet, although it is logically the same as bjacobi_Multiproc. I added it to our 'to-do' list https://bitbucket.org/petsc/petsc/issue/42/implement-pcsetup_asm_multiproc > > Has this approach been implemented/do you think there would be any benefit > from the approach? > It would be beneficial for applications that give better convergence and > performance with asm than bjacobi. It is necessary to provide such support > in the petsc library. We'll let you know when it is implemented. Hong > [...]
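As an aside, the bjacobi route that does work today can also be set up in code rather than through the options file; a minimal sketch, assuming the KSP has already been created:

PC pc;
PetscErrorCode ierr;
ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCSetType(pc,PCBJACOBI);CHKERRQ(ierr);
ierr = PCBJacobiSetTotalBlocks(pc,4,NULL);CHKERRQ(ierr); /* same effect as -pc_bjacobi_blocks 4 */

Once the asm multiproc path is implemented, PCASMSetTotalSubdomains would presumably be the analogous call.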
-------------- next part -------------- An HTML attachment was scrubbed... URL: From gaetank at gmail.com Mon May 20 11:12:35 2013 From: gaetank at gmail.com (Gaetan Kenway) Date: Mon, 20 May 2013 12:12:35 -0400 Subject: [petsc-users] h-FGMRES In-Reply-To: References: Message-ID: Thanks for having a look at it. I look forward to giving it a try when it is implemented. And like bjacobi, support for mpibaij would be very beneficial to me. Thank you Gaetan On Mon, May 20, 2013 at 12:08 PM, Hong Zhang wrote: > Gaetan: > [...]
-------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Mon May 20 11:13:27 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 20 May 2013 11:13:27 -0500 Subject: [petsc-users] Recursive fieldsplit In-Reply-To: <5976EEB8-DFCC-4260-B124-1F46420D00CB@uni-mainz.de> References: <4FEC7BCD.7030803@uni-mainz.de> <5976EEB8-DFCC-4260-B124-1F46420D00CB@uni-mainz.de> Message-ID: <87bo85hbo8.fsf@mcs.anl.gov> "Kaus, Boris" writes: > Hi, > > Example 42 (src/ksp/ksp/examples/tutorials) with recursive fieldsplit used to work in petsc 3.2 and petsc 3.3-p2 with the patch suggested below by Barry. Damn, https://bitbucket.org/petsc/petsc/commits/ce780c64fa296d753b6ed81263eb2f3164d5f63f The two reference commits are: https://bitbucket.org/petsc/petsc/commits/5847835f01a83a65f10e398c0972b3c7a1e1c5f4 https://bitbucket.org/petsc/petsc/commits/4442dace165024532baeb2e4a78f8791bee44482 We cannot just force the new block size into the input vector because that breaks other tests. We can fix it with disgusting hacks: x_bs = x->map->bs; x->map->bs = jac->bs; VecStrideGatherAll(x,jac->x,INSERT_VALUES); x->map->bs = x_bs; or we can replace our use of VecStrideGatherAll and VecStrideScatterAll with something else (or add an argument to the VecStride functions so that we can pass the block size---traditional users would pass PETSC_DECIDE). I don't consider making PetscLayout intentionally mutable to be acceptable, because it is shared in many different ways.
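A sketch of what that extended interface could look like (hypothetical; this function does not exist in petsc today, and the name and the extra bs argument are made up for illustration):

/* Like VecStrideGatherAll(), but with the block size passed explicitly;
   PETSC_DECIDE would mean "use the block size from x's own layout". */
PetscErrorCode VecStrideGatherAllBS(Vec x,PetscInt bs,Vec y[],InsertMode addv);

With that, the fieldsplit code above could call VecStrideGatherAllBS(x,jac->bs,jac->x,INSERT_VALUES) instead of mutating x's layout around the VecStrideGatherAll() call.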
> mpiexec -np 4 ./ex42 \ > -stokes_ksp_type gcr \ > -stokes_ksp_rtol 1.0e-6 \ > -stokes_pc_type fieldsplit \ > -stokes_pc_fieldsplit_type SCHUR \ > -stokes_pc_fieldsplit_schur_factorization_type UPPER \ > -stokes_fieldsplit_u_ksp_type fgmres \ > -stokes_fieldsplit_u_ksp_rtol 1e-3 \ > -stokes_fieldsplit_u_pc_type fieldsplit \ > -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \ > -stokes_fieldsplit_u_fieldsplit_pc_type ml \ > -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \ > -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \ > -stokes_fieldsplit_p_ksp_type preonly \ > -stokes_fieldsplit_p_pc_type jacobi \ > -stokes_ksp_monitor_blocks \ > -mx 16 \ > -model 3 > > It no longer works in petsc 3.4.0. Is this something that can be fixed and potentially added to 3.4.1? > Our production code relies on similar functionality. > > thanks a lot! > > Boris > > > > On Jun 28, 2012, at 8:55 PM, Barry Smith wrote: > >> Anton, >> >> This came about because we are now being much more pedantic about the blocksizes of PETSc objects and not allowing them to be causally changed when they shouldn't be. >> >> You can resolve this problem by editing the file src/ksp/pc/impls/fieldsplit/fieldsplit.c locate the function >> >> #undef __FUNCT__ >> #define __FUNCT__ "PCApply_FieldSplit" >> static PetscErrorCode PCApply_FieldSplit(PC pc,Vec x,Vec y) >> { >> PC_FieldSplit *jac = (PC_FieldSplit*)pc->data; >> PetscErrorCode ierr; >> PC_FieldSplitLink ilink = jac->head; >> PetscInt cnt,bs; >> >> PetscFunctionBegin; >> >> and add the two lines right here >> >> x->map->bs = jac->bs; >> y->map->bs = jac->bs; >> >> >> then run make cmake in that directory. >> >> To resolve this permanently we will need to figure out how to insure those inner vectors are created with the correct block size. Are you willing to share your code with petsc-maint at mcs.anl.gov so that we can reproduce the problem and fix it properly for the long run? (The problem is in PETSc not in your code). >> >> Barry >> >> >> >> On Jun 28, 2012, at 10:44 AM, Anton Popov wrote: >> >>> Dear petsc team, >>> >>> I'm trying to use fieldsplit preconditioner for the velocity block in the Stokes system which is also preconditioned by >>> fieldsplit (kind of recursive). >>> >>> Running example 42 from src/ksp/ksp/examples/tutorials with petsc-3.2, as follows: >>> >>> mpiexec -np 4 ./ex42 \ >>> -stokes_ksp_type gcr \ >>> -stokes_ksp_rtol 1.0e-6 \ >>> -stokes_pc_type fieldsplit \ >>> -stokes_pc_fieldsplit_type SCHUR \ >>> -stokes_pc_fieldsplit_schur_factorization_type UPPER \ >>> -stokes_fieldsplit_u_ksp_type fgmres \ >>> -stokes_fieldsplit_u_ksp_rtol 1e-3 \ >>> -stokes_fieldsplit_u_pc_type fieldsplit \ >>> -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \ >>> -stokes_fieldsplit_u_fieldsplit_pc_type ml \ >>> -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \ >>> -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \ >>> -stokes_fieldsplit_p_ksp_type preonly \ >>> -stokes_fieldsplit_p_pc_type jacobi \ >>> -stokes_ksp_monitor_blocks \ >>> -mx 16 \ >>> -model 3 >>> >>> gives nicely looking output. >>> >>> But! 
Repeating the same exercise with petsc-3.3, like this (actually, there is only one difference: factorization -> fact): >>> >>> mpiexec -np 4 ./ex42 \ >>> -stokes_ksp_type gcr \ >>> -stokes_ksp_rtol 1.0e-6 \ >>> -stokes_pc_type fieldsplit \ >>> -stokes_pc_fieldsplit_type SCHUR \ >>> -stokes-pc_fieldsplit_schur_fact_type UPPER \ >>> -stokes_fieldsplit_u_ksp_type fgmres \ >>> -stokes_fieldsplit_u_ksp_rtol 1e-3 \ >>> -stokes_fieldsplit_u_pc_type fieldsplit \ >>> -stokes_fieldsplit_u_fieldsplit_ksp_type preonly \ >>> -stokes_fieldsplit_u_fieldsplit_pc_type ml \ >>> -stokes_fieldsplit_u_pc_fieldsplit_block_size 3 \ >>> -stokes_fieldsplit_u_pc_fieldsplit_type ADDITIVE \ >>> -stokes_fieldsplit_p_ksp_type preonly \ >>> -stokes_fieldsplit_p_pc_type jacobi \ >>> -stokes_ksp_monitor_blocks \ >>> -mx 16 \ >>> -model 3 >>> >>> curses me hardly by claiming: >>> >>> [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >>> [0]PETSC ERROR: Object is in wrong state! >>> [0]PETSC ERROR: Blocksize of x vector 1 does not match fieldsplit blocksize 3! >>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 0, Tue Jun 5 14:20:42 CDT 2012 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. >>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>> [0]PETSC ERROR: ./ex42 on a int32-deb named mac11-005.geo.uni-mainz.de by anton Thu Jun 28 17:06:53 2012 >>> [0]PETSC ERROR: Libraries linked from /Users/anton/LIB/petsc-3.3-p0/int32-debug/lib >>> [0]PETSC ERROR: Configure run at Tue Jun 12 15:32:21 2012 >>> [0]PETSC ERROR: Configure options PETSC_DIR=/Users/anton/LIB/petsc-3.3-p0 PETSC_ARCH=int32-debug --download-f-blas-lapack=1 --with-debugging=1 --COPTFLAGS="-g -O0" --FOPTFLAGS="-g -O0" --CXXOPTFLAGS="-g -O0" --with-c++-support=1 --with-fortran=1 --with-fortran-kernels=1 --with-large-file-io=1 --with-mpi-compilers=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-ml=1 --download-hypre=1 --download-blacs=1 --download-scalapack=1 --download-metis=1 --download-parmetis=1 --download-mumps=1 --download-superlu_dist=1 >>> [0]PETSC ERROR: ------------------------------------------------------------------------ >>> [0]PETSC ERROR: PCApply_FieldSplit() line 726 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>> [0]PETSC ERROR: PCApply() line 384 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPFGMRESCycle() line 169 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gmres/fgmres/fgmres.c >>> [0]PETSC ERROR: KSPSolve_FGMRES() line 294 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gmres/fgmres/fgmres.c >>> [0]PETSC ERROR: KSPSolve() line 446 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: PCApply_FieldSplit_Schur() line 693 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/impls/fieldsplit/fieldsplit.c >>> [0]PETSC ERROR: PCApply() line 384 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSolve_GCR_cycle() line 47 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gcr/gcr.c >>> [0]PETSC ERROR: KSPSolve_GCR() line 117 in /Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/impls/gcr/gcr.c >>> [0]PETSC ERROR: KSPSolve() line 446 in 
/Users/anton/LIB/petsc-3.3-p0/src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: solve_stokes_3d_coupled() line 2045 in src/ksp/ksp/examples/tutorials/ex42.c >>> [0]PETSC ERROR: main() line 2106 in src/ksp/ksp/examples/tutorials/ex42.c >>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0 >>> >>> Similar error appeared in our code after upgrading to petsc-3.3, and we're using similar functionality and options as I posted above. >>> >>> Please explain this issue. An advice how to get rid of the error is also appreciated. >>> >>> Thanks a lot >>> >>> Anton >> > > > > ----------------------------------------------------------------------------- > Boris J.P. Kaus > > Institute of Geosciences, > Geocycles Research Center & > Center for Computational Sciences. > University of Mainz, Mainz, Germany > Office: 00-285 > Tel: +49.6131.392.4527 > > http://www.geophysik.uni-mainz.de > ----------------------------------------------------------------------------- From mpovolot at purdue.edu Mon May 20 13:25:49 2013 From: mpovolot at purdue.edu (Michael Povolotskyi) Date: Mon, 20 May 2013 14:25:49 -0400 Subject: [petsc-users] PARDISO and petsc Message-ID: <519A6AAD.7040805@purdue.edu> Hello everybody, does PETSc support interface to MKL PARDISO linear solver? I did not find PARDISO in the documentation of PETSc, but may be somebody tried this out already? Thank you, Michael. -- Michael Povolotskyi, PhD Research Assistant Professor Network for Computational Nanotechnology 207 S Martin Jischke Drive Purdue University, DLR, room 441-10 West Lafayette, Indiana 47907 phone: +1-765-494-9396 fax: +1-765-496-6026 From jedbrown at mcs.anl.gov Mon May 20 13:31:45 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 20 May 2013 13:31:45 -0500 Subject: [petsc-users] PARDISO and petsc In-Reply-To: <519A6AAD.7040805@purdue.edu> References: <519A6AAD.7040805@purdue.edu> Message-ID: <874ndxh59q.fsf@mcs.anl.gov> Michael Povolotskyi writes: > Hello everybody, > does PETSc support interface to MKL PARDISO linear solver? > I did not find PARDISO in the documentation of PETSc, but may be > somebody tried this out already? Licensing is the main reason I have had no motivation to write an interface. We would accept patches if someone would like to write an interface. It should not be difficult and we can advise if you get stuck. From mpovolot at purdue.edu Mon May 20 13:37:37 2013 From: mpovolot at purdue.edu (Michael Povolotskyi) Date: Mon, 20 May 2013 14:37:37 -0400 Subject: [petsc-users] PARDISO and petsc In-Reply-To: <874ndxh59q.fsf@mcs.anl.gov> References: <519A6AAD.7040805@purdue.edu> <874ndxh59q.fsf@mcs.anl.gov> Message-ID: <519A6D71.9040803@purdue.edu> On 05/20/2013 02:31 PM, Jed Brown wrote: > Michael Povolotskyi writes: > >> Hello everybody, >> does PETSc support interface to MKL PARDISO linear solver? >> I did not find PARDISO in the documentation of PETSc, but may be >> somebody tried this out already? > Licensing is the main reason I have had no motivation to write an > interface. We would accept patches if someone would like to write an > interface. It should not be difficult and we can advise if you get > stuck. Sounds great. Yes, I'm going to write an interface to the MPI version of PARDISO. Will be in touch, Michael. 
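For what it's worth, the existing external direct solvers show the shape such an interface takes at runtime; with the MUMPS interface, for example, one selects it like this (option names as in PETSc 3.4):

   -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps

A PARDISO interface would presumably plug into the same MatGetFactor mechanism, so one would expect a hypothetical -pc_factor_mat_solver_package pardiso once it is written.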
-- Michael Povolotskyi, PhD Research Assistant Professor Network for Computational Nanotechnology 207 S Martin Jischke Drive Purdue University, DLR, room 441-10 West Lafayette, Indiana 47907 phone: +1-765-494-9396 fax: +1-765-496-6026 From mpovolot at purdue.edu Mon May 20 14:16:13 2013 From: mpovolot at purdue.edu (Michael Povolotskyi) Date: Mon, 20 May 2013 15:16:13 -0400 Subject: [petsc-users] SNESSolve question Message-ID: <519A767D.1000705@purdue.edu> Hello, does SNESSolve closes the solution vector? Thank you, Michael. -- Michael Povolotskyi, PhD Research Assistant Professor Network for Computational Nanotechnology 207 S Martin Jischke Drive Purdue University, DLR, room 441-10 West Lafayette, Indiana 47907 phone: +1-765-494-9396 fax: +1-765-496-6026 From jedbrown at mcs.anl.gov Mon May 20 14:18:26 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 20 May 2013 14:18:26 -0500 Subject: [petsc-users] SNESSolve question In-Reply-To: <519A767D.1000705@purdue.edu> References: <519A767D.1000705@purdue.edu> Message-ID: <871u91h33x.fsf@mcs.anl.gov> Michael Povolotskyi writes: > Hello, > does SNESSolve closes the solution vector? What do you mean "closes"? From mpovolot at purdue.edu Mon May 20 14:43:06 2013 From: mpovolot at purdue.edu (Michael Povolotskyi) Date: Mon, 20 May 2013 15:43:06 -0400 Subject: [petsc-users] SNESSolve question In-Reply-To: <871u91h33x.fsf@mcs.anl.gov> References: <519A767D.1000705@purdue.edu> <871u91h33x.fsf@mcs.anl.gov> Message-ID: <519A7CCA.1050804@purdue.edu> On 05/20/2013 03:18 PM, Jed Brown wrote: > Michael Povolotskyi writes: > >> Hello, >> does SNESSolve closes the solution vector? > What do you mean "closes"? I mean VecAssemblyBegin() and VecAssemblyEnd() -- Michael Povolotskyi, PhD Research Assistant Professor Network for Computational Nanotechnology 207 S Martin Jischke Drive Purdue University, DLR, room 441-10 West Lafayette, Indiana 47907 phone: +1-765-494-9396 fax: +1-765-496-6026 From jedbrown at mcs.anl.gov Mon May 20 14:46:02 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 20 May 2013 14:46:02 -0500 Subject: [petsc-users] SNESSolve question In-Reply-To: <519A7CCA.1050804@purdue.edu> References: <519A767D.1000705@purdue.edu> <871u91h33x.fsf@mcs.anl.gov> <519A7CCA.1050804@purdue.edu> Message-ID: <87vc6dfn9h.fsf@mcs.anl.gov> Michael Povolotskyi writes: > On 05/20/2013 03:18 PM, Jed Brown wrote: >> Michael Povolotskyi writes: >> >>> Hello, >>> does SNESSolve closes the solution vector? >> What do you mean "closes"? > I mean VecAssemblyBegin() and VecAssemblyEnd() It's not relevant because the solver doesn't use that interface. The Vec is in a normal state. From choi240 at purdue.edu Mon May 20 14:54:29 2013 From: choi240 at purdue.edu (Joon hee Choi) Date: Mon, 20 May 2013 15:54:29 -0400 (EDT) Subject: [petsc-users] Creating and indexing large matrix with indexes exceeding the limit of the PetscInt In-Reply-To: <1846949326.136047.1369078675114.JavaMail.root@mailhub028.itcs.purdue.edu> Message-ID: <1129128200.136075.1369079669492.JavaMail.root@mailhub028.itcs.purdue.edu> Hello all, I am setting up very large matrix(10^7 x 10^15) using MatCreateSeqAIJ and MatSetValues. However, in my computer, the maximum of PetscInt is about 4*10^9. So I cannot express the size and index of the matrix. By the way, I think I found a way from PETSC doc, and the doc is as follows: * CHANGES in PETSc 2.2.1 - Introduced 4 new PETSc data types: PetscInt, PetscErrorCode, PetscMPIInt and PetscBLASInt. 
For 99% of users these are just int or integer*4 and you do not need to change your code.
- For users with more than roughly 2 billion unknowns you can run configure with --with-64-bit-ints and then PetscInt will represent 64 bit integers, long long int in C and integer*8 in Fortran. But the other 3 types remain 32 bit (i.e. int in C and integer*4 in Fortran). Now you can index into vectors and matrices with virtually unlimited sizes.

However, I don't understand what this means exactly. How can I index into a matrix with virtually unlimited sizes? Also, how can I use MatCreateSeqAIJ(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt nz,const PetscInt nnz[],Mat *A) with n=10^15? If you know the solution of this, then let me know.

Thank you,

Joon

From jedbrown at mcs.anl.gov Mon May 20 14:56:46 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Mon, 20 May 2013 14:56:46 -0500
Subject: Re: [petsc-users] Creating and indexing large matrix with indexes exceeding the limit of the PetscInt
In-Reply-To: <1129128200.136075.1369079669492.JavaMail.root@mailhub028.itcs.purdue.edu>
References: <1129128200.136075.1369079669492.JavaMail.root@mailhub028.itcs.purdue.edu>
Message-ID: <87sj1hfmrl.fsf@mcs.anl.gov>

Joon hee Choi writes:

> Hello all,
>
> I am setting up very large matrix(10^7 x 10^15) using MatCreateSeqAIJ
> and MatSetValues. However, in my computer, the maximum of PetscInt is
> about 4*10^9. So I cannot express the size and index of the matrix.

http://www.mcs.anl.gov/petsc/documentation/faq.html#with-64-bit-indices

> By the way, I think I found a way from PETSC doc, and the doc is as
> follows:
>
> * CHANGES in PETSc 2.2.1
> - Introduced 4 new PETSc data types: PetscInt, PetscErrorCode, PetscMPIInt and PetscBLASInt. For 99% of users these are just int or integer*4 and you do not need to change your code.
> - For users with more than roughly 2 billion unknowns you can run configure with --with-64-bit-ints and then PetscInt will represent 64 bit integers, long long int in C and integer*8 in Fortran. But the other 3 types remain 32 bit (i.e. int in C and integer*4 in Fortran). Now you can index into vectors and matrices with virtually unlimited sizes.
>
> However, I don't understand what this means exactly. How can I index into a matrix with virtually unlimited sizes? Also, how can I use MatCreateSeqAIJ(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt nz,const PetscInt nnz[],Mat *A) with n=10^15? If you know the solution of this, then let me know.
>
> Thank you,
>
> Joon

From jianchaoge at gmail.com Mon May 20 17:47:19 2013
From: jianchaoge at gmail.com (Gavin Ge)
Date: Mon, 20 May 2013 17:47:19 -0500
Subject: [petsc-users] Issue about using MUMPS with PETSc
Message-ID: <69D8851F-6DFD-4BD6-A835-A1366002F095@gmail.com>

Hi:

I just have my PETSc updated from 3.1 to 3.4, then find problem with using MUMPs. I have no problem before in 3.1 and also no problem with builtin iterative solvers for current version.
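Going back to Joon's 10^7 x 10^15 case above: the practical upshot of that FAQ entry is to configure PETSc with --with-64-bit-indices (the current spelling of the old --with-64-bit-ints option), after which PetscInt is 64-bit and the same MatCreateSeqAIJ call can express those sizes. A minimal sketch (untested; the per-row preallocation of 5 nonzeros is an arbitrary placeholder):

   Mat      A;
   PetscInt m = 10000000;              /* 10^7 rows */
   PetscInt n = 1000000000000000;      /* 10^15 columns; representable only with 64-bit PetscInt */

   MatCreateSeqAIJ(PETSC_COMM_SELF,m,n,5,NULL,&A);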
I got the following error message: (0): ERROR: stratParserParse: invalid method parameter name "type", before "h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=0,cmax=100000,frat=0.0},ose=g},unc=n{sep=/(vert>120)?m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=15,cmax=100000,frat=0.0},ose=g}}" (0): ERROR: SCOTCH_stratGraphOrder: error in ordering strategy Error:no root nodes in ROOTLIST Error in DISTRIBUTE , layernmb= 0 procedure reporting the error: ROOTLIST [48]PETSC ERROR: --------------------- Error Message ------------------------------------ [48]PETSC ERROR: Error in external library! [48]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 ! [48]PETSC ERROR: ------------------------------------------------------------------------ [48]PETSC ERROR: Petsc Release Version 3.4.0, May, 13, 2013 [48]PETSC ERROR: See docs/changes/index.html for recent updates. [48]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [48]PETSC ERROR: See docs/index.html for manual pages. [48]PETSC ERROR: ------------------------------------------------------------------------ [48]PETSC ERROR: ./code on a linux-gnu-c-debug named bhc.tamu.edu by jge Mon May 20 17:24:22 2013 [48]PETSC ERROR: Libraries linked from /data/jge/petsc/linux-gnu-c-debug/lib [48]PETSC ERROR: Configure run at Thu May 16 17:15:26 2013 [48]PETSC ERROR: Configure options --with-cc=gcc --with-fc-gfortran --download-f-blas-lapack --download-mpich --download-scalapack --download-ptscotch --download-mumps --with-scalar-type=complex [48]PETSC ERROR: ------------------------------------------------------------------------ [48]PETSC ERROR: MatLUFactorSymbolic_AIJMUMPS() line 960 in src/mat/impls/aij/mpi/mumps/mumps.c [49]PETSC ERROR: --------------------- Error Message ------------------------------------ [49]PETSC ERROR: Error in external library! [49]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 ! Could anyone help me solve this? Thanks! Regards, Gavin From knepley at gmail.com Mon May 20 18:07:24 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 20 May 2013 18:07:24 -0500 Subject: [petsc-users] Issue about using MUMPS with PETSc In-Reply-To: <69D8851F-6DFD-4BD6-A835-A1366002F095@gmail.com> References: <69D8851F-6DFD-4BD6-A835-A1366002F095@gmail.com> Message-ID: On Mon, May 20, 2013 at 5:47 PM, Gavin Ge wrote: > Hi: > > I just have my PETSc updated from 3.1 to 3.4, then find problem with using > MUMPs. I have no problem before in 3.1 and also no problem with builtin > iterative solvers for current version. 
I got the following error message: > > (0): ERROR: stratParserParse: invalid method parameter name "type", before > "h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=0,cmax=100000,frat=0.0},ose=g},unc=n{sep=/(vert>120)?m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=15,cmax=100000,frat=0.0},ose=g}}" > (0): ERROR: SCOTCH_stratGraphOrder: error in ordering strategy > This appears to be an internal MUMPS error related to their use of Scotch for graph partitioning. I would send this to the MUMPS user list. Thanks, Matt > Error:no root nodes in ROOTLIST > Error in DISTRIBUTE , layernmb= > 0 > procedure reporting the error: ROOTLIST > [48]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [48]PETSC ERROR: Error in external library! > [48]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 > ! > [48]PETSC ERROR: > ------------------------------------------------------------------------ > [48]PETSC ERROR: Petsc Release Version 3.4.0, May, 13, 2013 > [48]PETSC ERROR: See docs/changes/index.html for recent updates. > [48]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [48]PETSC ERROR: See docs/index.html for manual pages. > [48]PETSC ERROR: > ------------------------------------------------------------------------ > [48]PETSC ERROR: ./code on a linux-gnu-c-debug named bhc.tamu.edu by jge > Mon May 20 17:24:22 2013 > [48]PETSC ERROR: Libraries linked from > /data/jge/petsc/linux-gnu-c-debug/lib > [48]PETSC ERROR: Configure run at Thu May 16 17:15:26 2013 > [48]PETSC ERROR: Configure options --with-cc=gcc --with-fc-gfortran > --download-f-blas-lapack --download-mpich --download-scalapack > --download-ptscotch --download-mumps --with-scalar-type=complex > [48]PETSC ERROR: > ------------------------------------------------------------------------ > [48]PETSC ERROR: MatLUFactorSymbolic_AIJMUMPS() line 960 in > src/mat/impls/aij/mpi/mumps/mumps.c > [49]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [49]PETSC ERROR: Error in external library! > [49]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 > ! > > Could anyone help me solve this? Thanks! > > Regards, > Gavin > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon May 20 18:14:23 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 20 May 2013 18:14:23 -0500 (CDT) Subject: [petsc-users] Issue about using MUMPS with PETSc In-Reply-To: References: <69D8851F-6DFD-4BD6-A835-A1366002F095@gmail.com> Message-ID: And petsc-3.1 presumably was using 'metis' - and not 'scotch' - so you could try rebuilding mumps with metis/parmetis. right now we do: if self.parmetis.found: orderingsc += ' -Dmetis -Dparmetis' Should this splitup to a separate dependency on metis? 
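A split version of that fragment might look like this (an untested sketch in the configure module's Python, assuming a separate self.metis dependency object is added alongside self.parmetis):

   if self.metis.found:
     orderingsc += ' -Dmetis'
   if self.parmetis.found:
     orderingsc += ' -Dparmetis'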
Satish On Mon, 20 May 2013, Matthew Knepley wrote: > On Mon, May 20, 2013 at 5:47 PM, Gavin Ge wrote: > > > Hi: > > > > I just have my PETSc updated from 3.1 to 3.4, then find problem with using > > MUMPs. I have no problem before in 3.1 and also no problem with builtin > > iterative solvers for current version. I got the following error message: > > > > (0): ERROR: stratParserParse: invalid method parameter name "type", before > > "h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=0,cmax=100000,frat=0.0},ose=g},unc=n{sep=/(vert>120)?m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=15,cmax=100000,frat=0.0},ose=g}}" > > (0): ERROR: SCOTCH_stratGraphOrder: error in ordering strategy > > > > This appears to be an internal MUMPS error related to their use of Scotch > for graph partitioning. I would > send this to the MUMPS user list. > > Thanks, > > Matt > > > > Error:no root nodes in ROOTLIST > > Error in DISTRIBUTE , layernmb= > > 0 > > procedure reporting the error: ROOTLIST > > [48]PETSC ERROR: --------------------- Error Message > > ------------------------------------ > > [48]PETSC ERROR: Error in external library! > > [48]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 > > ! > > [48]PETSC ERROR: > > ------------------------------------------------------------------------ > > [48]PETSC ERROR: Petsc Release Version 3.4.0, May, 13, 2013 > > [48]PETSC ERROR: See docs/changes/index.html for recent updates. > > [48]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > [48]PETSC ERROR: See docs/index.html for manual pages. > > [48]PETSC ERROR: > > ------------------------------------------------------------------------ > > [48]PETSC ERROR: ./code on a linux-gnu-c-debug named bhc.tamu.edu by jge > > Mon May 20 17:24:22 2013 > > [48]PETSC ERROR: Libraries linked from > > /data/jge/petsc/linux-gnu-c-debug/lib > > [48]PETSC ERROR: Configure run at Thu May 16 17:15:26 2013 > > [48]PETSC ERROR: Configure options --with-cc=gcc --with-fc-gfortran > > --download-f-blas-lapack --download-mpich --download-scalapack > > --download-ptscotch --download-mumps --with-scalar-type=complex > > [48]PETSC ERROR: > > ------------------------------------------------------------------------ > > [48]PETSC ERROR: MatLUFactorSymbolic_AIJMUMPS() line 960 in > > src/mat/impls/aij/mpi/mumps/mumps.c > > [49]PETSC ERROR: --------------------- Error Message > > ------------------------------------ > > [49]PETSC ERROR: Error in external library! > > [49]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 > > ! > > > > Could anyone help me solve this? Thanks! > > > > Regards, > > Gavin > > > > > > > > > > > > > > > > > From suifengls at gmail.com Mon May 20 22:38:29 2013 From: suifengls at gmail.com (Longxiang Chen) Date: Mon, 20 May 2013 20:38:29 -0700 Subject: [petsc-users] Set the options NOT from argc and argv Message-ID: To whom it may concern, I am using KSP to solve Ax=b. The main() is in Fortran, and it calls a function I write in C.The parameter is array A, x, b. 
void P_solve(double x[ ], double b[ ], double A[ ], int size); In the function, I should call PetscInitialize() before I create the matrix and vectors for A, x, b, and also call MatSetFromOptions() and VecSetFromOptions(). But I don't have the argc and argv from main function. I just want to fix the KSP type to bcgs and the PC type. Is there another way that I can set the options not through argc and argv, just set them in the program. e.g. options[] = {"-ksp_type", "bcgs"}. Thanks, Longxiang -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon May 20 22:52:59 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 20 May 2013 22:52:59 -0500 Subject: [petsc-users] Set the options NOT from argc and argv In-Reply-To: References: Message-ID: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov> On May 20, 2013, at 10:38 PM, Longxiang Chen wrote: > To whom it may concern, > > I am using KSP to solve Ax=b. > The main() is in Fortran, and it calls a function I write in C.The parameter is array A, x, b. > > void P_solve(double x[ ], double b[ ], double A[ ], int size); > > In the function, I should call PetscInitialize() before I create the matrix and vectors for A, x, b, and also call MatSetFromOptions() and VecSetFromOptions(). > > But I don't have the argc and argv from main function. If Fortran is the main program the options database still has access to the command line arguments. You should still be able to use command line arguments and not need to set them in the program. Does this not work? Can you send a sample program where it does not work? Barry > > I just want to fix the KSP type to bcgs and the PC type. > Is there another way that I can set the options not through argc and argv, just set them in the program. > > e.g. options[] = {"-ksp_type", "bcgs"}. > > Thanks, > Longxiang From suifengls at gmail.com Mon May 20 23:01:33 2013 From: suifengls at gmail.com (Longxiang Chen) Date: Mon, 20 May 2013 21:01:33 -0700 Subject: [petsc-users] Set the options NOT from argc and argv In-Reply-To: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov> References: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov> Message-ID: The fortran program is too long (several files with more than 10,000 lines). And I just want to insert the P_solve inside one of the subroutine to solve Ax=b. Or it can only use argc and argv? Thanks. Longxiang Best regards, Longxiang Chen Do something every day that gets you closer to being done. -------------------------------------------------------------- 465 Winston Chung Hall Computer Science Engineering University of California, Riverside On Mon, May 20, 2013 at 8:52 PM, Barry Smith wrote: > > On May 20, 2013, at 10:38 PM, Longxiang Chen wrote: > > > To whom it may concern, > > > > I am using KSP to solve Ax=b. > > The main() is in Fortran, and it calls a function I write in C.The > parameter is array A, x, b. > > > > void P_solve(double x[ ], double b[ ], double A[ ], int size); > > > > In the function, I should call PetscInitialize() before I create the > matrix and vectors for A, x, b, and also call MatSetFromOptions() and > VecSetFromOptions(). > > > > But I don't have the argc and argv from main function. > > If Fortran is the main program the options database still has access > to the command line arguments. You should still be able to use command line > arguments and not need to set them in the program. > > Does this not work? Can you send a sample program where it does not > work? 
> > Barry > > > > > I just want to fix the KSP type to bcgs and the PC type. > > Is there another way that I can set the options not through argc and > argv, just set them in the program. > > > > e.g. options[] = {"-ksp_type", "bcgs"}. > > > > Thanks, > > Longxiang > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon May 20 23:10:04 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 20 May 2013 23:10:04 -0500 Subject: [petsc-users] Set the options NOT from argc and argv In-Reply-To: References: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov> Message-ID: <9B952647-B602-4771-B6C1-AF77C35FE958@mcs.anl.gov> PetscOptionsSetValue() call it for each option you want to set BEFORE the option will get used On May 20, 2013, at 11:01 PM, Longxiang Chen wrote: > The fortran program is too long (several files with more than 10,000 lines). > And I just want to insert the P_solve inside one of the subroutine to solve Ax=b. > Or it can only use argc and argv? > > Thanks. > Longxiang > > Best regards, > Longxiang Chen > > Do something every day that gets you closer to being done. > -------------------------------------------------------------- > 465 Winston Chung Hall > Computer Science Engineering > University of California, Riverside > > > > On Mon, May 20, 2013 at 8:52 PM, Barry Smith wrote: > > On May 20, 2013, at 10:38 PM, Longxiang Chen wrote: > > > To whom it may concern, > > > > I am using KSP to solve Ax=b. > > The main() is in Fortran, and it calls a function I write in C.The parameter is array A, x, b. > > > > void P_solve(double x[ ], double b[ ], double A[ ], int size); > > > > In the function, I should call PetscInitialize() before I create the matrix and vectors for A, x, b, and also call MatSetFromOptions() and VecSetFromOptions(). > > > > But I don't have the argc and argv from main function. > > If Fortran is the main program the options database still has access to the command line arguments. You should still be able to use command line arguments and not need to set them in the program. > > Does this not work? Can you send a sample program where it does not work? > > Barry > > > > > I just want to fix the KSP type to bcgs and the PC type. > > Is there another way that I can set the options not through argc and argv, just set them in the program. > > > > e.g. options[] = {"-ksp_type", "bcgs"}. > > > > Thanks, > > Longxiang > > From jianchaoge at gmail.com Mon May 20 23:14:03 2013 From: jianchaoge at gmail.com (Gavin Ge) Date: Mon, 20 May 2013 23:14:03 -0500 Subject: [petsc-users] Issue about using MUMPS with PETSc In-Reply-To: References: <69D8851F-6DFD-4BD6-A835-A1366002F095@gmail.com> Message-ID: Thanks Matt. Hope they can get it work. Gavin On May 20, 2013, at 6:07 PM, Matthew Knepley wrote: > On Mon, May 20, 2013 at 5:47 PM, Gavin Ge wrote: > Hi: > > I just have my PETSc updated from 3.1 to 3.4, then find problem with using MUMPs. I have no problem before in 3.1 and also no problem with builtin iterative solvers for current version. 
I got the following error message: > > (0): ERROR: stratParserParse: invalid method parameter name "type", before "h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=0,cmax=100000,frat=0.0},ose=g},unc=n{sep=/(vert>120)?m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=15,cmax=100000,frat=0.0},ose=g}}" > (0): ERROR: SCOTCH_stratGraphOrder: error in ordering strategy > > This appears to be an internal MUMPS error related to their use of Scotch for graph partitioning. I would > send this to the MUMPS user list. > > Thanks, > > Matt > > Error:no root nodes in ROOTLIST > Error in DISTRIBUTE , layernmb= 0 > procedure reporting the error: ROOTLIST > [48]PETSC ERROR: --------------------- Error Message ------------------------------------ > [48]PETSC ERROR: Error in external library! > [48]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 > ! > [48]PETSC ERROR: ------------------------------------------------------------------------ > [48]PETSC ERROR: Petsc Release Version 3.4.0, May, 13, 2013 > [48]PETSC ERROR: See docs/changes/index.html for recent updates. > [48]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [48]PETSC ERROR: See docs/index.html for manual pages. > [48]PETSC ERROR: ------------------------------------------------------------------------ > [48]PETSC ERROR: ./code on a linux-gnu-c-debug named bhc.tamu.edu by jge Mon May 20 17:24:22 2013 > [48]PETSC ERROR: Libraries linked from /data/jge/petsc/linux-gnu-c-debug/lib > [48]PETSC ERROR: Configure run at Thu May 16 17:15:26 2013 > [48]PETSC ERROR: Configure options --with-cc=gcc --with-fc-gfortran --download-f-blas-lapack --download-mpich --download-scalapack --download-ptscotch --download-mumps --with-scalar-type=complex > [48]PETSC ERROR: ------------------------------------------------------------------------ > [48]PETSC ERROR: MatLUFactorSymbolic_AIJMUMPS() line 960 in src/mat/impls/aij/mpi/mumps/mumps.c > [49]PETSC ERROR: --------------------- Error Message ------------------------------------ > [49]PETSC ERROR: Error in external library! > [49]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 > ! > > Could anyone help me solve this? Thanks! > > Regards, > Gavin > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jianchaoge at gmail.com Mon May 20 23:17:15 2013 From: jianchaoge at gmail.com (Gavin Ge) Date: Mon, 20 May 2013 23:17:15 -0500 Subject: [petsc-users] Issue about using MUMPS with PETSc In-Reply-To: References: <69D8851F-6DFD-4BD6-A835-A1366002F095@gmail.com> Message-ID: Hi Satish: Yes I used to use metis. But this update require a cmake 2.5 or higher, which I don't have a privilege to install on the cluster.. But I think you are right, this problem could be fixed by install metis instead. Thanks! Gavin On May 20, 2013, at 6:14 PM, Satish Balay wrote: > And petsc-3.1 presumably was using 'metis' - and not 'scotch' - so you > could try rebuilding mumps with metis/parmetis. 
> > right now we do: > > if self.parmetis.found: > orderingsc += ' -Dmetis -Dparmetis' > > Should this splitup to a separate dependency on metis? > > Satish > > > On Mon, 20 May 2013, Matthew Knepley wrote: > >> On Mon, May 20, 2013 at 5:47 PM, Gavin Ge wrote: >> >>> Hi: >>> >>> I just have my PETSc updated from 3.1 to 3.4, then find problem with using >>> MUMPs. I have no problem before in 3.1 and also no problem with builtin >>> iterative solvers for current version. I got the following error message: >>> >>> (0): ERROR: stratParserParse: invalid method parameter name "type", before >>> "h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=0,cmax=100000,frat=0.0},ose=g},unc=n{sep=/(vert>120)?m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}}|m{type=h,rat=0.7,vert=100,low=h{pass=10},asc=b{width=3,bnd=f{bal=0.2},org=(|h{pass=10})f{bal=0.2}}};,ole=f{cmin=15,cmax=100000,frat=0.0},ose=g}}" >>> (0): ERROR: SCOTCH_stratGraphOrder: error in ordering strategy >>> >> >> This appears to be an internal MUMPS error related to their use of Scotch >> for graph partitioning. I would >> send this to the MUMPS user list. >> >> Thanks, >> >> Matt >> >> >>> Error:no root nodes in ROOTLIST >>> Error in DISTRIBUTE , layernmb= >>> 0 >>> procedure reporting the error: ROOTLIST >>> [48]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [48]PETSC ERROR: Error in external library! >>> [48]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 >>> ! >>> [48]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [48]PETSC ERROR: Petsc Release Version 3.4.0, May, 13, 2013 >>> [48]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [48]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [48]PETSC ERROR: See docs/index.html for manual pages. >>> [48]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [48]PETSC ERROR: ./code on a linux-gnu-c-debug named bhc.tamu.edu by jge >>> Mon May 20 17:24:22 2013 >>> [48]PETSC ERROR: Libraries linked from >>> /data/jge/petsc/linux-gnu-c-debug/lib >>> [48]PETSC ERROR: Configure run at Thu May 16 17:15:26 2013 >>> [48]PETSC ERROR: Configure options --with-cc=gcc --with-fc-gfortran >>> --download-f-blas-lapack --download-mpich --download-scalapack >>> --download-ptscotch --download-mumps --with-scalar-type=complex >>> [48]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [48]PETSC ERROR: MatLUFactorSymbolic_AIJMUMPS() line 960 in >>> src/mat/impls/aij/mpi/mumps/mumps.c >>> [49]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [49]PETSC ERROR: Error in external library! >>> [49]PETSC ERROR: Error reported by MUMPS in analysis phase: INFOG(1)=-135 >>> ! >>> >>> Could anyone help me solve this? Thanks! >>> >>> Regards, >>> Gavin >>> >>> >>> >>> >>> >>> >>> >> >> >> > From jedbrown at mcs.anl.gov Mon May 20 23:21:37 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 20 May 2013 23:21:37 -0500 Subject: [petsc-users] Issue about using MUMPS with PETSc In-Reply-To: References: <69D8851F-6DFD-4BD6-A835-A1366002F095@gmail.com> Message-ID: <87a9npdktq.fsf@mcs.anl.gov> Gavin Ge writes: > Hi Satish: > > Yes I used to use metis. 
But this update require a cmake 2.5 or > higher, which I don't have a privilege to install on the cluster.. Common misconception, but you can install that stuff just fine. Configure PETSc with --download-cmake, for example. I recommend using --download-metis because we have patched several upstream bugs. (Upstream has pretty much stopped maintaining the package.) From jianchaoge at gmail.com Mon May 20 23:45:39 2013 From: jianchaoge at gmail.com (Gavin Ge) Date: Mon, 20 May 2013 23:45:39 -0500 Subject: [petsc-users] Issue about using MUMPS with PETSc In-Reply-To: <87a9npdktq.fsf@mcs.anl.gov> References: <69D8851F-6DFD-4BD6-A835-A1366002F095@gmail.com> <87a9npdktq.fsf@mcs.anl.gov> Message-ID: <5D411F6B-DED3-4D66-8E30-98BA8ABC296B@gmail.com> Thanks. It works On May 20, 2013, at 11:21 PM, Jed Brown wrote: > Common misconception, but you can install that stuff just fine. > Configure PETSc with --download-cmake, for example. -------------- next part -------------- An HTML attachment was scrubbed... URL: From hao.yu at peraglobal.com Tue May 21 04:52:23 2013 From: hao.yu at peraglobal.com (=?gb2312?B?0+C6xg==?=) Date: Tue, 21 May 2013 17:52:23 +0800 Subject: [petsc-users] =?gb2312?b?tPC4tDogILTwuLQ6ILTwuLQ6ICC08Li0OiC08Li0?= =?gb2312?b?OiDXqreiOiBQRVRzYyBwcm9ibGVt?= In-Reply-To: References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn>, Message-ID: <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn> It still has problem. I open VS2010 cmd, then login in bash, then configure following the instruction, it shows "if you use batch system, you should ./configure --with-batch. otherwise, the complier has some problem...." But cl really works. Thanks! Hao ________________________________________ ???: Satish Balay [balay at mcs.anl.gov] ????: 2013?5?20? 23:23 ???: ?? ??: petsc-users at mcs.anl.gov ??: Re: [petsc-users] ??: ??: ??: ??: ??: PETsc problem On Mon, 20 May 2013, ?? wrote: > I can use cl to compile file under Windows. But even if I include path which cl is in in the enviromental variable of Cygwin, it still shows "Win32fe cl" does not work. Have you checked the windows installation instructions? https://www.mcs.anl.gov/petsc/documentation/installation.html#windows You should start the 'compiler cmd' - and then run 'bash --login' inside it [and not 'set cygwin path to include cl'] Satish > The attached is the configure.log and make.log > Thanks! > > Hao > ________________________________ > ???: Matthew Knepley [knepley at gmail.com] > ????: 2013?5?11? 4:32 > ???: ??; petsc-users at mcs.anl.gov > ??: Re: ??: [petsc-users] ??: ??: ??: PETsc problem > > On Fri, May 10, 2013 at 12:11 PM, ?? > wrote: > What do you mean by compiling something with it ? I don't know. you mean 'win32fe cl' is not installed correctly? > > I mean, can you compile a file using cl from the command line? > > Matt > > > > Hao > ________________________________ > ???: Matthew Knepley [knepley at gmail.com] > ????: 2013?5?10? 19:55 > ???: ?? > ??: petsc-users at mcs.anl.gov > ??: Re: [petsc-users] ??: ??: ??: PETsc problem > > On Fri, May 10, 2013 at 6:20 AM, ?? 
> wrote: > > The configure.log and make.log are attached. > > it shows that 'win32fe cl' does not work. I dont' know what the problem is. > > Can you compile something with it? It looks like it is not installed correctly. > > Matt > > Thanks! > > > > Hao > ________________________________________ > ???: Jed Brown [five9a2 at gmail.com] ?? Jed Brown [jedbrown at mcs.anl.gov] > ????: 2013?5?10? 15:16 > ???: ??; petsc-users > ??: Re: [petsc-users] ??: ??: PETsc problem > > ?? > writes: > > > it shows that 'win32fe cl' does not work. I dont' know what the problem is. > > You couldn't have found a less helpful way to report this. > > Note the bold part: > > http://www.mcs.anl.gov/petsc/documentation/bugreporting.html > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > From knepley at gmail.com Tue May 21 05:44:06 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 21 May 2013 05:44:06 -0500 Subject: [petsc-users] =?gb2312?b?tPC4tDogtPC4tDogtPC4tDogtPC4tDogtPC4tDog?= =?gb2312?b?16q3ojogUEVUc2MgcHJvYmxlbQ==?= In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn> Message-ID: On Tue, May 21, 2013 at 4:52 AM, ?? wrote: > It still has problem. I open VS2010 cmd, then login in bash, then > configure following the instruction, it shows "if you use batch system, you > should ./configure --with-batch. otherwise, the complier has some > problem...." But cl really works. > Please show us an example of your cl working from the command line. This means a) Open a command shell b) Create a simple program, just an empty main() c) Compile it with cl d) Mail us the program and all the output from this to petsc-maint at mcs.anl.gov Thanks, Matt > Thanks! > > > Hao > > ________________________________________ > ???: Satish Balay [balay at mcs.anl.gov] > ????: 2013?5?20? 23:23 > ???: ?? > ??: petsc-users at mcs.anl.gov > ??: Re: [petsc-users] ??: ??: ??: ??: ??: PETsc problem > > On Mon, 20 May 2013, ?? wrote: > > > I can use cl to compile file under Windows. But even if I include path > which cl is in in the enviromental variable of Cygwin, it still shows > "Win32fe cl" does not work. > > Have you checked the windows installation instructions? > > https://www.mcs.anl.gov/petsc/documentation/installation.html#windows > > You should start the 'compiler cmd' - and then run 'bash --login' inside it > [and not 'set cygwin path to include cl'] > > Satish > > > The attached is the configure.log and make.log > > Thanks! > > > > Hao > > ________________________________ > > ???: Matthew Knepley [knepley at gmail.com] > > ????: 2013?5?11? 
4:32 > > ???: ??; petsc-users at mcs.anl.gov > > ??: Re: ??: [petsc-users] ??: ??: ??: PETsc problem > > > > On Fri, May 10, 2013 at 12:11 PM, ?? hao.yu at peraglobal.com>> wrote: > > What do you mean by compiling something with it ? I don't know. you mean > 'win32fe cl' is not installed correctly? > > > > I mean, can you compile a file using cl from the command line? > > > > Matt > > > > > > > > Hao > > ________________________________ > > ???: Matthew Knepley [knepley at gmail.com] > > ????: 2013?5?10? 19:55 > > ???: ?? > > ??: petsc-users at mcs.anl.gov > > ??: Re: [petsc-users] ??: ??: ??: PETsc problem > > > > On Fri, May 10, 2013 at 6:20 AM, ?? hao.yu at peraglobal.com>> wrote: > > > > The configure.log and make.log are attached. > > > > it shows that 'win32fe cl' does not work. I dont' know what the problem > is. > > > > Can you compile something with it? It looks like it is not installed > correctly. > > > > Matt > > > > Thanks! > > > > > > > > Hao > > ________________________________________ > > ???: Jed Brown [five9a2 at gmail.com] ?? Jed > Brown [jedbrown at mcs.anl.gov] > > ????: 2013?5?10? 15:16 > > ???: ??; petsc-users > > ??: Re: [petsc-users] ??: ??: PETsc problem > > > > ?? > writes: > > > > > it shows that 'win32fe cl' does not work. I dont' know what the > problem is. > > > > You couldn't have found a less helpful way to report this. > > > > Note the bold part: > > > > http://www.mcs.anl.gov/petsc/documentation/bugreporting.html > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue May 21 05:47:07 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 21 May 2013 05:47:07 -0500 Subject: [petsc-users] Set the options NOT from argc and argv In-Reply-To: References: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov> Message-ID: On Mon, May 20, 2013 at 11:01 PM, Longxiang Chen wrote: > The fortran program is too long (several files with more than 10,000 > lines). > And I just want to insert the P_solve inside one of the subroutine to > solve Ax=b. > Or it can only use argc and argv? > Barry is correct that you can use PetscOptionsSetValue(), however I think you are still missing the point. Command line options are processed in PetscInitialize(). You do not have to do anything else. You do not have to change your huge Fortran code. You just use the command line. It is much better than hard coding a solver type since you can use ALL solver types. There is no reason not to use it. Thanks, Matt > Thanks. > Longxiang > > Best regards, > Longxiang Chen > > Do something every day that gets you closer to being done. 
> -------------------------------------------------------------- > 465 Winston Chung Hall > Computer Science Engineering > University of California, Riverside > > > > On Mon, May 20, 2013 at 8:52 PM, Barry Smith wrote: > >> >> On May 20, 2013, at 10:38 PM, Longxiang Chen wrote: >> >> > To whom it may concern, >> > >> > I am using KSP to solve Ax=b. >> > The main() is in Fortran, and it calls a function I write in C.The >> parameter is array A, x, b. >> > >> > void P_solve(double x[ ], double b[ ], double A[ ], int size); >> > >> > In the function, I should call PetscInitialize() before I create the >> matrix and vectors for A, x, b, and also call MatSetFromOptions() and >> VecSetFromOptions(). >> > >> > But I don't have the argc and argv from main function. >> >> If Fortran is the main program the options database still has access >> to the command line arguments. You should still be able to use command line >> arguments and not need to set them in the program. >> >> Does this not work? Can you send a sample program where it does not >> work? >> >> Barry >> >> > >> > I just want to fix the KSP type to bcgs and the PC type. >> > Is there another way that I can set the options not through argc and >> argv, just set them in the program. >> > >> > e.g. options[] = {"-ksp_type", "bcgs"}. >> > >> > Thanks, >> > Longxiang >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From hao.yu at peraglobal.com Tue May 21 09:25:08 2013 From: hao.yu at peraglobal.com (=?gb2312?B?0+C6xg==?=) Date: Tue, 21 May 2013 22:25:08 +0800 Subject: [petsc-users] =?gb2312?b?tPC4tDogILTwuLQ6ILTwuLQ6ILTwuLQ6ILTwuLQ6?= =?gb2312?b?ILTwuLQ6INeqt6I6IFBFVHNjIHByb2JsZW0=?= In-Reply-To: References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn>, Message-ID: <6318D45649EFFA44BCB5CE480854635E04E1E567C2EF@peramail.mail.cn> I have a program OK.cpp in the directory d:/Program Files/VC/ #incldue int main() { printf("OK"); return 0; } after cl OK.cpp , Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86 Copyright Microsoft Corposration. All rights reserved. OK.cpp Microsoft Incremental Linker Version 10.00.30319.01 Copyright MIcrosoft Corporation. All rights reserved. /out:OK.exe OK.obj Thanks! Hao ________________________________ ???: Matthew Knepley [knepley at gmail.com] ????: 2013?5?21? 18:44 ???: ?? ??: petsc-users ??: Re: [petsc-users] ??: ??: ??: ??: ??: ??: PETsc problem On Tue, May 21, 2013 at 4:52 AM, ?? > wrote: It still has problem. I open VS2010 cmd, then login in bash, then configure following the instruction, it shows "if you use batch system, you should ./configure --with-batch. otherwise, the complier has some problem...." But cl really works. Please show us an example of your cl working from the command line. 
This means a) Open a command shell b) Create a simple program, just an empty main() c) Compile it with cl d) Mail us the program and all the output from this to petsc-maint at mcs.anl.gov Thanks, Matt Thanks! Hao ________________________________________ ???: Satish Balay [balay at mcs.anl.gov] ????: 2013?5?20? 23:23 ???: ?? ??: petsc-users at mcs.anl.gov ??: Re: [petsc-users] ??: ??: ??: ??: ??: PETsc problem On Mon, 20 May 2013, ?? wrote: > I can use cl to compile file under Windows. But even if I include path which cl is in in the enviromental variable of Cygwin, it still shows "Win32fe cl" does not work. Have you checked the windows installation instructions? https://www.mcs.anl.gov/petsc/documentation/installation.html#windows You should start the 'compiler cmd' - and then run 'bash --login' inside it [and not 'set cygwin path to include cl'] Satish > The attached is the configure.log and make.log > Thanks! > > Hao > ________________________________ > ???: Matthew Knepley [knepley at gmail.com] > ????: 2013?5?11? 4:32 > ???: ??; petsc-users at mcs.anl.gov > ??: Re: ??: [petsc-users] ??: ??: ??: PETsc problem > > On Fri, May 10, 2013 at 12:11 PM, ?? >> wrote: > What do you mean by compiling something with it ? I don't know. you mean 'win32fe cl' is not installed correctly? > > I mean, can you compile a file using cl from the command line? > > Matt > > > > Hao > ________________________________ > ???: Matthew Knepley [knepley at gmail.com>] > ????: 2013?5?10? 19:55 > ???: ?? > ??: petsc-users at mcs.anl.gov> > ??: Re: [petsc-users] ??: ??: ??: PETsc problem > > On Fri, May 10, 2013 at 6:20 AM, ?? >> wrote: > > The configure.log and make.log are attached. > > it shows that 'win32fe cl' does not work. I dont' know what the problem is. > > Can you compile something with it? It looks like it is not installed correctly. > > Matt > > Thanks! > > > > Hao > ________________________________________ > ???: Jed Brown [five9a2 at gmail.com>] ?? Jed Brown [jedbrown at mcs.anl.gov>] > ????: 2013?5?10? 15:16 > ???: ??; petsc-users > ??: Re: [petsc-users] ??: ??: PETsc problem > > ?? >> writes: > > > it shows that 'win32fe cl' does not work. I dont' know what the problem is. > > You couldn't have found a less helpful way to report this. > > Note the bold part: > > http://www.mcs.anl.gov/petsc/documentation/bugreporting.html > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From balay at mcs.anl.gov Tue May 21 09:42:57 2013
From: balay at mcs.anl.gov (Satish Balay)
Date: Tue, 21 May 2013 09:42:57 -0500 (CDT)
Subject: [petsc-users] =?gb2312?b?tPC4tDogILTwuLQ6ILTwuLQ6ILTwuLQ6ILTwuLQ6?= =?gb2312?b?ILTwuLQ6INeqt6I6IFBFVHNjIHByb2JsZW0=?=
In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2EF@peramail.mail.cn>
References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2EF@peramail.mail.cn>
Message-ID:

On Tue, 21 May 2013, ?? wrote:

> I have a program OK.cpp in the directory d:/Program Files/VC/

thats a strange location to do compiles.

>
> #incldue

hm - typo?

> int main()
> {
> printf("OK");
> return 0;
> }
>
> after cl OK.cpp ,
>
> Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86
> Copyright Microsoft Corposration. All rights reserved.
>
> OK.cpp
> Microsoft Incremental Linker Version 10.00.30319.01
> Copyright MIcrosoft Corporation. All rights reserved.
>
> /out:OK.exe
> OK.obj
>
>
> Thanks!

Please copy/paste the complete output from the following sequence of commands.

Satish

---------

C:\cygwin\home\balay\junk>cl OK.c
Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64
Copyright (C) Microsoft Corporation. All rights reserved.

OK.c
Microsoft (R) Incremental Linker Version 8.00.50727.762
Copyright (C) Microsoft Corporation. All rights reserved.

/out:OK.exe
OK.obj

C:\cygwin\home\balay\junk>echo %errorlevel%
0

C:\cygwin\home\balay\junk>c:\cygwin\bin\bash --login

balay at msnehalem2 ~
$ cd junk/

balay at msnehalem2 ~/junk
$ cl OK.c
Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64
Copyright (C) Microsoft Corporation. All rights reserved.

OK.c
Microsoft (R) Incremental Linker Version 8.00.50727.762
Copyright (C) Microsoft Corporation. All rights reserved.

/out:OK.exe
OK.obj

balay at msnehalem2 ~/junk
$ echo $?
0

balay at msnehalem2 ~/junk
$ ~/petsc.clone/bin/win32fe/win32fe cl -c OK.c
OK.c

balay at msnehalem2 ~/junk
$ echo $?
0

From balay at mcs.anl.gov Tue May 21 09:48:48 2013
From: balay at mcs.anl.gov (Satish Balay)
Date: Tue, 21 May 2013 09:48:48 -0500 (CDT)
Subject: Re: [petsc-users] Set the options NOT from argc and argv
In-Reply-To:
References: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov>
Message-ID:

If PetscInitialize() is called from fortran - command line should work.

An easy alternative is to have a file 'petscrc' in the executable run dir - with the command line options listed [one option per line]

Satish

On Tue, 21 May 2013, Matthew Knepley wrote:

> On Mon, May 20, 2013 at 11:01 PM, Longxiang Chen wrote:
>
> > The fortran program is too long (several files with more than 10,000
> > lines).
> > And I just want to insert the P_solve inside one of the subroutine to
> > solve Ax=b.
> > Or it can only use argc and argv?
> >
>
> Barry is correct that you can use PetscOptionsSetValue(), however I think
> you are
> still missing the point.
>
> Command line options are processed in PetscInitialize().
You do not have to > do anything > else. You do not have to change your huge Fortran code. You just use the > command line. > It is much better than hard coding a solver type since you can use ALL > solver types. There > is no reason not to use it. > > Thanks, > > Matt > > > > Thanks. > > Longxiang > > > > Best regards, > > Longxiang Chen > > > > Do something every day that gets you closer to being done. > > -------------------------------------------------------------- > > 465 Winston Chung Hall > > Computer Science Engineering > > University of California, Riverside > > > > > > > > On Mon, May 20, 2013 at 8:52 PM, Barry Smith wrote: > > > >> > > >> On May 20, 2013, at 10:38 PM, Longxiang Chen wrote: > > >> > > >> > To whom it may concern, > > >> > > > >> > I am using KSP to solve Ax=b. > > >> > The main() is in Fortran, and it calls a function I write in C. The > > >> parameter is array A, x, b. > > >> > > > >> > void P_solve(double x[ ], double b[ ], double A[ ], int size); > > >> > > > >> > In the function, I should call PetscInitialize() before I create the > > >> matrix and vectors for A, x, b, and also call MatSetFromOptions() and > > >> VecSetFromOptions(). > > >> > > > >> > But I don't have the argc and argv from main function. > > >> > > >> If Fortran is the main program the options database still has access > > >> to the command line arguments. You should still be able to use command line > > >> arguments and not need to set them in the program. > > >> > > >> Does this not work? Can you send a sample program where it does not > > >> work? > > >> > > >> Barry > > >> > > >> > > > >> > I just want to fix the KSP type to bcgs and the PC type. > > >> > Is there another way that I can set the options not through argc and > > >> argv, just set them in the program. > > >> > > > >> > e.g. options[] = {"-ksp_type", "bcgs"}. > > >> > > > >> > Thanks, > > >> > Longxiang > > >> > > >> > > > > > From heikki.a.virtanen at hotmail.com Tue May 21 11:17:30 2013 From: heikki.a.virtanen at hotmail.com (Heikki Virtanen) Date: Tue, 21 May 2013 19:17:30 +0300 Subject: [petsc-users] How to use PETSc preconditioners in SLEPc? Message-ID: Hi, I try to solve an eigenvalue problem using EPS object and Krylov-Schur solver. Everything goes well and I can solve the problems that I want. Anyway, I want to use other preconditioners than the default preconditioner. It is a little bit unclear for me how to do this. For example, I have installed PETSc with Hypre. How can I use BoomerAMG preconditioner instead of ILU? (SLEPc chooses this by default) -Heikki -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Tue May 21 11:22:42 2013 From: jroman at dsic.upv.es (Jose E. Roman) Date: Tue, 21 May 2013 18:22:42 +0200 Subject: [petsc-users] How to use PETSc preconditioners in SLEPc? In-Reply-To: References: Message-ID: <40F3D6FC-8101-4A9C-8B9A-CA6618122B3E@dsic.upv.es> El 21/05/2013, a las 18:17, Heikki Virtanen escribió: > Hi, I try to solve an eigenvalue problem using EPS object > and Krylov-Schur solver. Everything goes well and I > can solve the problems that I want. > > Anyway, I want to use other preconditioners than the > default preconditioner. It is a little bit unclear for > me how to do this. For example, I have installed > PETSc with Hypre. How can I use BoomerAMG preconditioner > instead of ILU? (SLEPc chooses this by default) > > -Heikki Have a look at section 3.4.1 of SLEPc's users guide. 
For instance, you can try something like this: $ ./program -st_ksp_type bcgs -st_pc_type hypre -st_pc_hypre_type boomeramg Jose From heikki.a.virtanen at hotmail.com Tue May 21 12:18:10 2013 From: heikki.a.virtanen at hotmail.com (Heikki Virtanen) Date: Tue, 21 May 2013 20:18:10 +0300 Subject: [petsc-users] How to use PETSc preconditioners in SLEPc? In-Reply-To: <40F3D6FC-8101-4A9C-8B9A-CA6618122B3E@dsic.upv.es> References: , <40F3D6FC-8101-4A9C-8B9A-CA6618122B3E@dsic.upv.es> Message-ID: Thanks! If there is a way to avoid command line options, it would be excellent. Sorry, that I didn't remember to mention this one. Your example has command line options "st_ksp_type","st_pc_type","st_pc_hypre_type" Do they have an equivalent function calls? -Heikki > > Hi, I try to solve eigenvalue problem using EPS object > > and Krylov-Schur solver. Everything goes well and I > > can solve the problems that I want. > > > > Anyway, I want to use other preconditioners than the > > default preconditioner. It is a little bit unclear for > > me how to do this. For example, I have installed > > PETSc with Hypre. How can use BoomerAMG preconditioner > > instead of ILU? ( SLEPc chooses this by default ) > > > > -Heikki > > Have a look at section 3.4.1 of SLEPc's users guide. For instance, you can try something like this: > > $ ./program -st_ksp_type bcgs -st_pc_type hypre -st_pc_hypre_type boomeramg > > Jose > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue May 21 12:32:14 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 21 May 2013 12:32:14 -0500 Subject: [petsc-users] How to use PETSc preconditioners in SLEPc? In-Reply-To: References: <40F3D6FC-8101-4A9C-8B9A-CA6618122B3E@dsic.upv.es> Message-ID: On Tue, May 21, 2013 at 12:18 PM, Heikki Virtanen < heikki.a.virtanen at hotmail.com> wrote: > Thanks! If there is a way to avoid command line options, > it would be excellent. Sorry, that I didn't remember to > Why? > mention this one. Your example has command line options > "st_ksp_type","st_pc_type","st_pc_hypre_type" > Do they have an equivalent function calls? > Yes, all of them do (see online manual). However, we discourage you from doing this because then you hardcode a single solver. The aim should be to try a range of solvers and find the right one for your problem. Matt > > -Heikki > > > > > Hi, I try to solve eigenvalue problem using EPS object > > > and Krylov-Schur solver. Everything goes well and I > > > can solve the problems that I want. > > > > > > Anyway, I want to use other preconditioners than the > > > default preconditioner. It is a little bit unclear for > > > me how to do this. For example, I have installed > > > PETSc with Hypre. How can use BoomerAMG preconditioner > > > instead of ILU? ( SLEPc chooses this by default ) > > > > > > -Heikki > > > > Have a look at section 3.4.1 of SLEPc's users guide. For instance, you > can try something like this: > > > > $ ./program -st_ksp_type bcgs -st_pc_type hypre -st_pc_hypre_type > boomeramg > > > > Jose > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Tue May 21 13:29:48 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Tue, 21 May 2013 13:29:48 -0500 Subject: [petsc-users] How to use PETSc preconditioners in SLEPc? 
In-Reply-To: References: <40F3D6FC-8101-4A9C-8B9A-CA6618122B3E@dsic.upv.es> Message-ID: <8761ycchk3.fsf@mcs.anl.gov> Heikki Virtanen writes: > Thanks! If there is a way to avoid command line options, > it would be excellent. Sorry, that I didn't remember to > mention this one. Your example has command line options > "st_ksp_type","st_pc_type","st_pc_hypre_type" Note that you can use the PETSC_OPTIONS environment variable, a petscrc file, or any other file. See the user's manual and this page for details. http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscOptionsInsertFile.html > Do they have an equivalent function calls? > > -Heikki > >> > Hi, I try to solve eigenvalue problem using EPS object >> > and Krylov-Schur solver. Everything goes well and I >> > can solve the problems that I want. >> > >> > Anyway, I want to use other preconditioners than the >> > default preconditioner. It is a little bit unclear for >> > me how to do this. For example, I have installed >> > PETSc with Hypre. How can use BoomerAMG preconditioner >> > instead of ILU? ( SLEPc chooses this by default ) >> > >> > -Heikki >> >> Have a look at section 3.4.1 of SLEPc's users guide. For instance, you can try something like this: >> >> $ ./program -st_ksp_type bcgs -st_pc_type hypre -st_pc_hypre_type boomeramg >> >> Jose >> > From suifengls at gmail.com Tue May 21 15:33:13 2013 From: suifengls at gmail.com (Longxiang Chen) Date: Tue, 21 May 2013 13:33:13 -0700 Subject: [petsc-users] Set the options NOT from argc and argv In-Reply-To: References: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov> Message-ID: Here is an example: I only have a function written in C(and it is called in fortran, I cannot change fortran code): PetscInitialize() is called in C function. void P_solve(A, b, x, size) { // definition and declaration PetscInitialize(); // petsc KSP functions } I use PetscOptionsSetValue() to set the options. But need to compile it every time I change options, I would fix it in the end. Could I get the command line options in this situation? Like petscrc in fortran for this C function? Thanks, Best regards, Longxiang Chen Do something every day that gets you closer to being done. -------------------------------------------------------------- 465 Winston Chung Hall Computer Science Engineering University of California, Riverside On Tue, May 21, 2013 at 7:48 AM, Satish Balay wrote: > If PetscInitialize() is called from fortran - command line should work. > > An easy alternative is to have a file 'petscrc' in the executable run > dir - with the command line options listed [one option per line] > > Satish > > On Tue, 21 May 2013, Matthew Knepley wrote: > > > On Mon, May 20, 2013 at 11:01 PM, Longxiang Chen >wrote: > > > > > The fortran program is too long (several files with more than 10,000 > > > lines). > > > And I just want to insert the P_solve inside one of the subroutine to > > > solve Ax=b. > > > Or it can only use argc and argv? > > > > > > > Barry is correct that you can use PetscOptionsSetValue(), however I think > > you are > > still missing the point. > > > > Command line options are processed in PetscInitialize(). You do not have > to > > do anything > > else. You do not have to change your huge Fortran code. You just use the > > command line. > > It is much better than hard coding a solver type since you can use ALL > > solver types. There > > is no reason not to use it. > > > > Thanks, > > > > Matt > > > > > > > Thanks. 
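For reference, the two programmatic routes mentioned above look roughly like this in C (a sketch, using the signatures of this PETSc era; 'petscrc' is just a placeholder file name):

PetscOptionsSetValue("-ksp_type","bcgs");                        /* hard-codes a single option */
PetscOptionsInsertFile(PETSC_COMM_WORLD,"petscrc",PETSC_TRUE);   /* pulls in a whole options file */

Both calls must run after PetscInitialize().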
> > > Longxiang > > > > > > Best regards, > > > Longxiang Chen > > > > > > Do something every day that gets you closer to being done. > > > -------------------------------------------------------------- > > > 465 Winston Chung Hall > > > Computer Science Engineering > > > University of California, Riverside > > > > > > > > > > > > On Mon, May 20, 2013 at 8:52 PM, Barry Smith > wrote: > > > > > >> > > >> On May 20, 2013, at 10:38 PM, Longxiang Chen > wrote: > > >> > > >> > To whom it may concern, > > >> > > > >> > I am using KSP to solve Ax=b. > > >> > The main() is in Fortran, and it calls a function I write in C.The > > >> parameter is array A, x, b. > > >> > > > >> > void P_solve(double x[ ], double b[ ], double A[ ], int size); > > >> > > > >> > In the function, I should call PetscInitialize() before I create the > > >> matrix and vectors for A, x, b, and also call MatSetFromOptions() and > > >> VecSetFromOptions(). > > >> > > > >> > But I don't have the argc and argv from main function. > > >> > > >> If Fortran is the main program the options database still has > access > > >> to the command line arguments. You should still be able to use > command line > > >> arguments and not need to set them in the program. > > >> > > >> Does this not work? Can you send a sample program where it does > not > > >> work? > > >> > > >> Barry > > >> > > >> > > > >> > I just want to fix the KSP type to bcgs and the PC type. > > >> > Is there another way that I can set the options not through argc and > > >> argv, just set them in the program. > > >> > > > >> > e.g. options[] = {"-ksp_type", "bcgs"}. > > >> > > > >> > Thanks, > > >> > Longxiang > > >> > > >> > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue May 21 16:08:53 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 21 May 2013 16:08:53 -0500 (CDT) Subject: [petsc-users] Set the options NOT from argc and argv In-Reply-To: References: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov> Message-ID: You should be able to use 'petscrc' file as mentioned before. [without any code changes] Alternative is to call fortran initialize/finalize from your c code - but this is a bit convoluted.. [but appears to work for me] Satish ---------- balay at asterix /home/balay/download-pine $ cat main.F program main call foo() end balay at asterix /home/balay/download-pine $ cat sub.c #include #include #if defined(PETSC_HAVE_FORTRAN_CAPS) #define petscinitialize_ PETSCINITIALIZE #define petscfinalize_ PETSCFINALIZE #define foo_ FOO #elif !defined(PETSC_HAVE_FORTRAN_UNDERSCORE) #define petscinitialize_ petscinitialize #define petscfinalize_ petscfinalize #define foo_ foo #endif PETSC_EXTERN void PETSC_STDCALL petscinitialize_(CHAR filename PETSC_MIXED_LEN(len),PetscErrorCode *ierr PETSC_END_LEN(len)); PETSC_EXTERN void PETSC_STDCALL petscfinalize_(PetscErrorCode *ierr); void foo_(void) { PetscErrorCode ierr; #if defined(PETSC_HAVE_FORTRAN_MIXED_STR_ARG) petscinitialize_(PETSC_NULL_CHARACTER_Fortran,0,&ierr); #else petscinitialize_(PETSC_NULL_CHARACTER_Fortran,&ierr,0); #endif petscfinalize_(&ierr); return; } balay at asterix /home/balay/download-pine $ On Tue, 21 May 2013, Longxiang Chen wrote: > Here is an example: > > I only have a function written in C(and it is called in fortran, I cannot > change fortran code): > PetscInitialize() is called in C function. 
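A note on the sub.c listing above: the archive's HTML filtering has stripped the header names from its #include lines (the same thing happens to the wrap.F variant a little further down). The C side most plausibly began with something like

#include <petsc.h>

but that is a reconstruction - the original filenames are lost.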
> > void P_solve(A, b, x, size) > { > // definition and declaration > > PetscInitialize(); > > // petsc KSP functions > > } > > I use PetscOptionsSetValue() to set the options. But need to compile it > every time I change options, I would fix it in the end. > Could I get the command line options in this situation? Like petscrc in > fortran for this C function? > > Thanks, > > Best regards, > Longxiang Chen > > Do something every day that gets you closer to being done. > -------------------------------------------------------------- > 465 Winston Chung Hall > Computer Science Engineering > University of California, Riverside > > > > On Tue, May 21, 2013 at 7:48 AM, Satish Balay wrote: > > > If PetscInitialize() is called from fortran - command line should work. > > > > An easy alternative is to have a file 'petscrc' in the executable run > > dir - with the command line options listed [one option per line] > > > > Satish > > > > On Tue, 21 May 2013, Matthew Knepley wrote: > > > > > On Mon, May 20, 2013 at 11:01 PM, Longxiang Chen > >wrote: > > > > > > > The fortran program is too long (several files with more than 10,000 > > > > lines). > > > > And I just want to insert the P_solve inside one of the subroutine to > > > > solve Ax=b. > > > > Or it can only use argc and argv? > > > > > > > > > > Barry is correct that you can use PetscOptionsSetValue(), however I think > > > you are > > > still missing the point. > > > > > > Command line options are processed in PetscInitialize(). You do not have > > to > > > do anything > > > else. You do not have to change your huge Fortran code. You just use the > > > command line. > > > It is much better than hard coding a solver type since you can use ALL > > > solver types. There > > > is no reason not to use it. > > > > > > Thanks, > > > > > > Matt > > > > > > > > > > Thanks. > > > > Longxiang > > > > > > > > Best regards, > > > > Longxiang Chen > > > > > > > > Do something every day that gets you closer to being done. > > > > -------------------------------------------------------------- > > > > 465 Winston Chung Hall > > > > Computer Science Engineering > > > > University of California, Riverside > > > > > > > > > > > > > > > > On Mon, May 20, 2013 at 8:52 PM, Barry Smith > > wrote: > > > > > > > >> > > > >> On May 20, 2013, at 10:38 PM, Longxiang Chen > > wrote: > > > >> > > > >> > To whom it may concern, > > > >> > > > > >> > I am using KSP to solve Ax=b. > > > >> > The main() is in Fortran, and it calls a function I write in C.The > > > >> parameter is array A, x, b. > > > >> > > > > >> > void P_solve(double x[ ], double b[ ], double A[ ], int size); > > > >> > > > > >> > In the function, I should call PetscInitialize() before I create the > > > >> matrix and vectors for A, x, b, and also call MatSetFromOptions() and > > > >> VecSetFromOptions(). > > > >> > > > > >> > But I don't have the argc and argv from main function. > > > >> > > > >> If Fortran is the main program the options database still has > > access > > > >> to the command line arguments. You should still be able to use > > command line > > > >> arguments and not need to set them in the program. > > > >> > > > >> Does this not work? Can you send a sample program where it does > > not > > > >> work? > > > >> > > > >> Barry > > > >> > > > >> > > > > >> > I just want to fix the KSP type to bcgs and the PC type. > > > >> > Is there another way that I can set the options not through argc and > > > >> argv, just set them in the program. > > > >> > > > > >> > e.g. 
options[] = {"-ksp_type", "bcgs"}. > > > >> > > > > >> > Thanks, > > > >> > Longxiang > > > >> > > > >> > > > > > > > > > > > > > > > > > > From balay at mcs.anl.gov Tue May 21 16:17:55 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 21 May 2013 16:17:55 -0500 (CDT) Subject: [petsc-users] Set the options NOT from argc and argv In-Reply-To: References: <2CF207C8-C669-48E3-BB35-8F7C5F043BE0@mcs.anl.gov> Message-ID: On Tue, 21 May 2013, Satish Balay wrote: > You should be able to use 'petscrc' file as mentioned before. [without > any code changes] > > Alternative is to call fortran initialize/finalize from your c code - > but this is a bit convoluted.. [but appears to work for me] Or you can do the following... by having a fortran intermediate file.. Satish --------- balay at asterix /home/balay/download-pine $ cat main.F program main call wrap() end balay at asterix /home/balay/download-pine $ cat wrap.F subroutine wrap() implicit none #include PetscErrorCode ierr call PetscInitialize(PETSC_NULL_CHARACTER,ierr) call foo() call PetscFinalize(ierr) end balay at asterix /home/balay/download-pine $ cat sub.c #include #if defined(PETSC_HAVE_FORTRAN_CAPS) #define foo_ FOO #elif !defined(PETSC_HAVE_FORTRAN_UNDERSCORE) #define foo_ foo #endif void foo_(void) { /* petsc code */ return; } From heikki.a.virtanen at hotmail.com Tue May 21 16:27:12 2013 From: heikki.a.virtanen at hotmail.com (Heikki Virtanen) Date: Wed, 22 May 2013 00:27:12 +0300 Subject: [petsc-users] How to use PETSc preconditioners in SLEPc? In-Reply-To: <8761ycchk3.fsf@mcs.anl.gov> References: <40F3D6FC-8101-4A9C-8B9A-CA6618122B3E@dsic.upv.es> , <8761ycchk3.fsf@mcs.anl.gov> Message-ID: Hi, Ok. Thanks for all of you! I decided to stay in command line options. I just had to change a couple of things in my code. and now everything works. -Heikki > > Thanks! If there is a way to avoid command line options, > > it would be excellent. Sorry, that I didn't remember to > > mention this one. Your example has command line options > > "st_ksp_type","st_pc_type","st_pc_hypre_type" > > Note that you can use the PETSC_OPTIONS environment variable, a petscrc > file, or any other file. See the user's manual and this page for > details. > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscOptionsInsertFile.html > > > Do they have an equivalent function calls? > > > > -Heikki > > > >> > Hi, I try to solve eigenvalue problem using EPS object > >> > and Krylov-Schur solver. Everything goes well and I > >> > can solve the problems that I want. > >> > > >> > Anyway, I want to use other preconditioners than the > >> > default preconditioner. It is a little bit unclear for > >> > me how to do this. For example, I have installed > >> > PETSc with Hypre. How can use BoomerAMG preconditioner > >> > instead of ILU? ( SLEPc chooses this by default ) > >> > > >> > -Heikki > >> > >> Have a look at section 3.4.1 of SLEPc's users guide. For instance, you can try something like this: > >> > >> $ ./program -st_ksp_type bcgs -st_pc_type hypre -st_pc_hypre_type boomeramg > >> > >> Jose > >> > > -------------- next part -------------- An HTML attachment was scrubbed... 
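For anyone who does want the function-call equivalents Heikki asked about, a rough sketch in C (error checking omitted; assumes an existing EPS object eps and a PETSc build with hypre):

ST  st;
KSP ksp;
PC  pc;
EPSGetST(eps,&st);              /* the spectral transformation object */
STGetKSP(st,&ksp);              /* its inner linear solver */
KSPSetType(ksp,KSPBCGS);
KSPGetPC(ksp,&pc);
PCSetType(pc,PCHYPRE);
PCHYPRESetType(pc,"boomeramg");

This mirrors -st_ksp_type bcgs -st_pc_type hypre -st_pc_hypre_type boomeramg, with the caveats about hard-coding a solver noted above.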
URL: From hao.yu at peraglobal.com Wed May 22 03:07:39 2013 From: hao.yu at peraglobal.com (=?gb2312?B?0+C6xg==?=) Date: Wed, 22 May 2013 16:07:39 +0800 Subject: [petsc-users] =?gb2312?b?tPC4tDogILTwuLQ6ICC08Li0OiC08Li0OiC08Li0?= =?gb2312?b?OiC08Li0OiC08Li0OiDXqreiOiBQRVRzYyBwcm9ibGVt?= In-Reply-To: References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <878v3opa43.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B3@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2EF@peramail.mail.cn>, Message-ID: <6318D45649EFFA44BCB5CE480854635E04E1E567C2F6@peramail.mail.cn> Because the difference in directory, firstly, I enter in d:/Program Files/VC (in VS cmd) cl OK.cpp it shows: > Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86 > Copyright Microsoft Corposration. All rights reserved. > > OK.cpp > Microsoft Incremental Linker Version 10.00.30319.01 > Copyright MIcrosoft Corporation. All rights reserved. > > /out:OK.exe > OK.obj then echo %errorlevel% 0 then, I login into cygwin in VS cmd, enter cd /cygdrive/d/Program\ Files/VC cl OK.cpp with exactly the same output as you wrote. echo $? 0 then login into the directory that petsc is in: $ /home/petsc-3.3-p6/bin/win32fe/win32fe cl -c OK.cpp OK.cpp $echo $? 0 Thanks! Hao Hao ________________________________________ ???: Satish Balay [balay at mcs.anl.gov] ????: 2013?5?21? 22:42 ???: ?? ??: petsc-users at mcs.anl.gov ??: Re: [petsc-users] ??: ??: ??: ??: ??: ??: ??: PETsc problem On Tue, 21 May 2013, ?? wrote: > I have a program OK.cpp in the directory d:/Program Files/VC/ thats a strange location to do compiles. > > #incldue hm - typo? > int main() > { > printf("OK"); > return 0; > } > > after cl OK.cpp , > > Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86 > Copyright Microsoft Corposration. All rights reserved. > > OK.cpp > Microsoft Incremental Linker Version 10.00.30319.01 > Copyright MIcrosoft Corporation. All rights reserved. > > /out:OK.exe > OK.obj > > > Thanks! Please copy/paste the complete output from the following sequence of commands. Satish --------- C:\cygwin\home\balay\junk>cl OK.c Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64 Copyright (C) Microsoft Corporation. All rights reserved. OK.c Microsoft (R) Incremental Linker Version 8.00.50727.762 Copyright (C) Microsoft Corporation. All rights reserved. /out:OK.exe OK.obj C:\cygwin\home\balay\junk>echo %errorlevel% 0 C:\cygwin\home\balay\junk>c:\cygwin\bin\bash --login balay at msnehalem2 ~ $ cd junk/ balay at msnehalem2 ~/junk $ cl OK.c Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64 Copyright (C) Microsoft Corporation. All rights reserved. OK.c Microsoft (R) Incremental Linker Version 8.00.50727.762 Copyright (C) Microsoft Corporation. All rights reserved. /out:OK.exe OK.obj balay at msnehalem2 ~/junk $ echo $? 0 balay at msnehalem2 ~/junk $ ~/petsc.clone/bin/win32fe/win32fe cl -c OK.c OK.c balay at msnehalem2 ~/junk $ echo $? 
0 From balay at mcs.anl.gov Wed May 22 09:16:49 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 22 May 2013 09:16:49 -0500 (CDT) Subject: [petsc-users] =?gb2312?b?tPC4tDogILTwuLQ6ICC08Li0OiC08Li0OiC08Li0?= =?gb2312?b?OiC08Li0OiC08Li0OiDXqreiOiBQRVRzYyBwcm9ibGVt?= In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2F6@peramail.mail.cn> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2EF@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2F6@peramail.mail.cn> Message-ID: On Wed, 22 May 2013, ?? wrote: > Because the difference in directory, firstly, I enter in d:/Program Files/VC (in VS cmd) > cl OK.cpp > it shows: > > Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86 > > Copyright Microsoft Corposration. All rights reserved. > > > > OK.cpp > > Microsoft Incremental Linker Version 10.00.30319.01 > > Copyright MIcrosoft Corporation. All rights reserved. > > > > /out:OK.exe > > OK.obj > > then > echo %errorlevel% > 0 > > then, I login into cygwin in VS cmd, enter > cd /cygdrive/d/Program\ Files/VC > cl OK.cpp > with exactly the same output as you wrote. > > echo $? > 0 > > then login into the directory that petsc is in: I would have prefered a copy/paste of everything on the terminal - instead of this type of transcribing. :( > $ /home/petsc-3.3-p6/bin/win32fe/win32fe cl -c OK.cpp > OK.cpp Are you sure you didn't type some of ther command inbetween these two commands? Again a proper copy/paste would have helped. > > > $echo $? > 0 ok [if there was no other command inbetween the above two commands] its indicating that the compiler is ok here. Can you now try configure form this same terminal? [but first recheck the above command's return code] Satish > > Thanks! > > Hao > > > > Hao > > ________________________________________ > ???: Satish Balay [balay at mcs.anl.gov] > ????: 2013?5?21? 22:42 > ???: ?? > ??: petsc-users at mcs.anl.gov > ??: Re: [petsc-users] ??: ??: ??: ??: ??: ??: ??: PETsc problem > > On Tue, 21 May 2013, ?? wrote: > > > I have a program OK.cpp in the directory d:/Program Files/VC/ > > thats a strange location to do compiles. > > > > #incldue > > hm - typo? > > > int main() > > { > > printf("OK"); > > return 0; > > } > > > > after cl OK.cpp , > > > > Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86 > > Copyright Microsoft Corposration. All rights reserved. > > > > OK.cpp > > Microsoft Incremental Linker Version 10.00.30319.01 > > Copyright MIcrosoft Corporation. All rights reserved. > > > > /out:OK.exe > > OK.obj > > > > > > Thanks! > > Please copy/paste the complete output from the following sequence of > commands. > > Satish > > --------- > > C:\cygwin\home\balay\junk>cl OK.c > Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64 > Copyright (C) Microsoft Corporation. All rights reserved. > > OK.c > Microsoft (R) Incremental Linker Version 8.00.50727.762 > Copyright (C) Microsoft Corporation. All rights reserved. 
> > /out:OK.exe > OK.obj > > C:\cygwin\home\balay\junk>echo %errorlevel% > 0 > > C:\cygwin\home\balay\junk>c:\cygwin\bin\bash --login > > balay at msnehalem2 ~ > $ cd junk/ > > balay at msnehalem2 ~/junk > $ cl OK.c > Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64 > Copyright (C) Microsoft Corporation. All rights reserved. > > OK.c > Microsoft (R) Incremental Linker Version 8.00.50727.762 > Copyright (C) Microsoft Corporation. All rights reserved. > > /out:OK.exe > OK.obj > > balay at msnehalem2 ~/junk > $ echo $? > 0 > > balay at msnehalem2 ~/junk > $ ~/petsc.clone/bin/win32fe/win32fe cl -c OK.c > OK.c > > balay at msnehalem2 ~/junk > $ echo $? > 0 > From heikki.a.virtanen at hotmail.com Wed May 22 15:15:43 2013 From: heikki.a.virtanen at hotmail.com (Heikki Virtanen) Date: Wed, 22 May 2013 23:15:43 +0300 Subject: [petsc-users] Generalized eigenvalue problem with spectral transformations and mpiaij compatible preconditioners Message-ID: Hi, I have been trying to solve a generalized eigenvalue problem using SLEPc's EPS object. I have tried to parallelize my solver, so I should use PETSc's mpiaij matrices instead of seqaij matrices. Unfortunately, PETSc's LU preconditioner does not support MPI matrices. (this is said in the manual and if I use it with mpiaij matrices I get an error message) I get the eigenvalue problem solved with LU preconditioner and seqaij matrices but, I also have to use a spectral transformation ( shift-invert ) to improve convergence. But, as I mentioned before I cannot use LU preconditioner with mpiaij matrices. Thus, I have changed preconditioner to Hypre's BoomerAMG, for example. ( which supports mpiaij matrices ) When I use this combination ( BoomerAMG/shift-invert transformation/ Krylov-Schur solver) I get an early convergence failure after a couple of iterations. I have also tried other solvers and preconditioners, (Hypre's pilut, euclid, bjacobi and Jacobi-Davidson solver) but the result is the same. Without any preconditioner I also get the early convergence failure and without the spectral transformation convergence is too slow. Any comments or suggestions? -Heikki -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Wed May 22 15:22:30 2013 From: jroman at dsic.upv.es (Jose E. Roman) Date: Wed, 22 May 2013 22:22:30 +0200 Subject: [petsc-users] Generalized eigenvalue problem with spectral transformations and mpiaij compatible preconditioners In-Reply-To: References: Message-ID: El 22/05/2013, a las 22:15, Heikki Virtanen escribi?: > Hi, I have been trying to solve a generalized eigenvalue problem using > SLEPc's EPS object. I have tried to parallelize my solver, so I should > use PETSc's mpiaij matrices instead of seqaij matrices. > Unfortunately, PETSc's LU preconditioner does not support MPI matrices. > (this is said in the manual and if I use it with mpiaij matrices I > get an error message) I get the eigenvalue problem solved with > LU preconditioner and seqaij matrices but, I also have to use a > spectral transformation ( shift-invert ) to improve convergence. > > But, as I mentioned before I cannot use LU preconditioner with mpiaij > matrices. Thus, I have changed preconditioner to Hypre's BoomerAMG, > for example. ( which supports mpiaij matrices ) When I use this > combination ( BoomerAMG/shift-invert transformation/ > Krylov-Schur solver) I get an early convergence failure after a couple > of iterations. 
I have also tried other solvers and preconditioners, > (Hypre's pilut, euclid, bjacobi and Jacobi-Davidson solver) but the result is > the same. Without any preconditioner I also get the early convergence > failure and without the spectral transformation convergence is > too slow. Any comments or suggestions? > > -Heikki Try a parallel LU such as MUMPS. This is mentioned in the manual. Jose From hzhang at mcs.anl.gov Wed May 22 15:22:58 2013 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Wed, 22 May 2013 15:22:58 -0500 Subject: [petsc-users] Generalized eigenvalue problem with spectral transformations and mpiaij compatible preconditioners In-Reply-To: References: Message-ID: Petsc does not support parallel LU. You can use Superlu_dist or mumps for parallel LU via petsc interface. Hong On Wed, May 22, 2013 at 3:15 PM, Heikki Virtanen < heikki.a.virtanen at hotmail.com> wrote: > Hi, I have been trying to solve a generalized eigenvalue problem using > SLEPc's EPS object. I have tried to parallelize my solver, so I should > use PETSc's mpiaij matrices instead of seqaij matrices. > Unfortunately, PETSc's LU preconditioner does not support MPI matrices. > (this is said in the manual and if I use it with mpiaij matrices I > get an error message) I get the eigenvalue problem solved with > LU preconditioner and seqaij matrices but, I also have to use a > spectral transformation ( shift-invert ) to improve convergence. > > But, as I mentioned before I cannot use LU preconditioner with mpiaij > matrices. Thus, I have changed preconditioner to Hypre's BoomerAMG, > for example. ( which supports mpiaij matrices ) When I use this > combination ( BoomerAMG/shift-invert transformation/ > Krylov-Schur solver) I get an early convergence failure after a couple > of iterations. I have also tried other solvers and preconditioners, > (Hypre's pilut, euclid, bjacobi and Jacobi-Davidson solver) but the result > is > the same. Without any preconditioner I also get the early convergence > failure and without the spectral transformation convergence is > too slow. Any comments or suggestions? > > -Heikki > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ling.zou at inl.gov Wed May 22 16:03:18 2013 From: ling.zou at inl.gov (Zou (Non-US), Ling) Date: Wed, 22 May 2013 15:03:18 -0600 Subject: [petsc-users] SNESSetJacobian and TSSetRHSJacobian Message-ID: Hi All, In 'SNESSetJacobian' I could pass 'SNESDefaultComputeJacobian' to let PETSc do the finite differencing for me to calculate Jacobian, i.e., SNESSetJacobian(snes, J, J, SNESDefaultComputeJacobian, PETSC_NULL); In TSSetRHSJacobian, this options seems to be invalid, i.e., it requires a function to evaluate the Jacobian, i.e., TSSetRHSJacobian(ts, J, J, FormRHSJacobian, PETSC_NULL); Is it possible that TS could also use PETSc finite differencing Jacobian as SNES does? Best, Ling -------------- next part -------------- An HTML attachment was scrubbed... 
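To make the MUMPS suggestion from Jose and Hong concrete, the usual incantation for shift-and-invert with a parallel LU is along these lines (a sketch; it assumes PETSc was configured with --download-mumps or an external MUMPS):

$ ./program -st_type sinvert -st_ksp_type preonly -st_pc_type lu -st_pc_factor_mat_solver_package mumps

The same -pc_factor_mat_solver_package option selects superlu_dist instead.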
URL: From prbrune at gmail.com Wed May 22 16:17:56 2013 From: prbrune at gmail.com (Peter Brune) Date: Wed, 22 May 2013 16:17:56 -0500 Subject: [petsc-users] SNESSetJacobian and TSSetRHSJacobian In-Reply-To: References: Message-ID: This may be done with the command-line option: -snes_fd - Peter On Wed, May 22, 2013 at 4:03 PM, Zou (Non-US), Ling wrote: > Hi All, > > In 'SNESSetJacobian' I could pass 'SNESDefaultComputeJacobian' to let > PETSc do the finite differencing for me to calculate Jacobian, i.e., > > SNESSetJacobian(snes, J, J, SNESDefaultComputeJacobian, PETSC_NULL); > > > In TSSetRHSJacobian, this options seems to be invalid, i.e., it requires a > function to evaluate the Jacobian, i.e., > > > TSSetRHSJacobian(ts, J, J, FormRHSJacobian, PETSC_NULL); > > > Is it possible that TS could also use PETSc finite differencing Jacobian > as SNES does? > > > Best, > > > Ling > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ling.zou at inl.gov Wed May 22 16:30:21 2013 From: ling.zou at inl.gov (Zou (Non-US), Ling) Date: Wed, 22 May 2013 15:30:21 -0600 Subject: [petsc-users] SNESSetJacobian and TSSetRHSJacobian In-Reply-To: References: Message-ID: Thank you Peter. Can I simply ignore this step: TSSetRHSJacobian(ts, J, J, SNESDefaultComputeJacobian, PETSC_NULL); And get the SNES context from TS, then set the Jacobian option this way: SNES snes; TSGetSNES(ts, &snes); SNESSetJacobian(snes, J, J, SNESDefaultComputeJacobian, PETSC_NULL); (Guess my last email didn't work well) Ling On Wed, May 22, 2013 at 3:17 PM, Peter Brune wrote: > This may be done with the command-line option: > > -snes_fd > > - Peter > > > On Wed, May 22, 2013 at 4:03 PM, Zou (Non-US), Ling wrote: > >> Hi All, >> >> In 'SNESSetJacobian' I could pass 'SNESDefaultComputeJacobian' to let >> PETSc do the finite differencing for me to calculate Jacobian, i.e., >> >> SNESSetJacobian(snes, J, J, SNESDefaultComputeJacobian, PETSC_NULL); >> >> >> In TSSetRHSJacobian, this options seems to be invalid, i.e., it requires >> a function to evaluate the Jacobian, i.e., >> >> >> TSSetRHSJacobian(ts, J, J, FormRHSJacobian, PETSC_NULL); >> >> >> Is it possible that TS could also use PETSc finite differencing Jacobian >> as SNES does? >> >> >> Best, >> >> >> Ling >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From prbrune at gmail.com Wed May 22 16:45:18 2013 From: prbrune at gmail.com (Peter Brune) Date: Wed, 22 May 2013 16:45:18 -0500 Subject: [petsc-users] SNESSetJacobian and TSSetRHSJacobian In-Reply-To: References: Message-ID: On Wed, May 22, 2013 at 4:30 PM, Zou (Non-US), Ling wrote: > Thank you Peter. > > Can I simply ignore this step: > > TSSetRHSJacobian(ts, J, J, SNESDefaultComputeJacobian, PETSC_NULL); > > > And get the SNES context from TS, then set the Jacobian option this way: > > SNES snes; > > TSGetSNES(ts, &snes); > > SNESSetJacobian(snes, J, J, SNESDefaultComputeJacobian, PETSC_NULL); > > > This should work as well. Note that SNESDefaultComputeJacobian is quite expensive and should be used for testing only, which is why I would hesitate to hard-code its use. 
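Putting the pieces of this thread together, the programmatic route is roughly (names as of this PETSc release; ts and a preallocated matrix J are assumed to exist):

SNES snes;
TSGetSNES(ts,&snes);   /* the SNES that TS drives internally */
SNESSetJacobian(snes,J,J,SNESDefaultComputeJacobian,PETSC_NULL);

with the command-line option -snes_fd remaining the more flexible alternative, since it leaves the analytic-Jacobian path open.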
- Peter (Guess my last email didn't work well) > > > Ling > > > > On Wed, May 22, 2013 at 3:17 PM, Peter Brune wrote: > >> This may be done with the command-line option: >> >> -snes_fd >> >> - Peter >> >> >> On Wed, May 22, 2013 at 4:03 PM, Zou (Non-US), Ling wrote: >> >>> Hi All, >>> >>> In 'SNESSetJacobian' I could pass 'SNESDefaultComputeJacobian' to let >>> PETSc do the finite differencing for me to calculate Jacobian, i.e., >>> >>> SNESSetJacobian(snes, J, J, SNESDefaultComputeJacobian, PETSC_NULL); >>> >>> >>> In TSSetRHSJacobian, this options seems to be invalid, i.e., it requires >>> a function to evaluate the Jacobian, i.e., >>> >>> >>> TSSetRHSJacobian(ts, J, J, FormRHSJacobian, PETSC_NULL); >>> >>> >>> Is it possible that TS could also use PETSc finite differencing Jacobian >>> as SNES does? >>> >>> >>> Best, >>> >>> >>> Ling >>> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From hao.yu at peraglobal.com Wed May 22 18:15:33 2013 From: hao.yu at peraglobal.com (=?gb2312?B?0+C6xg==?=) Date: Thu, 23 May 2013 07:15:33 +0800 Subject: [petsc-users] =?gb2312?b?tPC4tDogILTwuLQ6ICC08Li0OiAgtPC4tDogtPA=?= =?gb2312?b?uLQ6ILTwuLQ6ILTwuLQ6ILTwuLQ6INeqt6I6IFBFVHNjIHByb2JsZW0=?= In-Reply-To: References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B4@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2EF@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2F6@peramail.mail.cn>, Message-ID: <6318D45649EFFA44BCB5CE480854635E04E1E567C2F8@peramail.mail.cn> How can I just copy/paste what you wrote? I run configure, and it shows: cannot download--install MPICH under Windows. Suggest to install MPICH manually Thanks! Hao ________________________________________ ???: Satish Balay [balay at mcs.anl.gov] ????: 2013?5?22? 22:16 ???: ?? ??: petsc-users ??: Re: [petsc-users] ??: ??: ??: ??: ??: ??: ??: ??: PETsc problem On Wed, 22 May 2013, ?? wrote: > Because the difference in directory, firstly, I enter in d:/Program Files/VC (in VS cmd) > cl OK.cpp > it shows: > > Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86 > > Copyright Microsoft Corposration. All rights reserved. > > > > OK.cpp > > Microsoft Incremental Linker Version 10.00.30319.01 > > Copyright MIcrosoft Corporation. All rights reserved. > > > > /out:OK.exe > > OK.obj > > then > echo %errorlevel% > 0 > > then, I login into cygwin in VS cmd, enter > cd /cygdrive/d/Program\ Files/VC > cl OK.cpp > with exactly the same output as you wrote. > > echo $? > 0 > > then login into the directory that petsc is in: I would have prefered a copy/paste of everything on the terminal - instead of this type of transcribing. :( > $ /home/petsc-3.3-p6/bin/win32fe/win32fe cl -c OK.cpp > OK.cpp Are you sure you didn't type some of ther command inbetween these two commands? Again a proper copy/paste would have helped. > > > $echo $? > 0 ok [if there was no other command inbetween the above two commands] its indicating that the compiler is ok here. Can you now try configure form this same terminal? [but first recheck the above command's return code] Satish > > Thanks! 
> > Hao > > > > Hao > > ________________________________________ > ???: Satish Balay [balay at mcs.anl.gov] > ????: 2013?5?21? 22:42 > ???: ?? > ??: petsc-users at mcs.anl.gov > ??: Re: [petsc-users] ??: ??: ??: ??: ??: ??: ??: PETsc problem > > On Tue, 21 May 2013, ?? wrote: > > > I have a program OK.cpp in the directory d:/Program Files/VC/ > > thats a strange location to do compiles. > > > > #incldue > > hm - typo? > > > int main() > > { > > printf("OK"); > > return 0; > > } > > > > after cl OK.cpp , > > > > Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86 > > Copyright Microsoft Corposration. All rights reserved. > > > > OK.cpp > > Microsoft Incremental Linker Version 10.00.30319.01 > > Copyright MIcrosoft Corporation. All rights reserved. > > > > /out:OK.exe > > OK.obj > > > > > > Thanks! > > Please copy/paste the complete output from the following sequence of > commands. > > Satish > > --------- > > C:\cygwin\home\balay\junk>cl OK.c > Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64 > Copyright (C) Microsoft Corporation. All rights reserved. > > OK.c > Microsoft (R) Incremental Linker Version 8.00.50727.762 > Copyright (C) Microsoft Corporation. All rights reserved. > > /out:OK.exe > OK.obj > > C:\cygwin\home\balay\junk>echo %errorlevel% > 0 > > C:\cygwin\home\balay\junk>c:\cygwin\bin\bash --login > > balay at msnehalem2 ~ > $ cd junk/ > > balay at msnehalem2 ~/junk > $ cl OK.c > Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64 > Copyright (C) Microsoft Corporation. All rights reserved. > > OK.c > Microsoft (R) Incremental Linker Version 8.00.50727.762 > Copyright (C) Microsoft Corporation. All rights reserved. > > /out:OK.exe > OK.obj > > balay at msnehalem2 ~/junk > $ echo $? > 0 > > balay at msnehalem2 ~/junk > $ ~/petsc.clone/bin/win32fe/win32fe cl -c OK.c > OK.c > > balay at msnehalem2 ~/junk > $ echo $? > 0 > From cjm2176 at columbia.edu Wed May 22 18:31:14 2013 From: cjm2176 at columbia.edu (Colin McAuliffe) Date: Wed, 22 May 2013 19:31:14 -0400 Subject: [petsc-users] Memory logging in fortran Message-ID: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> Hi all, When calling PetscMemoryGetCurrentUsage or PetscMemoryGetMaximumUsage in a fortran code the memory usage returned is always zero. Adding command line options such as -malloc_log and -memory_info doesn't change this result. Is there something else I'm missing? Also, will use of these two functions give memory used by external packaged called by petsc? All the best, Colin -- Colin McAuliffe PhD Candidate Columbia University Department of Civil Engineering and Engineering Mechanics From bsmith at mcs.anl.gov Wed May 22 18:39:46 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 22 May 2013 18:39:46 -0500 Subject: [petsc-users] Memory logging in fortran In-Reply-To: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> References: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> Message-ID: <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> Colin, PetscMemoryGetCurrentUsage() and MaximumUsage() us calls to the underlying operating system to get how much memory the process is using, as such they will also include external package memory. Unfortunately it depends on the operating system providing this information and often it does not, this is why you get 0. If you are a hacker you could look at the source for PetscMemoryGetCurrentUsage() and see how to get the information for your OS and add it to this routine. 
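Independent of PETSc, a standalone probe of what the operating system itself reports can help narrow this down (a sketch; note that the units of ru_maxrss differ by platform - bytes on OS X, kilobytes on Linux):

#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
  struct rusage usage;
  getrusage(RUSAGE_SELF, &usage);   /* the same call PETSc's mem.c uses */
  printf("max rss: %ld\n", (long)usage.ru_maxrss);
  return 0;
}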
Barry On May 22, 2013, at 6:31 PM, Colin McAuliffe wrote: > Hi all, > > When calling PetscMemoryGetCurrentUsage or PetscMemoryGetMaximumUsage in a fortran code the memory usage returned is always zero. Adding command line options such as -malloc_log and -memory_info doesn't change this result. Is there something else I'm missing? Also, will use of these two functions give memory used by external packaged called by petsc? > > All the best, > Colin > > -- > Colin McAuliffe > PhD Candidate > Columbia University > Department of Civil Engineering and Engineering Mechanics From cjm2176 at columbia.edu Wed May 22 19:06:37 2013 From: cjm2176 at columbia.edu (Colin McAuliffe) Date: Wed, 22 May 2013 20:06:37 -0400 Subject: [petsc-users] Memory logging in fortran In-Reply-To: <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> References: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> Message-ID: <20130522200637.qhvyqsblco8gswcs@cubmail.cc.columbia.edu> Hi Barry, thanks for the quick response. I am using os x and so I would expect that petsc would be able to use getrusage in the following lines of mem.c: 115: #elif defined(PETSC_HAVE_GETRUSAGE) 116: getrusage(RUSAGE_SELF,&temp); Is this not the case? Here is the link to apple's getrusage manual page http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/10.7/man2/getrusage.2.html Quoting Barry Smith : > > Colin, > > PetscMemoryGetCurrentUsage() and MaximumUsage() us calls to the > underlying operating system to get how much memory the process is > using, as such they will also include external package memory. > Unfortunately it depends on the operating system providing this > information and often it does not, this is why you get 0. If you are > a hacker you could look at the source for > PetscMemoryGetCurrentUsage() and see how to get the information for > your OS and add it to this routine. > > Barry > > > > On May 22, 2013, at 6:31 PM, Colin McAuliffe wrote: > >> Hi all, >> >> When calling PetscMemoryGetCurrentUsage or >> PetscMemoryGetMaximumUsage in a fortran code the memory usage >> returned is always zero. Adding command line options such as >> -malloc_log and -memory_info doesn't change this result. Is there >> something else I'm missing? Also, will use of these two functions >> give memory used by external packaged called by petsc? >> >> All the best, >> Colin >> >> -- >> Colin McAuliffe >> PhD Candidate >> Columbia University >> Department of Civil Engineering and Engineering Mechanics > > > -- Colin McAuliffe PhD Candidate Columbia University Department of Civil Engineering and Engineering Mechanics From bsmith at mcs.anl.gov Wed May 22 19:27:40 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 22 May 2013 19:27:40 -0500 Subject: [petsc-users] Memory logging in fortran In-Reply-To: <20130522200637.qhvyqsblco8gswcs@cubmail.cc.columbia.edu> References: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> <20130522200637.qhvyqsblco8gswcs@cubmail.cc.columbia.edu> Message-ID: <79102B9C-7A4D-49FC-97E7-1C7CA4BBFD44@mcs.anl.gov> On May 22, 2013, at 7:06 PM, Colin McAuliffe wrote: > Hi Barry, thanks for the quick response. > > I am using os x and so I would expect that petsc would be able to use getrusage in the following lines of mem.c: > > 115: #elif defined(PETSC_HAVE_GETRUSAGE) > 116: getrusage(RUSAGE_SELF,&temp); > > Is this not the case? 
You can run in the debugger, put a break point at that line to verify it goes there and if it does check the values of temp after the call. (Sometimes Apple's docs are more optimistic than reality.) Barry > > Here is the link to apple's getrusage manual page > http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/10.7/man2/getrusage.2.html > > Quoting Barry Smith : > >> >> Colin, >> >> PetscMemoryGetCurrentUsage() and MaximumUsage() us calls to the underlying operating system to get how much memory the process is using, as such they will also include external package memory. Unfortunately it depends on the operating system providing this information and often it does not, this is why you get 0. If you are a hacker you could look at the source for PetscMemoryGetCurrentUsage() and see how to get the information for your OS and add it to this routine. >> >> Barry >> >> >> >> On May 22, 2013, at 6:31 PM, Colin McAuliffe wrote: >> >>> Hi all, >>> >>> When calling PetscMemoryGetCurrentUsage or PetscMemoryGetMaximumUsage in a fortran code the memory usage returned is always zero. Adding command line options such as -malloc_log and -memory_info doesn't change this result. Is there something else I'm missing? Also, will use of these two functions give memory used by external packaged called by petsc? >>> >>> All the best, >>> Colin >>> >>> -- >>> Colin McAuliffe >>> PhD Candidate >>> Columbia University >>> Department of Civil Engineering and Engineering Mechanics >> >> >> > > > > -- > Colin McAuliffe > PhD Candidate > Columbia University > Department of Civil Engineering and Engineering Mechanics From jedbrown at mcs.anl.gov Wed May 22 19:33:36 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 22 May 2013 19:33:36 -0500 Subject: [petsc-users] SNESSetJacobian and TSSetRHSJacobian In-Reply-To: References: Message-ID: <87d2sittzz.fsf@mcs.anl.gov> Peter Brune writes: >> And get the SNES context from TS, then set the Jacobian option this way: >> >> SNES snes; >> >> TSGetSNES(ts, &snes); >> >> SNESSetJacobian(snes, J, J, SNESDefaultComputeJacobian, PETSC_NULL); >> >> >> > This should work as well. Note that SNESDefaultComputeJacobian is quite > expensive and should be used for testing only, which is why I would > hesitate to hard-code its use. Also note that SNESDefaultComputeJacobian is named SNESComputeJacobianDefault now. If you already configured the nonzero structure, you can use -snes_fd_color or SNESComputeJacobianDefaultColor. From cjm2176 at columbia.edu Wed May 22 20:32:23 2013 From: cjm2176 at columbia.edu (Colin McAuliffe) Date: Wed, 22 May 2013 21:32:23 -0400 Subject: [petsc-users] Memory logging in fortran In-Reply-To: <79102B9C-7A4D-49FC-97E7-1C7CA4BBFD44@mcs.anl.gov> References: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> <20130522200637.qhvyqsblco8gswcs@cubmail.cc.columbia.edu> <79102B9C-7A4D-49FC-97E7-1C7CA4BBFD44@mcs.anl.gov> Message-ID: <20130522213223.nltz7tpf9csc0k4w@cubmail.cc.columbia.edu> According to gdb the program goes here 110: #elif defined(PETSC_HAVE_TASK_INFO) 111: *mem = 0; The configuration flags from my petsc compilation indicate that both PETSC_HAVE_TASK_INFO and PETSC_HAVE_GETRUSAGE are 1. I'm not sure what have task info is supposed to do, is it safe to set it to 0 to see if this fixes the problem? Quoting Barry Smith : > > On May 22, 2013, at 7:06 PM, Colin McAuliffe wrote: > >> Hi Barry, thanks for the quick response. 
>> >> I am using os x and so I would expect that petsc would be able to >> use getrusage in the following lines of mem.c: >> >> 115: #elif defined(PETSC_HAVE_GETRUSAGE) >> 116: getrusage(RUSAGE_SELF,&temp); >> >> Is this not the case? > > You can run in the debugger, put a break point at that line to > verify it goes there and if it does check the values of temp after > the call. (Sometimes Apple's docs are more optimistic than reality.) > > Barry > >> >> Here is the link to apple's getrusage manual page >> http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/10.7/man2/getrusage.2.html >> >> Quoting Barry Smith : >> >>> >>> Colin, >>> >>> PetscMemoryGetCurrentUsage() and MaximumUsage() us calls to the >>> underlying operating system to get how much memory the process >>> is using, as such they will also include external package >>> memory. Unfortunately it depends on the operating system >>> providing this information and often it does not, this is why >>> you get 0. If you are a hacker you could look at the source for >>> PetscMemoryGetCurrentUsage() and see how to get the information >>> for your OS and add it to this routine. >>> >>> Barry >>> >>> >>> >>> On May 22, 2013, at 6:31 PM, Colin McAuliffe wrote: >>> >>>> Hi all, >>>> >>>> When calling PetscMemoryGetCurrentUsage or >>>> PetscMemoryGetMaximumUsage in a fortran code the memory usage >>>> returned is always zero. Adding command line options such as >>>> -malloc_log and -memory_info doesn't change this result. Is there >>>> something else I'm missing? Also, will use of these two >>>> functions give memory used by external packaged called by petsc? >>>> >>>> All the best, >>>> Colin >>>> >>>> -- >>>> Colin McAuliffe >>>> PhD Candidate >>>> Columbia University >>>> Department of Civil Engineering and Engineering Mechanics >>> >>> >>> >> >> >> >> -- >> Colin McAuliffe >> PhD Candidate >> Columbia University >> Department of Civil Engineering and Engineering Mechanics > > > -- Colin McAuliffe PhD Candidate Columbia University Department of Civil Engineering and Engineering Mechanics From bsmith at mcs.anl.gov Wed May 22 20:48:45 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 22 May 2013 20:48:45 -0500 Subject: [petsc-users] Memory logging in fortran In-Reply-To: <20130522213223.nltz7tpf9csc0k4w@cubmail.cc.columbia.edu> References: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> <20130522200637.qhvyqsblco8gswcs@cubmail.cc.columbia.edu> <79102B9C-7A4D-49FC-97E7-1C7CA4BBFD44@mcs.anl.gov> <20130522213223.nltz7tpf9csc0k4w@cubmail.cc.columbia.edu> Message-ID: <2E309698-F01F-4D2A-8C7B-0423571890A5@mcs.anl.gov> On May 22, 2013, at 8:32 PM, Colin McAuliffe wrote: > According to gdb the program goes here > > 110: #elif defined(PETSC_HAVE_TASK_INFO) > 111: *mem = 0; > > The configuration flags from my petsc compilation indicate that both PETSC_HAVE_TASK_INFO and PETSC_HAVE_GETRUSAGE are 1. I'm not sure what have task info is supposed to do, is it safe to set it to 0 to see if this fixes the problem? Remove the PETSC_HAVE_TASK_INFO stuff completely and then rerun make and see if it solves the problem. At some point the TASK_INFO stuff worked on the Apple some years ago, perhaps they turned if off. Barry > > Quoting Barry Smith : > >> >> On May 22, 2013, at 7:06 PM, Colin McAuliffe wrote: >> >>> Hi Barry, thanks for the quick response. 
>>> >>> I am using os x and so I would expect that petsc would be able to use getrusage in the following lines of mem.c: >>> >>> 115: #elif defined(PETSC_HAVE_GETRUSAGE) >>> 116: getrusage(RUSAGE_SELF,&temp); >>> >>> Is this not the case? >> >> You can run in the debugger, put a break point at that line to verify it goes there and if it does check the values of temp after the call. (Sometimes Apple's docs are more optimistic than reality.) >> >> Barry >> >>> >>> Here is the link to apple's getrusage manual page >>> http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/10.7/man2/getrusage.2.html >>> >>> Quoting Barry Smith : >>> >>>> >>>> Colin, >>>> >>>> PetscMemoryGetCurrentUsage() and MaximumUsage() us calls to the underlying operating system to get how much memory the process is using, as such they will also include external package memory. Unfortunately it depends on the operating system providing this information and often it does not, this is why you get 0. If you are a hacker you could look at the source for PetscMemoryGetCurrentUsage() and see how to get the information for your OS and add it to this routine. >>>> >>>> Barry >>>> >>>> >>>> >>>> On May 22, 2013, at 6:31 PM, Colin McAuliffe wrote: >>>> >>>>> Hi all, >>>>> >>>>> When calling PetscMemoryGetCurrentUsage or PetscMemoryGetMaximumUsage in a fortran code the memory usage returned is always zero. Adding command line options such as -malloc_log and -memory_info doesn't change this result. Is there something else I'm missing? Also, will use of these two functions give memory used by external packaged called by petsc? >>>>> >>>>> All the best, >>>>> Colin >>>>> >>>>> -- >>>>> Colin McAuliffe >>>>> PhD Candidate >>>>> Columbia University >>>>> Department of Civil Engineering and Engineering Mechanics >>>> >>>> >>>> >>> >>> >>> >>> -- >>> Colin McAuliffe >>> PhD Candidate >>> Columbia University >>> Department of Civil Engineering and Engineering Mechanics >> >> >> > > > > -- > Colin McAuliffe > PhD Candidate > Columbia University > Department of Civil Engineering and Engineering Mechanics From balay at mcs.anl.gov Wed May 22 21:29:55 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 22 May 2013 21:29:55 -0500 (CDT) Subject: [petsc-users] =?gb2312?b?tPC4tDogILTwuLQ6ICC08Li0OiAgtPC4tDogtPA=?= =?gb2312?b?uLQ6ILTwuLQ6ILTwuLQ6ILTwuLQ6INeqt6I6IFBFVHNjIHByb2Js?= =?gb2312?b?ZW0=?= In-Reply-To: <6318D45649EFFA44BCB5CE480854635E04E1E567C2F8@peramail.mail.cn> References: <6318D45649EFFA44BCB5CE480854635E04E1E567C2AF@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B6@peramail.mail.cn> <87vc6rjoda.fsf@mcs.anl.gov> <6318D45649EFFA44BCB5CE480854635E04E1E567C2B8@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2BA@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E0@peramail.mail.cn> <6318D45649EFFA44BCB5CE480854635E04E1E567C2E9@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2EF@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2F6@peramail.mail.cn>, <6318D45649EFFA44BCB5CE480854635E04E1E567C2F8@peramail.mail.cn> Message-ID: On Thu, 23 May 2013, ?? wrote: > How can I just copy/paste what you wrote? Hm - thats not what I asked. Perhaps I'm not communicating properly. I was hoping *you* would copy/paste the complete session from your *cmd* terminal - this way I can see what you did. [insted of you typing in e-mail 'I did this and then I got that'] Or are you having trouble using copy/paste from cmd terminal? 
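For reference, once MPICH is installed from mpich.org, a Windows configure line typically looks something along these lines (the paths are placeholders and will differ per machine):

$ ./configure --with-cc='win32fe cl' --with-fc=0 --with-mpi-dir=/cygdrive/c/PROGRA~1/MPICH2

as described on the Windows installation page linked earlier in this thread.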
> I run configure, and it shows:
>
> cannot download--install MPICH under Windows. Suggest to install MPICH manually

[again you are transcribing here - instead of copy/pasting the message from cmd terminal]

As the message indicates - you should install MPICH for Windows from http://www.mpich.org/downloads/ [and not use --download-mpich - when using MS compilers on windows]

Satish

>
> Thanks!
>
> Hao
>
> ________________________________________
> From: Satish Balay [balay at mcs.anl.gov]
> Sent: May 22, 2013 22:16
> To: ??
> Cc: petsc-users
> Subject: Re: [petsc-users] Re: Re: Re: Re: Re: Re: Re: Fwd: PETsc problem
>
> On Wed, 22 May 2013, ?? wrote:
>
> > Because the difference in directory, firstly, I enter in d:/Program Files/VC (in VS cmd)
> > cl OK.cpp
> > it shows:
> > > Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86
> > > Copyright Microsoft Corposration. All rights reserved.
> > >
> > > OK.cpp
> > > Microsoft Incremental Linker Version 10.00.30319.01
> > > Copyright MIcrosoft Corporation. All rights reserved.
> > >
> > > /out:OK.exe
> > > OK.obj
> >
> > then
> > echo %errorlevel%
> > 0
> >
> > then, I login into cygwin in VS cmd, enter
> > cd /cygdrive/d/Program\ Files/VC
> > cl OK.cpp
> > with exactly the same output as you wrote.
> >
> > echo $?
> > 0
> >
> > then login into the directory that petsc is in:
>
> I would have preferred a copy/paste of everything on the terminal - instead of this type of transcribing. :(
>
> > > $ /home/petsc-3.3-p6/bin/win32fe/win32fe cl -c OK.cpp
> > OK.cpp
>
> Are you sure you didn't type some other command in between these two commands? Again a proper copy/paste would have helped.
>
> > > $echo $?
> > 0
>
> ok [if there was no other command in between the above two commands] it's indicating that the compiler is ok here. Can you now try configure from this same terminal? [but first recheck the above command's return code]
>
> Satish
>
> > Thanks!
> >
> > Hao
> >
> > Hao
> >
> > ________________________________________
> > From: Satish Balay [balay at mcs.anl.gov]
> > Sent: May 21, 2013 22:42
> > To: ??
> > Cc: petsc-users at mcs.anl.gov
> > Subject: Re: [petsc-users] Re: Re: Re: Re: Re: Re: Fwd: PETsc problem
> >
> > On Tue, 21 May 2013, ?? wrote:
> >
> > > I have a program OK.cpp in the directory d:/Program Files/VC/
> >
> > that's a strange location to do compiles.
> >
> > > #incldue
> >
> > hm - typo?
> >
> > > int main()
> > > {
> > > printf("OK");
> > > return 0;
> > > }
> > >
> > > after cl OK.cpp ,
> > >
> > > Microsoft 32-bit C/C++ Optimizing Compiler Version 16.00.30319.01 for 80X86
> > > Copyright Microsoft Corposration. All rights reserved.
> > >
> > > OK.cpp
> > > Microsoft Incremental Linker Version 10.00.30319.01
> > > Copyright MIcrosoft Corporation. All rights reserved.
> > >
> > > /out:OK.exe
> > > OK.obj
> > >
> > > Thanks!
> >
> > Please copy/paste the complete output from the following sequence of
> > commands.
> >
> > Satish
> >
> > ---------
> >
> > C:\cygwin\home\balay\junk>cl OK.c
> > Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64
> > Copyright (C) Microsoft Corporation. All rights reserved.
> >
> > OK.c
> > Microsoft (R) Incremental Linker Version 8.00.50727.762
> > Copyright (C) Microsoft Corporation. All rights reserved.
> >
> > /out:OK.exe
> > OK.obj
> >
> > C:\cygwin\home\balay\junk>echo %errorlevel%
> > 0
> >
> > C:\cygwin\home\balay\junk>c:\cygwin\bin\bash --login
> >
> > balay at msnehalem2 ~
> > $ cd junk/
> >
> > balay at msnehalem2 ~/junk
> > $ cl OK.c
> > Microsoft (R) C/C++ Optimizing Compiler Version 14.00.50727.762 for x64
> > Copyright (C) Microsoft Corporation. All rights reserved.
> >
> > OK.c
> > Microsoft (R) Incremental Linker Version 8.00.50727.762
> > Copyright (C) Microsoft Corporation. All rights reserved.
> >
> > /out:OK.exe
> > OK.obj
> >
> > balay at msnehalem2 ~/junk
> > $ echo $?
> > 0
> >
> > balay at msnehalem2 ~/junk
> > $ ~/petsc.clone/bin/win32fe/win32fe cl -c OK.c
> > OK.c
> >
> > balay at msnehalem2 ~/junk
> > $ echo $?
> > 0
> >

From cjm2176 at columbia.edu Wed May 22 23:21:48 2013
From: cjm2176 at columbia.edu (Colin McAuliffe)
Date: Thu, 23 May 2013 00:21:48 -0400
Subject: [petsc-users] Memory logging in fortran
In-Reply-To: <2E309698-F01F-4D2A-8C7B-0423571890A5@mcs.anl.gov>
References: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> <20130522200637.qhvyqsblco8gswcs@cubmail.cc.columbia.edu> <79102B9C-7A4D-49FC-97E7-1C7CA4BBFD44@mcs.anl.gov> <20130522213223.nltz7tpf9csc0k4w@cubmail.cc.columbia.edu> <2E309698-F01F-4D2A-8C7B-0423571890A5@mcs.anl.gov>
Message-ID: <20130523002148.9mo84ilqssko8g8g@cubmail.cc.columbia.edu>

Removing the have task info stuff leads to getrusage being called. Though there must be a bug in the os x getrusage because the max resident set size it returns is too large to be right. I realize this is not related to petsc, but if you are interested, the python script:

import resource
import sys
t = range(0,10000)
print "Size of t is ",sys.getsizeof(t),"bytes"
print "Max resident set size is ",resource.getrusage(resource.RUSAGE_SELF)[2]*1024,"bytes"

gives the output:

Size of t is 80072 bytes
Max resident set size is 4521459712 bytes

I would guess the two values should be in the same ballpark, and actually the computed resident set size is larger than the amount of memory I have to begin with! Hopefully the apple developers would be able to help.

Thanks
Colin

Quoting Barry Smith :

>
> On May 22, 2013, at 8:32 PM, Colin McAuliffe wrote:
>
>> According to gdb the program goes here
>>
>> 110: #elif defined(PETSC_HAVE_TASK_INFO)
>> 111: *mem = 0;
>>
>> The configuration flags from my petsc compilation indicate that
>> both PETSC_HAVE_TASK_INFO and PETSC_HAVE_GETRUSAGE are 1. I'm not
>> sure what have task info is supposed to do, is it safe to set it to
>> 0 to see if this fixes the problem?
>
> Remove the PETSC_HAVE_TASK_INFO stuff completely and then rerun
> make and see if it solves the problem.
>
> At some point the TASK_INFO stuff worked on the Apple some
> years ago, perhaps they turned it off.
>
> Barry
>
>>
>> Quoting Barry Smith :
>>
>>>
>>> On May 22, 2013, at 7:06 PM, Colin McAuliffe wrote:
>>>
>>>> Hi Barry, thanks for the quick response.
>>>>
>>>> I am using os x and so I would expect that petsc would be able to
>>>> use getrusage in the following lines of mem.c:
>>>>
>>>> 115: #elif defined(PETSC_HAVE_GETRUSAGE)
>>>> 116: getrusage(RUSAGE_SELF,&temp);
>>>>
>>>> Is this not the case?
>>>
>>> You can run in the debugger, put a break point at that line to
>>> verify it goes there and if it does check the values of temp after
>>> the call. (Sometimes Apple's docs are more optimistic than
>>> reality.)
>>>
>>> Barry
>>>
>>>> Here is the link to apple's getrusage manual page
>>>> http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/10.7/man2/getrusage.2.html
>>>>
>>>> Quoting Barry Smith :
>>>>
>>>>> Colin,
>>>>>
>>>>> PetscMemoryGetCurrentUsage() and MaximumUsage() use calls to
>>>>> the underlying operating system to get how much memory the
>>>>> process is using, as such they will also include external
>>>>> package memory. Unfortunately it depends on the operating
>>>>> system providing this information and often it does not, this
>>>>> is why you get 0. If you are a hacker you could look at the
>>>>> source for PetscMemoryGetCurrentUsage() and see how to get the
>>>>> information for your OS and add it to this routine.
>>>>>
>>>>> Barry
>>>>>
>>>>> On May 22, 2013, at 6:31 PM, Colin McAuliffe wrote:
>>>>>
>>>>>> Hi all,
>>>>>>
>>>>>> When calling PetscMemoryGetCurrentUsage or
>>>>>> PetscMemoryGetMaximumUsage in a fortran code the memory usage
>>>>>> returned is always zero. Adding command line options such as
>>>>>> -malloc_log and -memory_info doesn't change this result. Is
>>>>>> there something else I'm missing? Also, will use of these two
>>>>>> functions give memory used by external packages called by
>>>>>> petsc?
>>>>>>
>>>>>> All the best,
>>>>>> Colin
>>>>>>
>>>>>> --
>>>>>> Colin McAuliffe
>>>>>> PhD Candidate
>>>>>> Columbia University
>>>>>> Department of Civil Engineering and Engineering Mechanics
>>>>
>>>> --
>>>> Colin McAuliffe
>>>> PhD Candidate
>>>> Columbia University
>>>> Department of Civil Engineering and Engineering Mechanics
>>
>> --
>> Colin McAuliffe
>> PhD Candidate
>> Columbia University
>> Department of Civil Engineering and Engineering Mechanics

--
Colin McAuliffe
PhD Candidate
Columbia University
Department of Civil Engineering and Engineering Mechanics

From pengxwang at hotmail.com Thu May 23 09:19:09 2013
From: pengxwang at hotmail.com (Roc Wang)
Date: Thu, 23 May 2013 09:19:09 -0500
Subject: [petsc-users] DMDACoor3d and VecView
Message-ID:

Hi, I am trying to plot the solution of a 3-D Poisson equation. The solution vector, Vec sol, is output to a file in binary format and then will be processed with a stand-alone program in which VecLoad() is called. The number of processes required in the stand-alone post-processing program can be different from that in the solver program. That is, I can ask one process to create the data file for plotting. This way it is much easier to combine the distributed vector from several processes into one process.

The coordinates of nodes are needed to plot the 3-D distributions of the solution. The matrix and the vector are managed with DMDA, so the array for the coordinates can be obtained by DMDAGetCoordinateDA(da,&cda) and DMDAVecGetArray(cda,gc,&coors). Here, coors is defined as DMDACoor3d ***coors. Since the function VecView can only output one vector, Vec sol, to a binary file, the array of coordinates must be output to another file and thus must be read separately.

The problem is that the array of coordinates is a local array and the number of processes in the post-processing program must be the same as that in the solver. This means the Vec sol cannot be combined into one by only calling VecLoad(). Is there any efficient way to output the coordinates and combine them from several processes into one process, like the way of writing and reading Vec sol?
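For reference, a trimmed sketch of the coordinate-access pattern just described (petsc-3.3 names; da is assumed to be the 3-D DMDA of the problem, so this is illustrative rather than the exact code in question):

DM cda;
Vec gc;
DMDACoor3d ***coors;
ierr = DMDAGetCoordinateDA(da,&cda);CHKERRQ(ierr);   /* DMDA describing the coordinate layout */
ierr = DMDAGetCoordinates(da,&gc);CHKERRQ(ierr);     /* global (non-ghosted) coordinate Vec */
ierr = DMDAVecGetArray(cda,gc,&coors);CHKERRQ(ierr);
/* coors[k][j][i].x, coors[k][j][i].y, coors[k][j][i].z are the node coordinates owned by this process */
ierr = DMDAVecRestoreArray(cda,gc,&coors);CHKERRQ(ierr);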
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From stali at geology.wisc.edu Thu May 23 09:42:41 2013
From: stali at geology.wisc.edu (Tabrez Ali)
Date: Thu, 23 May 2013 09:42:41 -0500
Subject: [petsc-users] PETSc functions timeline
Message-ID: <519E2AE1.5090404@geology.wisc.edu>

Hello

How can I quickly check when exactly a new function was added to a release? The changes webpage doesn't seem to list new functions.

E.g., in which release was "vecgetsubvector" added?

Thanks in advance.

Tabrez

From jedbrown at mcs.anl.gov Thu May 23 10:10:19 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Thu, 23 May 2013 10:10:19 -0500
Subject: [petsc-users] DMDACoor3d and VecView
In-Reply-To:
References:
Message-ID: <87sj1dspes.fsf@mcs.anl.gov>

Roc Wang writes:

> The coordinates of nodes are needed to plot the 3-D distributions of
> the solution. The matrix and the vector are managed with DMDA, so the
> array for the coordinates can be obtained by
> DMDAGetCoordinateDA(da,&cda) and DMDAVecGetArray(cda,gc,&coors).
> Here, coors is defined as DMDACoor3d ***coors. Since the function
> VecView can only output one vector, Vec sol, to a binary file, the
> array of coordinates must be output to another file and thus must be
> read separately.

Not true, just read them in the same order you wrote them.

> The problem is that the array of coordinates is a local array and the
> number of processes in the post-processing program must be the same as
> that in the solver.

DMGetCoordinates() returns a global Vec. Store that, not the local Vec.
From the names, you must have an older version of PETSc. Please upgrade
to petsc-3.4.

From jedbrown at mcs.anl.gov Thu May 23 10:15:41 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Thu, 23 May 2013 10:15:41 -0500
Subject: [petsc-users] PETSc functions timeline
In-Reply-To: <519E2AE1.5090404@geology.wisc.edu>
References: <519E2AE1.5090404@geology.wisc.edu>
Message-ID: <87ppwhsp5u.fsf@mcs.anl.gov>

Tabrez Ali writes:

> Hello
>
> How can I quickly check when exactly a new function was added to a
> release? The changes webpage doesn't seem to list new functions.
>
> E.g., in which release was "vecgetsubvector" added?

Look for the commit that introduced that string:

$ git log -S VecGetSubVector -- include/petscvec.h
commit 10a9aa378a782793fd9f203c1c09fe3cad29bf1c
Author: Jed Brown
Date: Wed Nov 24 15:29:22 2010 +0100

    Add VecGetSubVector and VecRestoreSubVector and test

    VecNest will implement this interface

    Hg-commit: eed11e97b1e3a27003b34b493491e2b849c3e0e4

Then describe it in terms of tags that contain it:

$ git describe --contains 10a9aa378a782793fd9f203c1c09fe3cad29bf1c
v3.2~1030^2~6

From pengxwang at hotmail.com Thu May 23 11:19:00 2013
From: pengxwang at hotmail.com (Roc Wang)
Date: Thu, 23 May 2013 11:19:00 -0500
Subject: [petsc-users] DMDACoor3d and VecView
In-Reply-To: <87sj1dspes.fsf@mcs.anl.gov>
References: , <87sj1dspes.fsf@mcs.anl.gov>
Message-ID:

> From: jedbrown at mcs.anl.gov
> To: pengxwang at hotmail.com; petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] DMDACoor3d and VecView
> Date: Thu, 23 May 2013 10:10:19 -0500
>
> Roc Wang writes:
>
> > The coordinates of nodes are needed to plot the 3-D distributions of
> > the solution. The matrix and the vector are managed with DMDA, so the
> > array for the coordinates can be obtained by
> > DMDAGetCoordinateDA(da,&cda) and DMDAVecGetArray(cda,gc,&coors).
> > Here, coors is defined as DMDACoor3d ***coors.
> > Since the function
> > VecView can only output one vector, Vec sol, to a binary file, the
> > array of coordinates must be output to another file and thus must be
> > read separately.
>
> Not true, just read them in the same order you wrote them.

Does this mean I can save the vectors in the same binary file? If yes, is the following procedure correct?

/* writing vectors in solver program */
DM cda;
Vec gc;
ierr = DMDAGetCoordinateDA(da,&cda);CHKERRQ(ierr);
ierr = DMDAGetCoordinates(cda, &gc);CHKERRQ(ierr);

ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
ierr = VecView(sol,viewer);CHKERRQ(ierr);  //write solution vector
ierr = VecView(gc,viewer);CHKERRQ(ierr);   //write coordinate vector
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

/* reading vectors in a stand-alone program */
ierr = VecCreate(PETSC_COMM_WORLD,&sol);CHKERRQ(ierr);
ierr = VecCreate(PETSC_COMM_WORLD,&gc);CHKERRQ(ierr);
ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
ierr = VecLoad(sol,viewer);CHKERRQ(ierr);  //read solution vector
ierr = VecLoad(gc,viewer);CHKERRQ(ierr);   //read coordinate vector
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

> > The problem is that the array of coordinates is a local array and the
> > number of processes in the post-processing program must be the same as
> > that in the solver.
>
> DMGetCoordinates() returns a global Vec. Store that, not the local Vec.
>
> From the names, you must have an older version of PETSc. Please upgrade
> to petsc-3.4.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jedbrown at mcs.anl.gov Thu May 23 11:31:59 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Thu, 23 May 2013 11:31:59 -0500
Subject: [petsc-users] DMDACoor3d and VecView
In-Reply-To:
References: <87sj1dspes.fsf@mcs.anl.gov>
Message-ID: <87ehcxslmo.fsf@mcs.anl.gov>

Roc Wang writes:

>> From: jedbrown at mcs.anl.gov
>> To: pengxwang at hotmail.com; petsc-users at mcs.anl.gov
>> Subject: Re: [petsc-users] DMDACoor3d and VecView
>> Date: Thu, 23 May 2013 10:10:19 -0500
>>
>> Roc Wang writes:
>>
>> > The coordinates of nodes are needed to plot the 3-D distributions of
>> > the solution. The matrix and the vector are managed with DMDA, so the
>> > array for the coordinates can be obtained by
>> > DMDAGetCoordinateDA(da,&cda) and DMDAVecGetArray(cda,gc,&coors).
>> > Here, coors is defined as DMDACoor3d ***coors. Since the function
>> > VecView can only output one vector, Vec sol, to a binary file, the
>> > array of coordinates must be output to another file and thus must be
>> > read separately.
>>
>> Not true, just read them in the same order you wrote them.
>
> Does this mean I can save the vectors in the same binary file? If yes, is the following procedure correct?
> /* writing vectors in solver program */
> DM cda;
> Vec gc;
> ierr = DMDAGetCoordinateDA(da,&cda);CHKERRQ(ierr);
> ierr = DMDAGetCoordinates(cda, &gc);CHKERRQ(ierr);
>
> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
> ierr = VecView(sol,viewer);CHKERRQ(ierr);  //write solution vector
> ierr = VecView(gc,viewer);CHKERRQ(ierr);   //write coordinate vector
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
>
> /* reading vectors in a stand-alone program */
> ierr = VecCreate(PETSC_COMM_WORLD,&sol);CHKERRQ(ierr);
> ierr = VecCreate(PETSC_COMM_WORLD,&gc);CHKERRQ(ierr);

Create your DM and coordinate DM, then

DMCreateGlobalVector(da,&sol);
DMCreateGlobalVector(cda,&gc);

then below

> ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
> ierr = VecLoad(sol,viewer);CHKERRQ(ierr);  //read solution vector
> ierr = VecLoad(gc,viewer);CHKERRQ(ierr);   //read coordinate vector
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

Alternatively, write the DM too so that you can load it instead of making it from out-of-band knowledge of sizes:

ierr = DMView(da,viewer);CHKERRQ(ierr);
ierr = VecView(global,viewer);CHKERRQ(ierr);
ierr = DMView(cda,viewer);CHKERRQ(ierr);
ierr = VecView(gc,viewer);CHKERRQ(ierr);

and read with

ierr = DMLoad(da,viewer);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(da,&sol);CHKERRQ(ierr);
ierr = VecLoad(sol,viewer);CHKERRQ(ierr);
ierr = DMLoad(cda,viewer);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(cda,&gc);CHKERRQ(ierr);
ierr = VecLoad(gc,viewer);CHKERRQ(ierr);

See src/dm/examples/tests/ex14.c and ex13.c.

From cjm2176 at columbia.edu Thu May 23 16:11:17 2013
From: cjm2176 at columbia.edu (Colin McAuliffe)
Date: Thu, 23 May 2013 17:11:17 -0400
Subject: [petsc-users] Memory logging in fortran
In-Reply-To: <2E309698-F01F-4D2A-8C7B-0423571890A5@mcs.anl.gov>
References: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> <20130522200637.qhvyqsblco8gswcs@cubmail.cc.columbia.edu> <79102B9C-7A4D-49FC-97E7-1C7CA4BBFD44@mcs.anl.gov> <20130522213223.nltz7tpf9csc0k4w@cubmail.cc.columbia.edu> <2E309698-F01F-4D2A-8C7B-0423571890A5@mcs.anl.gov>
Message-ID: <20130523171117.n4mvxw1lfkw4sw00@cubmail.cc.columbia.edu>

So after comparing the results of getrusage with guppy, a python memory profiling tool, it seems the os x getrusage is returning bytes in contrast to the os x man pages which say the results are in kb. Doing the same tests on a linux machine, it looks like getrusage is returning kb as it says in the documentation. Anyway it is no problem to correct the results of the petsc memory functions from my application code. Thanks for all your help!

Colin

Quoting Barry Smith :

>
> On May 22, 2013, at 8:32 PM, Colin McAuliffe wrote:
>
>> According to gdb the program goes here
>>
>> 110: #elif defined(PETSC_HAVE_TASK_INFO)
>> 111: *mem = 0;
>>
>> The configuration flags from my petsc compilation indicate that
>> both PETSC_HAVE_TASK_INFO and PETSC_HAVE_GETRUSAGE are 1. I'm not
>> sure what have task info is supposed to do, is it safe to set it to
>> 0 to see if this fixes the problem?
> > Remove the PETSC_HAVE_TASK_INFO stuff completely and then rerun > make and see if it solves the problem. > > At some point the TASK_INFO stuff worked on the Apple some > years ago, perhaps they turned if off. > > Barry > >> >> Quoting Barry Smith : >> >>> >>> On May 22, 2013, at 7:06 PM, Colin McAuliffe wrote: >>> >>>> Hi Barry, thanks for the quick response. >>>> >>>> I am using os x and so I would expect that petsc would be able to >>>> use getrusage in the following lines of mem.c: >>>> >>>> 115: #elif defined(PETSC_HAVE_GETRUSAGE) >>>> 116: getrusage(RUSAGE_SELF,&temp); >>>> >>>> Is this not the case? >>> >>> You can run in the debugger, put a break point at that line to >>> verify it goes there and if it does check the values of temp after >>> the call. (Sometimes Apple's docs are more optimistic than >>> reality.) >>> >>> Barry >>> >>>> >>>> Here is the link to apple's getrusage manual page >>>> http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/10.7/man2/getrusage.2.html >>>> >>>> Quoting Barry Smith : >>>> >>>>> >>>>> Colin, >>>>> >>>>> PetscMemoryGetCurrentUsage() and MaximumUsage() us calls to >>>>> the underlying operating system to get how much memory the >>>>> process is using, as such they will also include external >>>>> package memory. Unfortunately it depends on the operating >>>>> system providing this information and often it does not, this >>>>> is why you get 0. If you are a hacker you could look at the >>>>> source for PetscMemoryGetCurrentUsage() and see how to get the >>>>> information for your OS and add it to this routine. >>>>> >>>>> Barry >>>>> >>>>> >>>>> >>>>> On May 22, 2013, at 6:31 PM, Colin McAuliffe >>>>> wrote: >>>>> >>>>>> Hi all, >>>>>> >>>>>> When calling PetscMemoryGetCurrentUsage or >>>>>> PetscMemoryGetMaximumUsage in a fortran code the memory usage >>>>>> returned is always zero. Adding command line options such as >>>>>> -malloc_log and -memory_info doesn't change this result. Is >>>>>> there something else I'm missing? Also, will use of these two >>>>>> functions give memory used by external packaged called by >>>>>> petsc? >>>>>> >>>>>> All the best, >>>>>> Colin >>>>>> >>>>>> -- >>>>>> Colin McAuliffe >>>>>> PhD Candidate >>>>>> Columbia University >>>>>> Department of Civil Engineering and Engineering Mechanics >>>>> >>>>> >>>>> >>>> >>>> >>>> >>>> -- >>>> Colin McAuliffe >>>> PhD Candidate >>>> Columbia University >>>> Department of Civil Engineering and Engineering Mechanics >>> >>> >>> >> >> >> >> -- >> Colin McAuliffe >> PhD Candidate >> Columbia University >> Department of Civil Engineering and Engineering Mechanics > > > -- Colin McAuliffe PhD Candidate Columbia University Department of Civil Engineering and Engineering Mechanics From bsmith at mcs.anl.gov Thu May 23 18:45:53 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 23 May 2013 18:45:53 -0500 Subject: [petsc-users] Memory logging in fortran In-Reply-To: <20130523171117.n4mvxw1lfkw4sw00@cubmail.cc.columbia.edu> References: <20130522193114.fhzr4cr5cs0wwgow@cubmail.cc.columbia.edu> <9ACC708D-A51C-4DD3-B94E-8A30D6E28F62@mcs.anl.gov> <20130522200637.qhvyqsblco8gswcs@cubmail.cc.columbia.edu> <79102B9C-7A4D-49FC-97E7-1C7CA4BBFD44@mcs.anl.gov> <20130522213223.nltz7tpf9csc0k4w@cubmail.cc.columbia.edu> <2E309698-F01F-4D2A-8C7B-0423571890A5@mcs.anl.gov> <20130523171117.n4mvxw1lfkw4sw00@cubmail.cc.columbia.edu> Message-ID: Colin, Thanks for the info. 
We'll try to reorganize getting the memory size on the Apple using your information and get it into a patch. Barry On May 23, 2013, at 4:11 PM, Colin McAuliffe wrote: > So after comparing the results of getrusage with guppy, a python memory profiling tool, it seems the os x getrusage is returning bytes in contrast to the os x man pages which say the results are in kb. Doing the same tests on a linux machine, it looks like getrusage is returning kb as it says in the documentation. Anyway it is not problem to correct the results of the petsc memory functions from my application code. Thanks for all your help! > > Colin > > Quoting Barry Smith : > >> >> On May 22, 2013, at 8:32 PM, Colin McAuliffe wrote: >> >>> According to gdb the program goes here >>> >>> 110: #elif defined(PETSC_HAVE_TASK_INFO) >>> 111: *mem = 0; >>> >>> The configuration flags from my petsc compilation indicate that both PETSC_HAVE_TASK_INFO and PETSC_HAVE_GETRUSAGE are 1. I'm not sure what have task info is supposed to do, is it safe to set it to 0 to see if this fixes the problem? >> >> Remove the PETSC_HAVE_TASK_INFO stuff completely and then rerun make and see if it solves the problem. >> >> At some point the TASK_INFO stuff worked on the Apple some years ago, perhaps they turned if off. >> >> Barry >> >>> >>> Quoting Barry Smith : >>> >>>> >>>> On May 22, 2013, at 7:06 PM, Colin McAuliffe wrote: >>>> >>>>> Hi Barry, thanks for the quick response. >>>>> >>>>> I am using os x and so I would expect that petsc would be able to use getrusage in the following lines of mem.c: >>>>> >>>>> 115: #elif defined(PETSC_HAVE_GETRUSAGE) >>>>> 116: getrusage(RUSAGE_SELF,&temp); >>>>> >>>>> Is this not the case? >>>> >>>> You can run in the debugger, put a break point at that line to verify it goes there and if it does check the values of temp after the call. (Sometimes Apple's docs are more optimistic than reality.) >>>> >>>> Barry >>>> >>>>> >>>>> Here is the link to apple's getrusage manual page >>>>> http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/10.7/man2/getrusage.2.html >>>>> >>>>> Quoting Barry Smith : >>>>> >>>>>> >>>>>> Colin, >>>>>> >>>>>> PetscMemoryGetCurrentUsage() and MaximumUsage() us calls to the underlying operating system to get how much memory the process is using, as such they will also include external package memory. Unfortunately it depends on the operating system providing this information and often it does not, this is why you get 0. If you are a hacker you could look at the source for PetscMemoryGetCurrentUsage() and see how to get the information for your OS and add it to this routine. >>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>> >>>>>> On May 22, 2013, at 6:31 PM, Colin McAuliffe wrote: >>>>>> >>>>>>> Hi all, >>>>>>> >>>>>>> When calling PetscMemoryGetCurrentUsage or PetscMemoryGetMaximumUsage in a fortran code the memory usage returned is always zero. Adding command line options such as -malloc_log and -memory_info doesn't change this result. Is there something else I'm missing? Also, will use of these two functions give memory used by external packaged called by petsc? 
>>>>>>>
>>>>>>> All the best,
>>>>>>> Colin
>>>>>>>
>>>>>>> --
>>>>>>> Colin McAuliffe
>>>>>>> PhD Candidate
>>>>>>> Columbia University
>>>>>>> Department of Civil Engineering and Engineering Mechanics
>>>>>
>>>>> --
>>>>> Colin McAuliffe
>>>>> PhD Candidate
>>>>> Columbia University
>>>>> Department of Civil Engineering and Engineering Mechanics
>>>
>>> --
>>> Colin McAuliffe
>>> PhD Candidate
>>> Columbia University
>>> Department of Civil Engineering and Engineering Mechanics

> --
> Colin McAuliffe
> PhD Candidate
> Columbia University
> Department of Civil Engineering and Engineering Mechanics

From pengxwang at hotmail.com Thu May 23 21:25:51 2013
From: pengxwang at hotmail.com (Roc Wang)
Date: Thu, 23 May 2013 21:25:51 -0500
Subject: [petsc-users] DMDACoor3d and VecView
In-Reply-To: <87ehcxslmo.fsf@mcs.anl.gov>
References: , <87sj1dspes.fsf@mcs.anl.gov>, , <87ehcxslmo.fsf@mcs.anl.gov>
Message-ID:

Thanks, I followed the src/dm/examples/tests/ex14.c and ex13.c. There is still an error. Please take a look at the following:

//The portion of writing vector in Solver program
/* write da, solution Vec, Coordinate Da and Vec cda in binary format */
ierr = PetscLogEventRegister("Generate Vector",VEC_CLASSID,&VECTOR_WRITE);CHKERRQ(ierr);
ierr = PetscLogEventBegin(VECTOR_WRITE,0,0,0,0);CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD,"writing vector in binary to vector.bin ...\n");CHKERRQ(ierr);
ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
ierr = DMView(da,viewer);CHKERRQ(ierr);
ierr = VecView(x,viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
ierr = PetscLogEventEnd(VECTOR_WRITE,0,0,0,0);CHKERRQ(ierr);

//The portion of reading vector in the post-processing program
/* Read new vector in binary format */
DM da;
Vec uu;
ierr = PetscLogEventRegister("Read Vector",VEC_CLASSID,&VECTOR_READ);CHKERRQ(ierr);
ierr = PetscLogEventBegin(VECTOR_READ,0,0,0,0);CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from vector.bin ...\n");CHKERRQ(ierr);
ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
//ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A
ierr = DMLoad(da,viewer);CHKERRQ(ierr);
ierr = DMCreateGlobalVector(da,&uu);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
ierr = PetscLogEventEnd(VECTOR_READ,0,0,0,0);CHKERRQ(ierr);

If the line of ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A is commented out, the error is like:

[2]PETSC ERROR: --------------------- Error Message ------------------------------------
[2]PETSC ERROR: Invalid argument!
[2]PETSC ERROR: Wrong type of object: Parameter # 1!
[2]PETSC ERROR: ------------------------------------------------------------------------

If with ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A, the error is like:

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Arguments are incompatible!
[0]PETSC ERROR: Cannot change block size 3 to 1!
[0]PETSC ERROR: ------------------------------------------------------------------------

> From: jedbrown at mcs.anl.gov
> To: pengxwang at hotmail.com; petsc-users at mcs.anl.gov
> Subject: RE:
[petsc-users] DMDACoor3d and VecView > >> Date: Thu, 23 May 2013 10:10:19 -0500 > >> > >> Roc Wang writes: > >> > >> > The coordinates of nodes are needed to plot the 3-D distributions of > >> > the solution. The matrix and the vector are managed with DMDA, so the > >> > array for the coordinates can be obtained by > >> > DMDAGetCoordinateDA(da,&cda) and DMDAVecGetArray(cda,gc,&coors). > >> > Here, coors is defined as DMDACoor3d ***coors. Since the function > >> > VecVeiw can only output one vector, Vec sol, to a binary file , the > >> > array of coordinates must be output to another file and thus must be > >> > read separately. > >> > >> Not true, just read them in the same order you wrote them. > > > > Does this mean I can save the vectors in the same binary file? If yes, whether the following procedure is correct? > > /* writing vectors in solver program */ > > DM cda; > > Vec gc; > > ierr = DMDAGetCoordinateDA(da,&cda);CHKERRQ(ierr); > > ierr = DMDAGetCoordinates(cda, &gc);CHKERRQ(ierr); > > > > > > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr); > > ierr = VecView(sol,viewer);CHKERRQ(ierr); //write solution vector > > ierr = VecView(gc,viewer);CHKERRQ(ierr); //write coordinate vector > > ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); > > > > /* reading vectors in a stand-along program */ > > ierr = VecCreate(PETSC_COMM_WORLD,&sol);CHKERRQ(ierr); > > ierr = VecCreate(PETSC_COMM_WORLD,&gc);CHKERRQ(ierr); > > Create your DM and coordinate DM, then > > DMCreateGlobalVector(da,&sol); > DMCreateGlobalVector(cda,&gc); > > then below > > > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_READ,&viewer);CHKERRQ(ierr); > > ierr = VecLoad(sol,viewer);CHKERRQ(ierr); //read solution vector > > ierr = VecLoad(gc,viewer);CHKERRQ(ierr); //read coordinate vector > > ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); > > Alternatively, write the DM too so that you can load it instead of > making it from out-of-band knowledge of sizes: > > ierr = DMView(da,viewer);CHKERRQ(ierr); > ierr = VecView(global,viewer);CHKERRQ(ierr); > ierr = DMView(cda,viewer);CHKERRQ(ierr); > ierr = VecView(gc,viewer);CHKERRQ(ierr); > > and read with > > ierr = DMLoad(da,bviewer);CHKERRQ(ierr); > ierr = DMCreateGlobalVector(da,&sol);CHKERRQ(ierr); > ierr = VecLoad(sol,viewer);CHKERRQ(ierr); > ierr = DMLoad(cda,viewer);CHKERRQ(ierr); > ierr = DMCreateGlobalVector(cda,&gc);CHKERRQ(ierr); > ierr = VecLoad(gc,viewer);CHKERRQ(ierr); > > See src/dm/examples/tests/ex14.c and ex13.c. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Thu May 23 21:31:44 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 23 May 2013 21:31:44 -0500 Subject: [petsc-users] DMDACoor3d and VecView In-Reply-To: References: <87sj1dspes.fsf@mcs.anl.gov> <87ehcxslmo.fsf@mcs.anl.gov> Message-ID: <87wqqpp0q7.fsf@mcs.anl.gov> Roc Wang writes: > Thanks, I followed the src/dm/examples/tests/ex14.c and ex13.c. There is still > error. 
Please take a look at the followings: > //The portion of writing vector in Solver program > /* write da, solution Vec Coordinate Da and Vec cda in binary format */ > ierr = PetscLogEventRegister("Generate Vector",VEC_CLASSID,&VECTOR_WRITE); > CHKERRQ(ierr); > ierr = PetscLogEventBegin(VECTOR_WRITE,0,0,0,0);CHKERRQ(ierr); > ierr = PetscPrintf(PETSC_COMM_WORLD,"writing vector in binary to vector.bin > ...\n");CHKERRQ(ierr); > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_WRITE,& > viewer);CHKERRQ(ierr); > ierr = DMView(da,viewer);CHKERRQ(ierr); > ierr = VecView(x,viewer);CHKERRQ(ierr); > ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); > ierr = PetscLogEventEnd(VECTOR_WRITE,0,0,0,0);CHKERRQ(ierr); > //The portion of reading vector in the post-processing program > /* Read new vector in binary format */ > DM da; > Vec uu; > ierr = PetscLogEventRegister("Read Vector",VEC_CLASSID,&VECTOR_READ);CHKERRQ > (ierr); > ierr = PetscLogEventBegin(VECTOR_READ,0,0,0,0);CHKERRQ(ierr); > ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from vector.bin > ...\n");CHKERRQ(ierr); > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_READ,& > viewer);CHKERRQ(ierr); > //ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A > ierr = DMLoad(da,viewer);CHKERRQ(ierr); > ierr = DMCreateGlobalVector(da,&uu);CHKERRQ(ierr); > ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); > ierr = PetscLogEventEnd(VECTOR_READ,0,0,0,0);CHKERRQ(ierr); > If the line of ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A > is commented out, the error is like: > [2]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [2]PETSC ERROR: Invalid argument! > [2]PETSC ERROR: Wrong type of object: Parameter # 1! > [2]PETSC ERROR: > ------------------------------------------------------------------------ 1. ALWAYS paste the entire error message. 2. This is a memory error. > If with ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A, the > error is like: > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Arguments are incompatible! > [0]PETSC ERROR: Cannot change block size 3 to 1! > [0]PETSC ERROR: > ------------------------------------------------------------------------ Use VecSetOptionsPrefix(gc,"coord_") in both the code that writes and the code that reads, before VecView and VecLoad respectively. The simple PETSc binary format cannot distinguish the options unless you set different prefixes. From pengxwang at hotmail.com Thu May 23 22:23:37 2013 From: pengxwang at hotmail.com (Roc Wang) Date: Thu, 23 May 2013 22:23:37 -0500 Subject: [petsc-users] DMDACoor3d and VecView In-Reply-To: <87wqqpp0q7.fsf@mcs.anl.gov> References: <87sj1dspes.fsf@mcs.anl.gov> <87ehcxslmo.fsf@mcs.anl.gov> , <87wqqpp0q7.fsf@mcs.anl.gov> Message-ID: Thanks a lot. It works. 
Following is the code that reads: ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); ierr = DMLoad(da,viewer);CHKERRQ(ierr); ierr = DMCreateGlobalVector(da,&uu);CHKERRQ(ierr); ierr = VecSetOptionsPrefix(uu,"solution_");CHKERRQ(ierr); ierr = VecLoad(uu,viewer);CHKERRQ(ierr); ierr = DMCreate(PETSC_COMM_WORLD,&cda);CHKERRQ(ierr); ierr = DMLoad(cda,viewer);CHKERRQ(ierr); ierr = DMCreateGlobalVector(cda,&gc);CHKERRQ(ierr); ierr = VecSetOptionsPrefix(gc,"coord_");CHKERRQ(ierr); ierr = VecLoad(gc,viewer);CHKERRQ(ierr); > From: jedbrown at mcs.anl.gov > To: pengxwang at hotmail.com; petsc-users at mcs.anl.gov > Subject: RE: [petsc-users] DMDACoor3d and VecView > Date: Thu, 23 May 2013 21:31:44 -0500 > > Roc Wang writes: > > > Thanks, I followed the src/dm/examples/tests/ex14.c and ex13.c. There is still > > error. Please take a look at the followings: > > > //The portion of writing vector in Solver program > > /* write da, solution Vec Coordinate Da and Vec cda in binary format */ > > ierr = PetscLogEventRegister("Generate Vector",VEC_CLASSID,&VECTOR_WRITE); > > CHKERRQ(ierr); > > ierr = PetscLogEventBegin(VECTOR_WRITE,0,0,0,0);CHKERRQ(ierr); > > ierr = PetscPrintf(PETSC_COMM_WORLD,"writing vector in binary to vector.bin > > ...\n");CHKERRQ(ierr); > > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_WRITE,& > > viewer);CHKERRQ(ierr); > > > ierr = DMView(da,viewer);CHKERRQ(ierr); > > ierr = VecView(x,viewer);CHKERRQ(ierr); > > > ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); > > ierr = PetscLogEventEnd(VECTOR_WRITE,0,0,0,0);CHKERRQ(ierr); > > > //The portion of reading vector in the post-processing program > > /* Read new vector in binary format */ > > DM da; > > Vec uu; > > > ierr = PetscLogEventRegister("Read Vector",VEC_CLASSID,&VECTOR_READ);CHKERRQ > > (ierr); > > ierr = PetscLogEventBegin(VECTOR_READ,0,0,0,0);CHKERRQ(ierr); > > ierr = PetscPrintf(PETSC_COMM_WORLD,"reading vector in binary from vector.bin > > ...\n");CHKERRQ(ierr); > > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_READ,& > > viewer);CHKERRQ(ierr); > > > //ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A > > > ierr = DMLoad(da,viewer);CHKERRQ(ierr); > > ierr = DMCreateGlobalVector(da,&uu);CHKERRQ(ierr); > > > ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr); > > ierr = PetscLogEventEnd(VECTOR_READ,0,0,0,0);CHKERRQ(ierr); > > > If the line of ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A > > is commented out, the error is like: > > [2]PETSC ERROR: --------------------- Error Message > > ------------------------------------ > > [2]PETSC ERROR: Invalid argument! > > [2]PETSC ERROR: Wrong type of object: Parameter # 1! > > [2]PETSC ERROR: > > ------------------------------------------------------------------------ > > 1. ALWAYS paste the entire error message. > > 2. This is a memory error. > > > If with ierr = DMCreate(PETSC_COMM_WORLD,&da);CHKERRQ(ierr); //NOTE A, the > > error is like: > > [0]PETSC ERROR: --------------------- Error Message > > ------------------------------------ > > [0]PETSC ERROR: Arguments are incompatible! > > [0]PETSC ERROR: Cannot change block size 3 to 1! > > [0]PETSC ERROR: > > ------------------------------------------------------------------------ > > Use VecSetOptionsPrefix(gc,"coord_") in both the code that writes and > the code that reads, before VecView and VecLoad respectively. > > > The simple PETSc binary format cannot distinguish the options unless you > set different prefixes. 
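For completeness, the matching write side sets the same prefixes before VecView() -- a sketch assembled from the snippets earlier in this thread (assuming x is the solution Vec created on da, with cda and gc obtained from DMDAGetCoordinateDA() and DMDAGetCoordinates()):

ierr = VecSetOptionsPrefix(x,"solution_");CHKERRQ(ierr);
ierr = VecSetOptionsPrefix(gc,"coord_");CHKERRQ(ierr);
ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"vector.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
ierr = DMView(da,viewer);CHKERRQ(ierr);
ierr = VecView(x,viewer);CHKERRQ(ierr);    /* solution, written under the "solution_" prefix */
ierr = DMView(cda,viewer);CHKERRQ(ierr);
ierr = VecView(gc,viewer);CHKERRQ(ierr);   /* coordinates, written under the "coord_" prefix */
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);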
-------------- next part -------------- An HTML attachment was scrubbed... URL: From Eric.Chamberland at giref.ulaval.ca Fri May 24 12:30:31 2013 From: Eric.Chamberland at giref.ulaval.ca (Eric Chamberland) Date: Fri, 24 May 2013 13:30:31 -0400 Subject: [petsc-users] PARDISO and petsc In-Reply-To: <519A6D71.9040803@purdue.edu> References: <519A6AAD.7040805@purdue.edu> <874ndxh59q.fsf@mcs.anl.gov> <519A6D71.9040803@purdue.edu> Message-ID: <519FA3B7.4060209@giref.ulaval.ca> Hi, we are interested in beta-testing your interface if you need some beta-testers! thanks, Eric On 05/20/2013 02:37 PM, Michael Povolotskyi wrote: > On 05/20/2013 02:31 PM, Jed Brown wrote: >> Michael Povolotskyi writes: >> >>> Hello everybody, >>> does PETSc support interface to MKL PARDISO linear solver? >>> I did not find PARDISO in the documentation of PETSc, but may be >>> somebody tried this out already? >> Licensing is the main reason I have had no motivation to write an >> interface. We would accept patches if someone would like to write an >> interface. It should not be difficult and we can advise if you get >> stuck. > > > Sounds great. > Yes, I'm going to write an interface to the MPI version of PARDISO. > Will be in touch, > Michael. > From mpovolot at purdue.edu Fri May 24 12:35:59 2013 From: mpovolot at purdue.edu (Michael Povolotskyi) Date: Fri, 24 May 2013 13:35:59 -0400 Subject: [petsc-users] PARDISO and petsc In-Reply-To: <519FA3B7.4060209@giref.ulaval.ca> References: <519A6AAD.7040805@purdue.edu> <874ndxh59q.fsf@mcs.anl.gov> <519A6D71.9040803@purdue.edu> <519FA3B7.4060209@giref.ulaval.ca> Message-ID: <519FA4FF.6060709@purdue.edu> We just started, but I will let the PETSc community know as soon as we have the results. Michael. On 05/24/2013 01:30 PM, Eric Chamberland wrote: > Hi, > > we are interested in beta-testing your interface if you need some > beta-testers! > > thanks, > > Eric > > On 05/20/2013 02:37 PM, Michael Povolotskyi wrote: >> On 05/20/2013 02:31 PM, Jed Brown wrote: >>> Michael Povolotskyi writes: >>> >>>> Hello everybody, >>>> does PETSc support interface to MKL PARDISO linear solver? >>>> I did not find PARDISO in the documentation of PETSc, but may be >>>> somebody tried this out already? >>> Licensing is the main reason I have had no motivation to write an >>> interface. We would accept patches if someone would like to write an >>> interface. It should not be difficult and we can advise if you get >>> stuck. >> >> >> Sounds great. >> Yes, I'm going to write an interface to the MPI version of PARDISO. >> Will be in touch, >> Michael. >> > -- Michael Povolotskyi, PhD Research Assistant Professor Network for Computational Nanotechnology 207 S Martin Jischke Drive Purdue University, DLR, room 441-10 West Lafayette, Indiana 47907 phone: +1-765-494-9396 fax: +1-765-496-6026 From ling.zou at inl.gov Fri May 24 12:53:33 2013 From: ling.zou at inl.gov (Zou (Non-US), Ling) Date: Fri, 24 May 2013 11:53:33 -0600 Subject: [petsc-users] sequential Vec question Message-ID: Hi All, For a 1-D problem with multiple variables (u, v, w), it is natural to put those unknowns in a Vec for SNES to solve, like, u0, v0, w0, u1, v1, w1, ... uN, vN, wN as it would reduce the band width of the Jacobian matrix. When passing this kind of structure to FormFunction, it is however not convenient to access values as you need to jump around to get u(i-1), u(i) and u(i+1). It is more convenient to have the structure in an order like u0, u1, ... uN, v0, v1, ... vN, w0, w1, ... wN Thus I need some way to map b/w them. 
I looked at the manual, it seems that AO is probably the thing I am looking for. However, there are no specific examples for them. I also tried VecScatter, split the original Vec to three Vec(s), however, I am not quite sure if it is the right concept to use for this kind of scenario.

Does anyone have experience on this topic (I bet it is quite common)?

Best,

Ling

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From knepley at gmail.com Fri May 24 13:02:41 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 24 May 2013 13:02:41 -0500
Subject: Re: [petsc-users] sequential Vec question
In-Reply-To:
References:
Message-ID:

On Fri, May 24, 2013 at 12:53 PM, Zou (Non-US), Ling wrote:

> Hi All,
>
> For a 1-D problem with multiple variables (u, v, w), it is natural to put
> those unknowns in a Vec for SNES to solve, like,
>
> u0, v0, w0, u1, v1, w1, ... uN, vN, wN
>
> as it would reduce the band width of the Jacobian matrix.
>
> When passing this kind of structure to FormFunction, it is however not
> convenient to access values as you need to jump around to get u(i-1), u(i)
> and u(i+1). It is more convenient to have the structure in an order like
>
> u0, u1, ... uN, v0, v1, ... vN, w0, w1, ... wN

No, not at all. That will kill your cache coherence, and generally be very bad. You can still index as you want, just use DMDAVecGetArrayDOF(), so that you get

f[i-1][j][0] --> f_u(i-1,j)
f[i][j+1][2] --> f_w(i,j+1)

   Matt

> Thus I need some way to map b/w them. I looked at the manual, it seems
> that AO is probably the thing I am looking for. However, there are no
> specific examples for them. I also tried VecScatter, split the original Vec
> to three Vec(s), however, I am not quite sure if it is the right concept to
> use for this kind of scenario.
>
> Does anyone have experience on this topic (I bet it is quite common)?
>
> Best,
>
> Ling

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mrosso at uci.edu Fri May 24 14:51:12 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Fri, 24 May 2013 12:51:12 -0700
Subject: Re: [petsc-users] Solving Poisson equation with multigrid
In-Reply-To: <5196E8DC.1010602@uci.edu>
References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu>
Message-ID: <519FC4B0.9080702@uci.edu>

Hi Jed,

I followed your suggestion by using:

-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1

This works perfectly if I have a non-singular matrix. When instead I use periodic conditions for my system ( I set the nullspace removal correctly ), I receive an error saying a zero pivot is detected in the LU factorization.
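(For reference, the usual way to declare a constant null space with petsc-3.3 names is a short block like the following -- a minimal sketch, not necessarily the exact code used here:

MatNullSpace nsp;
ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nsp);CHKERRQ(ierr);  /* has_cnst = PETSC_TRUE: remove the constant vector */
ierr = KSPSetNullSpace(ksp,nsp);CHKERRQ(ierr);
ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);
)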
So, after some research, I found on the mailing list a fix:

-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 -mg_coarse_pc_factor_shift_nonzero

Still I am receiving the following error

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Detected zero pivot in LU factorization: see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
[0]PETSC ERROR: Zero pivot row 280 value 6.5908e-17 tolerance 2.22045e-14!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./hit on a named nid09458 by Unknown Fri May 24 14:40:48 2013
[0]PETSC ERROR: Libraries linked from
[0]PETSC ERROR: Configure run at
[0]PETSC ERROR: Configure options
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatPivotCheck_none() line 583 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
[0]PETSC ERROR: MatPivotCheck() line 602 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
[0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in src/mat/impls/aij/seq/aijfact.c
[0]PETSC ERROR: MatLUFactorNumeric() line 2803 in src/mat/interface/matrix.c
[0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c
[0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c
[0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c

What could the reason be?
Thank you,

Michele

On 05/17/2013 07:35 PM, Michele Rosso wrote:
> Thank you very much. I will try and let you know.
>
> Michele
>
> On 05/17/2013 07:01 PM, Jed Brown wrote:
>> Michele Rosso writes:
>>
>>> I noticed that the problem appears even if I use CG with the default
>>> preconditioner: commenting KSPSetDM() solves the problem.
>> Okay, this issue can't show up if you use SNES, but it's a consequence
>> of making geometric multigrid work with a pure KSP interface.
You can >> either use KSPSetComputeOperators() to put your assembly in a function >> (which will also be called on coarse levels if you use -pc_type mg >> without Galerkin coarse operators) or you can can provide the Jacobian >> using KSPSetOperators() as usual, but also call KSPSetDMActive() so that >> the DM is not used for computing/updating the Jacobian. >> >> The logic is cleaner in petsc-3.4 and I think it just does the right >> thing in your case. >> >>> So basically without a proper grid (it seems no grid with an even >>> numbers of nodes qualifies) and with my own system matrix, I cannot use >>> any type of multigrid >>> pre-conditioner? >> You can use all the AMG methods without setting a DM. >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 24 14:55:57 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 24 May 2013 14:55:57 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <519FC4B0.9080702@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> Message-ID: <8738tcp2ya.fsf@mcs.anl.gov> Michele Rosso writes: > Hi Jed, > > I followed your suggestion by using: > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > > This works perfectly if I have a non-singular matrix. When instead I use > periodic conditions for my system ( I set the nullspace removal > correctly ), > I receive an error saying a zero pivot is detected in the LU > factorization. So, after some research, I found in the mailinglist a fix : > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > -mg_coarse_pc_factor_shift_nonzero It'll need to be -mg_coarse_sub_pc_factor_shift_nonzero With petsc-3.4 (which you should upgrade to), use -mg_coarse_sub_pc_factor_shift_type NONZERO The reason you need this "sub" prefix is that the code always restricts using block Jacobi (usually localized so that all the entries are in one block), before applying the direct coarse solver. > Still I am receiving the following error > > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Detected zero pivot in LU factorization: > see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! > [0]PETSC ERROR: Zero pivot row 280 value 6.5908e-17 tolerance 2.22045e-14! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 > 11:26:24 CDT 2012 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./hit on a named nid09458 by Unknown Fri May 24 > 14:40:48 2013 > [0]PETSC ERROR: Libraries linked from > [0]PETSC ERROR: Configure run at > [0]PETSC ERROR: Configure options > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: MatPivotCheck_none() line 583 in > src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h > [0]PETSC ERROR: MatPivotCheck() line 602 in > src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h > [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in > src/mat/impls/aij/seq/aijfact.c > [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in src/mat/interface/matrix.c > [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c > [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in > src/ksp/pc/impls/bjacobi/bjacobi.c > [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in > src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c > [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c > > What could the reason be? > Thank you, > > Michele > > > > On 05/17/2013 07:35 PM, Michele Rosso wrote: >> Thank you very much. I will try and let you know. >> >> Michele >> >> On 05/17/2013 07:01 PM, Jed Brown wrote: >>> Michele Rosso writes: >>> >>>> I noticed that the problem appears even if I use CG with the default >>>> preconditioner: commenting KSPSetDM() solves the problem. >>> Okay, this issue can't show up if you use SNES, but it's a consequence >>> of making geometric multigrid work with a pure KSP interface. You can >>> either use KSPSetComputeOperators() to put your assembly in a function >>> (which will also be called on coarse levels if you use -pc_type mg >>> without Galerkin coarse operators) or you can can provide the Jacobian >>> using KSPSetOperators() as usual, but also call KSPSetDMActive() so that >>> the DM is not used for computing/updating the Jacobian. >>> >>> The logic is cleaner in petsc-3.4 and I think it just does the right >>> thing in your case. >>> >>>> So basically without a proper grid (it seems no grid with an even >>>> numbers of nodes qualifies) and with my own system matrix, I cannot use >>>> any type of multigrid >>>> pre-conditioner? >>> You can use all the AMG methods without setting a DM. 
From knepley at gmail.com  Fri May 24 15:04:30 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 24 May 2013 15:04:30 -0500
Subject: [petsc-users] Solving Poisson equation with multigrid
In-Reply-To: <8738tcp2ya.fsf@mcs.anl.gov>
References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov>
	<51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov>
	<5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov>
	<5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov>
	<5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov>
	<5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu>
	<87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu>
	<87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu>
	<87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu>
	<519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov>
Message-ID: 

On Fri, May 24, 2013 at 2:55 PM, Jed Brown wrote:

> Michele Rosso writes:
>
> > Hi Jed,
> >
> > I followed your suggestion by using:
> >
> > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
> >
> > This works perfectly if I have a non-singular matrix. When instead I use
> > periodic conditions for my system (I set the nullspace removal
> > correctly), I receive an error saying a zero pivot is detected in the
> > LU factorization. So, after some research, I found a fix in the mailing
> > list:
> >
> > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
> > -mg_coarse_pc_factor_shift_nonzero
>
> It'll need to be -mg_coarse_sub_pc_factor_shift_nonzero
>
> With petsc-3.4 (which you should upgrade to), use
> -mg_coarse_sub_pc_factor_shift_type NONZERO
>
> The reason you need this "sub" prefix is that the code always restricts
> using block Jacobi (usually localized so that all the entries are in one
> block), before applying the direct coarse solver.


I think this is less elegant than

  -mg_coarse_pc_type svd

   Matt


> > Still I am receiving the following error
> >
> > [same zero-pivot trace as quoted above]
> >
> > What could the reason be?
> > Thank you,
> >
> > Michele
> >
> > [the 05/17 exchange quoted in full above]

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
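The options under discussion can also be set from code; a small sketch,
assuming the pre-3.7 PetscOptionsSetValue() signature and a KSP named ksp
(this mirrors the command line, it does not replace it):

    /* Equivalent of the command line above; the values must be in the
       options database before KSPSetFromOptions() reads them. */
    ierr = PetscOptionsSetValue("-pc_type", "gamg");CHKERRQ(ierr);
    ierr = PetscOptionsSetValue("-pc_mg_cycle_type", "v");CHKERRQ(ierr);
    ierr = PetscOptionsSetValue("-pc_gamg_agg_nsmooths", "1");CHKERRQ(ierr);
    /* Shift the coarse-level LU inside its block-Jacobi wrapper ... */
    ierr = PetscOptionsSetValue("-mg_coarse_sub_pc_factor_shift_type", "NONZERO");CHKERRQ(ierr);
    /* ... or take the SVD coarse solve suggested above instead:
       PetscOptionsSetValue("-mg_coarse_pc_type", "svd"); */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);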
From mrosso at uci.edu  Fri May 24 16:35:12 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Fri, 24 May 2013 14:35:12 -0700
Subject: [petsc-users] Solving Poisson equation with multigrid
In-Reply-To: 
References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov>
	<51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov>
	<5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov>
	<5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov>
	<5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov>
	<5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu>
	<87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu>
	<87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu>
	<87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu>
	<519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov>
Message-ID: <519FDD10.3060900@uci.edu>

I tried

-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
-mg_coarse_sub_pc_factor_shift_nonzero

but I still get

[0]PETSC ERROR: Detected zero pivot in LU factorization:
see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
[0]PETSC ERROR: Zero pivot row 280 value 6.58999e-17 tolerance 2.22045e-14!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:08:33 2013
[0]PETSC ERROR: Libraries linked from
[0]PETSC ERROR: Configure run at
[0]PETSC ERROR: Configure options
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatPivotCheck_none() line 583 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
[0]PETSC ERROR: MatPivotCheck() line 602 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
[0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in src/mat/impls/aij/seq/aijfact.c
[0]PETSC ERROR: MatLUFactorNumeric() line 2803 in src/mat/interface/matrix.c
[0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c
[0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c
[0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c
[0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c

If instead I use

-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
-mg_coarse_pc_type svd

as Matthew suggested, I am told that there is an invalid argument.

Michele

On 05/24/2013 01:04 PM, Matthew Knepley wrote:
> [Matthew's and Jed's message of 05/24 15:04 quoted in full, as above]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com  Fri May 24 16:37:24 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Fri, 24 May 2013 16:37:24 -0500
Subject: [petsc-users] Solving Poisson equation with multigrid
In-Reply-To: <519FDD10.3060900@uci.edu>
References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov>
	<51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov>
	<5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov>
	<5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov>
	<5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov>
	<5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu>
	<87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu>
	<87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu>
	<87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu>
	<519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov>
	<519FDD10.3060900@uci.edu>
Message-ID: 

On Fri, May 24, 2013 at 4:35 PM, Michele Rosso wrote:

> I tried
>
> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
> -mg_coarse_sub_pc_factor_shift_nonzero
>
> but I still get
>
> [same zero-pivot trace as in the previous message]
>
> If instead I use
>
> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
> -mg_coarse_pc_type svd
>
> as Matthew suggested, I am told that there is an invalid argument.
>

1) When you send these in, we need to see -ksp_view, so we know what is
being used

2) This is not enough information above. I use this all the time, or I
would not have suggested it

   Matt

> Michele
>
> On 05/24/2013 01:04 PM, Matthew Knepley wrote:
> > [the 15:04 message and the 05/17 exchange quoted again, as above]
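Since the zero pivot comes from the singular periodic operator reaching
the coarse direct solve, it is worth recording how the constant null
space is attached in this era of the API; a sketch only (ksp is assumed
to exist, and KSPSetNullSpace() was later superseded by
MatSetNullSpace()):

    /* Declare that constants lie in the null space of the periodic
       Poisson operator so the Krylov solver can project them out. */
    MatNullSpace nullsp;
    ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, PETSC_NULL, &nullsp);CHKERRQ(ierr);
    ierr = KSPSetNullSpace(ksp, nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);

Note this keeps the Krylov iterates out of the null space, but the
coarse-level LU can still see a singular matrix, which is what the shift
(or the SVD coarse solve) addresses.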
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mrosso at uci.edu  Fri May 24 16:46:39 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Fri, 24 May 2013 14:46:39 -0700
Subject: [petsc-users] Solving Poisson equation with multigrid
In-Reply-To: 
References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov>
	<51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov>
	<5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov>
	<5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov>
	<5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov>
	<5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu>
	<87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu>
	<87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu>
	<87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu>
	<519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov>
	<519FDD10.3060900@uci.edu>
Message-ID: <519FDFBF.3080405@uci.edu>

In both cases I used -ksp_view and -options_left.

For case 1 (-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
-mg_coarse_sub_pc_factor_shift_nonzero) I posted the only output I had.
For case 2 (-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
-mg_coarse_pc_type svd) the output was too long to fit into an e-mail.
Please find it attached.

Michele

On 05/24/2013 02:37 PM, Matthew Knepley wrote:
> [the 05/24 16:37 message quoted in full, as above]
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
options used : -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 -mg_coarse_pc_type svd
[0]PCSetData_AGG bs=1 MM=131072
[The rest of the attachment is error output interleaved from 128 MPI
ranks; every rank reports the same failure, reproduced once below.]

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Invalid argument!
[0]PETSC ERROR: Comm must be of size 1!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013
[0]PETSC ERROR: Libraries linked from
[0]PETSC ERROR: Configure run at
[0]PETSC ERROR: Configure options
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c

[The archived attachment breaks off at this point.]
[63]PETSC ERROR: ------------------------------------------------------------------------ --------------------- Error Message ------------------------------------ [58]PETSC ERROR: See docs/faq.html for hints about trouble shooting. ------------------------------------------------------------------------ [35]PETSC ERROR: [38]PETSC ERROR: [34------------ [124]PETSC ERROR: [122]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Comm must be of size 1! [111]PETSC ERROR: [119]PETSC ERROR: [102]PETSC ERROR: ------------------------------------------------------------------------ [106]PETSC ERROR: Invalid argument! Invalid argument! Invalid argument! [119]PETSC ERROR: [120]PETSC ERROR: Invalid argument! Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 See docs/faq.html for hints about trouble shooting. ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [126]PETSC ERROR: [103]PETSC ERROR: [104]PETSC ERROR: [102]PETSC ERROR: See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. See docs/index.html for manual pages. Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Invalid argument! [99]PETSC ERROR: [105]PETSC ERROR: See docs/faq.html for hints about trouble shooting. See docs/faq.html for hints about trouble shooting. [115]PETSC ERROR: [102]PETSC ERROR: [110]PETSC ERROR: [109]PETSC ERROR: Comm must be of size 1! ------------------------------------------------------------------------ [97]PETSC ERROR: [98]PETSC ERROR: Comm must be of size 1! [103]PETSC ERROR: See docs/index.html for manual pages. [114]PETSC ERROR: Invalid argument! ------------------------------------------------------------------------ Invalid argument! [101]PETSC ERROR: [105]PETSC ERROR: See docs/faq.html for hints about trouble shooting. See docs/faq.html for hints about trouble shooting. [102]PETSC ERROR: ------------------------------------------------------------------------ [100]PETSC ERROR: See docs/index.html for manual pages. Comm must be of size 1! ./hit on a named nid21861 by Unknown Fri May 24 16:12:20 2013 [98]PETSC ERROR: Comm must be of size 1! [97]PETSC ERROR: [120]PETSC ERROR: [118]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [115]PETSC ERROR------------------------------------ [78]PETSC ERROR: Comm must be of size 1! Comm must be of size 1! [73]PETSC ERROR: [82]PETSC ERROR: [74]PETSC ERROR: ------------------------------------------------------------------------ [89]PETSC ERROR: ------------------------------------------------------------------------ [66]PETSC ERROR: Invalid argument! Invalid argument! Comm must be of size 1! Invalid argument! Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [77]PETSC ERROR: [92]PETSC ERROR: [87]PETSC ERROR: [67]PETSC ERROR: Invalid argument! [95]PETSC ERROR: ------------------------------------------------------------------------ [94]PETSC ERROR: Invalid argument! See docs/index.html for manual pages. [73]PETSC ERROR: Invalid argument! ------------------------------------------------------------------------ [65]PETSC ERROR: [93]PETSC ERROR: [81]PETSC ERROR: ------------------------------------------------------------------------ Invalid argument! [65]PETSC ERROR: Invalid argument! [72]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Comm must be of size 1! 
./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 ------------------------------------------------------------------------ [84]PETSC ERROR: Invalid argument! Comm must be of size 1! [92]PETSC ERROR: [65]PETSC ERROR: [80]PETSC ERROR: Libraries linked from [68]PETSC ERROR: Invalid argument! [65]PETSC ERROR: [83]PETSC ERROR: Configure run at ------------------------------------------------------------------------ Invalid argument! [93]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Comm must be of size 1! [65]PETSC ERROR: [71]PETSC ERROR: [74]PETSC ERROR: Comm must be of size 1! See docs/changes/index.html for recent updates. [67]PETSC ERROR: [66]PETSC ERROR: See docs/changes/index.html for recent updates. ------------------------------------------------------------------------ Comm must be of size 1! [71]PETSC ERROR: ]PETSC ERROR: [57]PETSC ERROR: [60]PETSC ERROR: [47]PETSC ERROR: [40]PETSC ERROR: Invalid argument! Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Comm must be of size 1! [39]PETSC ERROR: Invalid argument! [51]PETSC ERROR: See docs/faq.html for hints about trouble shooting. Comm must be of size 1! Comm must be of size 1! Comm must be of size 1! Invalid argument! [39]PETSC ERROR: [35]PETSC ERROR: [59]PETSC ERROR: [43]PETSC ERROR: [33]PETSC ERROR: Invalid argument! [52]PETSC ERROR: [41]PETSC ERROR: Comm must be of size 1! See docs/index.html for manual pages. ------------------------------------------------------------------------ [56]PETSC ERROR: ------------------------------------------------------------------------ Comm must be of size 1! See docs/changes/index.html for recent updates. [59]PETSC ERROR: Comm must be of size 1! Comm must be of size 1! ------------------------------------------------------------------------ Comm must be of size 1! [51]PETSC ERROR: [33]PETSC ERROR: See docs/index.html for manual pages. [50]PETSC ERROR: [36]PETSC ERROR: ------------------------------------------------------------------------ [45]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [62]PETSC ERROR: Invalid argument! [58]PETSC ERROR: Invalid argument! [54]PETSC ERROR: [43]PETSC ERROR: [38]PETSC ERROR: ------------------------------------------------------------------------ [39]PETSC ERROR: [36]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [55]PETSC ERROR: ------------------------------------------------------------------------ Invalid argument! Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [42]PETSC ERROR: [61]PETSC ERROR: [33]PETSC ERROR: [60]PETSC ERROR: [62]PETSC ERROR: [51]PETSC ERROR: [44]PETSC ERROR: [39]PETSC ERROR: See docs/index.html for manual pages. [46]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT: Comm must be of size 1! [119]PETSC ERROR: [121]PETSC ERROR: [126]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [102]PETSC ERROR: [113]PETSC ERROR: Comm must be of size 1! Libraries linked from [122]PETSC ERROR: [120]PETSC ERROR: [116]PETSC ERROR: ------------------------------------------------------------------------ [104]PETSC ERROR: ------------------------------------------------------------------------ Comm must be of size 1! Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 See docs/changes/index.html for recent updates. 
------------------------------------------------------------------------ ------------------------------------------------------------------------ [102]PETSC ERROR: See docs/changes/index.html for recent updates. [118]PETSC ERROR: See docs/changes/index.html for recent updates. [122]PETSC ERROR: [109]PETSC ERROR: [97]PETSC ERROR: [113]PETSC ERROR: [119]PETSC ERROR: [105]PETSC ERROR: [101]PETSC ERROR: ------------------------------------------------------------------------ ./hit on a named nid21861 by Unknown Fri May 24 16:12:20 2013 [99]PETSC ERROR: [111]PETSC ERROR: [97]PETSC ERROR: ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. See docs/index.html for manual pages. [104]PETSC ERROR: [100]PETSC ERROR: [68]PETSC ERROR: [91]PETSC ERROR: Comm must be of size 1! [76]PETSC ERROR: [70]PETSC ERROR: Comm must be of size 1! [86]PETSC ERROR: Comm must be of size 1! [76]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [69]PETSC ERROR: [92]PETSC ERROR: Comm must be of size 1! [88]PETSC ERROR: Configure options Comm must be of size 1! [90]PETSC ERROR: ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [89]PETSC ERROR: Comm must be of size 1! [69]PETSC ERROR: [66]PETSC ERROR: ------------------------------------------------------------------------ [95]PETSC ERROR: ------------------------------------------------------------------------ [82]PETSC ERROR: ------------------------------------------------------------------------ [76]PETSC ERROR: See docs/faq.html for hints about trouble shooting. ------------------------------------------------------------------------ ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [67]PETSC ERROR: ------------------------------------------------------------------------ [65]PETSC ERROR: [68]PETSC ERROR: ------------------------------------------------------------------------ [87]PETSC ERROR: [78]PETSC ERROR: [69]PETSC ERROR: See docs/index.html for manual pages. [91]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [66]PETSC ERROR: [84]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [80]PETSC ERROR: Comm must be of size 1! [81]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [68]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [83]PETSC ERROR: Comm must be of size 1! [72]PETSC ERROR: ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11: 2012 ------------------------------------------------------------------------ Invalid argument! [63]PETSC ERROR: Comm must be of size 1! Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 ------------------------------------------------------------------------ [36]PETSC ERROR: [35]PETSC ERROR: [47]PETSC ERROR: ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 [37]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [56]PETSC ERROR: [50]PETSC ERROR: See docs/changes/index.html for recent updates. 
[112]PETSC ERROR: ------------------------------------------------------------------------ [103]PETSC ERROR: [126]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Configure run at [118]PETSC ERROR: Libraries linked from [124]PETSC ERROR: See docs/index.html for manual pages. Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [102]PETSC ERROR: [106]PETSC ERROR: [97]PETSC ERROR: See docs/changes/index.html for recent updates. Configure options [98]PETSC ERROR: Configure run at Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 ./hit on a named nid21861 by Unknown Fri May 24 16:12:20 2013 [108]PETSC ERROR: [105]PETSC ERROR: [97]PETSC ERROR: [126]PETSC ERROR: [124]PETSC ERROR: See docs/changes/index.html for recent updates. Configure options See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. ------------------------------------------------------------------------ [97]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [119]PETSC ERROR: [102]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [99]PETSC ERROR: Comm must be of size 1! ------------------------------------------------------------------------ ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. ------------------------------------------------------------------------ [105]PETSC ERROR: [107]PETSC ERROR: [117]PETSC ERROR: [98]PETSC ERROR: ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 See docs/faq.html for hints about trouble shooting. [118]PETSC ERROR: [111]PETSC ERROR: See docs/changes/index.html for recent updates. [125]PETSC ERROR: [122]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [123]PETSC ERROR: ------------------------------------26:24 CDT 2012 [70]PETSC ERROR: [78]PETSC ERROR: See docs/changes/index.html for recent updates. [93]PETSC ERROR: [74]PETSC ERROR: ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [67]PETSC ERROR: ------------------------------------------------------------------------ [69]PETSC ERROR: [89]PETSC ERROR: ------------------------------------------------------------------------ [68]PETSC ERROR: See docs/changes/index.html for recent updates. [95]PETSC ERROR: ------------------------------------------------------------------------ [92]PETSC ERROR: ------------------------------------------------------------------------ [76]PETSC ERROR: Comm must be of size 1! 
------------------------------------------------------------------------ ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [94]PETSC ERROR: [85]PETSC ERROR: [80]PETSC ERROR: [74]PETSC ERROR: [87]PETSC ERROR: [83]PETSC ERROR: [89]PETSC ERROR: [65]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [69]PETSC ERROR: [88]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [70]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Comm must be of size 1! [94]PETSC ERROR: [95]PETSC ERROR: [82]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. [77]PETSC ERROR: [64]PETSC ERROR: [34]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Comm must be of size 1! Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Comm must be of size 1! Invalid argument! Comm must be of size 1! [48]PETSC ERROR: [63]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [39]PETSC ERROR: Comm must be of size 1! [49]PETSC ERROR: See docs/changes/index.html for recent updates. Comm must be of size 1! [37]PETSC ERROR: ------------------------------------------------------------------------ [41]PETSC ERROR: Comm must be of size 1! [40]PETSC ERROR: Libraries linked from [38]PETSC ERROR: Invalid argument! [52]PETSC ERROR: [35]PETSC ERROR: [42]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ [61]PETSC ERROR: ------------------------------------ ------------------------------------------------------------------------ Comm must be of size 1! See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. See docs/index.html for manual pages. [121]PETSC ERROR: [97]PETSC ERROR: Comm must be of size 1! [120]PETSC ERROR: [126]PETSC ERROR: [103]PETSC ERROR: [110]PETSC ERROR: ------------------------------------------------------------------------ [111]PETSC ERROR: [100]PETSC ERROR: ------------------------------------------------------------------------ See docs/changes/index.html for recent updates. [108]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [101]PETSC ERROR: [121]PETSC ERROR: [113]PETSC ERROR: [102]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [109]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [106]PETSC ERROR: See docs/faq.html for hints about trouble shooting. Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 See docs/faq.html for hints about trouble shooting. See docs/index.html for manual pages. Comm must be of size 1! 
------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [118]PETSC ERROR: [119]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [116]PETSC ERROR: ------------------------------------------------------------------------ [97]PETSC ERROR: [112]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [100]PETSC ERROR: [104]PETSC ERROR: [101]PETSC ERROR: ------------------------------------------------------------------------ [113]PETSC ERROR: [108]PETSC ERROR: See docs/index.html for manual pages. [102]PETSC ERROR: [106]PETSC ERROR: [123]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c [114]PETSC[47]PETSC ERROR: [60]PETSC ERROR: ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. [39]PETSC ERROR: ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [38]PETSC ERROR: [49]PETSC ERROR: ------------------------------------------------------------------------ [43]PETSC ERROR: See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. [63]PETSC ERROR: See docs/index.html for manual pages. See docs/index.html for manual pages. [55]PETSC ERROR: See docs/changes/index.html for recent updates. Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [62]PETSC ERROR: [38]PETSC ERROR: ------------------------------------------------------------------------ [45]PETSC ERROR: [52]PETSC ERROR: [47]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [37]PETSC ERROR: [35]PETSC ERROR: [36]PETSC ERROR: ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 See docs/faq.html for hints about trouble shooting. See docs/faq.html for hints about trouble shooting. [60]PETSC ERROR: [48]PETSC ERROR: [40]PETSC ERROR: [46]PETSC ERROR: [53]PETSC ERROR: [52]PETSC ERROR: [59]PETSC ERROR: [44]PETSC ERROR: [54]PETSC ERROR: [36]PETSC ERROR: Configure run at [37]PETSC ERROR: ------------------------------------------------------------------------ See docs/index.html for manual pages. [38]PETSC ERROR: Comm must be of size 1! ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 [34]PETSC ERROR: ------------------------------------------------------------------------ [49]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 See docs/changes/index.html for recent updates. [38]PETSC ERROR: [52]PETSC ERROR: See docs/changes/index.html for recent updates. Comm must be of size 1! See docs/changes/index.html for recent updates. Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [3]PETSC ERROR: [2]PETSC ERROR: [23]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. See docs/faq.html for hints about trouble shooting. 
[2]PETSC ERROR: [21]PETSC ERROR: [30]PETSC ERROR: [20]PETSC ERROR: [0]PETSC ERROR: [28]PETSC ERROR: See docs/faq.html for hints about trouble shooting. MatSetType() line 74 in src/mat/interface/matreg.c [13]PETSC ERROR: [15]PETSC ERROR: [8]PETSC ERROR: ------------------------------------------------------------------------ [22]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [3]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 Configure run at [6]PETSC ERROR: [29]PETSC ERROR: ------------------------------------------------------------------------ See docs/index.html for manual pages. See docs/index.html for manual pages. [28]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [29]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 See docs/index.html for manual pages. Configure run at [26]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 See docs/index.html for manual pages. [30]PETSC ERROR: See docs/index.html for manual pages. [21]PETSC ERROR: [22]PETSC ERROR: [23]PETSC ERROR: [18]PETSC ERROR: [11]PETSC ERROR: [4]PETSC ERROR: [31]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ------------------------------------------------------------------------ ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Configure options Configure options Configure options [0]PETSC ERROR: [12]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ MatConvert() line 3747 in src/mat/interface/matrix.c [22]PETSC ERROR: [19]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [2]PETSC ERROR: [0]PETSC ERROR: See docs/index.html for manual pages. MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c ------------------------------------------------------------------------ ------------------------------------------------------------------------ [13]PETSC ERROR: [15]PETSC ERROR: [0]PETSC ERROR: [27]PETSC ERROR: [6]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [25]PETSC ERROR: [4]PETSC ERROR: See docs/index.html for manual pages. [28]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c Libraries linked from ------------------------------------------------------------------------ [18]PETSC ERROR: [24]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [20]PETSC ERROR: [21]PETSC ERROR: [23]PETSC ERROR: See docs/changes/index.html for recent updates. ------------------------------------------------------------------------ [0]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [15]PETSC ERROR: [3]PETSC ERROR: [9]PETSC ERROR: [26]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 See docs/index.html for manual pages. [4]PETSC ERROR: Libraries linked from [8]PETSC ERROR: See docs/changes/index.html for recent updates. ------------------------------------------------------------------------ [9]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c [11]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [7]PETSC ERROR: [17]PETSC ERROR: [29]PETSC ERROR: See docs/index.html for manual pages. See docs/index.html for manual pages. 
[0]PETSC ERROR: [13]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [26]PETSC ERROR: Configure run at [10]PETSC ERROR: See docs/changes/index.html for recent updates. ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [30]PETSC ERROR: ------------------------------------------------------------------------ [22]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [16]PETSC ERROR: [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [23]PETSC ERROR: [24]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 PCSetUp() line 832 in src/ksp/pc/interface/precon.c Configure options [4]PETSC ERROR: [16]PETSC ERROR: Configure run at Libraries linked from PCSetUp() line 832 in src/ksp/pc/interface/precon.c [22]PETSC ERROR: [23]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [31]PETSC ERROR: [2]PETSC ERROR: ------------------------------------------------------------------------ [4]PETSC ERROR: [0]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [25]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c ------------------------------------------------------------------------ See docs/index.html for manual pages. [4]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [8]PETSC ERROR: [21]PETSC ERROR: [0]PETSC ERROR: See docs/index.html for manual pages. Libraries linked from [10]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c ------------------------------------------------------------------------ [3]PETSC ERROR: [17]PETSC ERROR: See docs/changes/index.html for recent updates. See docs/faq.html for hints about trouble shooting. [19]PETSC ERROR: [4]PETSC ERROR: Libraries linked from ------------------------------------------------------------------------ Configure options ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 Libraries linked from [2]PETSC ERROR: Configure run at [9]PETSC ERROR: [28]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [24]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [26]PETSC ERROR: [12]PETSC ERROR: [21]PETSC ERROR: [23]PETSC ERROR: [0]PETSC ERROR: Configure run at Libraries linked from [21]PETSC ERROR: [18]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c Libraries linked from [20]PETSC ERROR: [11]PETSC ERROR: See docs/index.html for manual pages. [31]PETSC ERROR: [30]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [8]PETSC ERROR: [7]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [29]PETSC ERROR: Configure run at ------------------------------------------------------------------------ ------------------------------------------------------------------------ [2]PETSC ERROR: Libraries linked from PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [19]PETSC ERROR: Libraries linked from [17]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c [0]PETSC ERROR: Configure run at PCSetUp() line 832 in src/ksp/pc/interface/precon.c [9]PETSC ERROR: Libraries linked from [3]PETSC ERROR: [0]PETSC ERROR: ------------------------------------------------------------------------ Libraries linked from See docs/changes/index.html for recent updates. 
[27]PETSC ERROR: [4]PETSC ERROR: [29]PETSC ERROR: [26]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c MatSetType() line 74 in src/mat/interface/matreg.c [18]PETSC ERROR: [8]PETSC ERROR: ------------------------------------------------------------------------ Configure run at ------------------------------------------------------------------------ ------------------------------------------------------------------------ [16]PETSC ERROR: [6]PETSC ERROR: [7]PETSC ERROR: [22]PETSC ERROR: [9]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [18]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [26]PETSC ERROR: [20]PETSC ERROR: See docs/index.html for manual pages. [14]PETSC ERROR: Libraries linked from [13]PETSC ERROR: Configure options Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [28]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 Libraries linked from ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [12]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [6]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [30]PETSC ERROR: Configure run at [20]PETSC ERROR: Libraries linked from [23]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c Libraries linked from Configure options ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 Configure run at [16]PETSC ERROR: [25]PETSC ERROR: [6]PETSC ERROR: [26]PETSC ERROR: [14]PETSC ERROR: Configure options [22]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 [2]PETSC ERROR: Configure run at [8]PETSC ERROR: Configure run at Configure run at See docs/faq.html for hints about trouble shooting. Configure run at MatSetType() line 74 in src/mat/interface/matreg.c [13]PETSC ERROR: See docs/changes/index.html for recent updates. ------------------------------------------------------------------------ See docs/index.html for manual pages. PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 Configure run at [22]PETSC ERROR: [11]PETSC ERROR: [9]PETSC ERROR: [15]PETSC ERROR: Configure options Libraries linked from [10]PETSC ERROR: [25]PETSC ERROR: [7]PETSC ERROR: [24]PETSC ERROR: [31]PETSC ERROR: [6]PETSC ERROR: Libraries linked from [23]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [19]PETSC ERROR: [22]PETSC ERROR: Configure options [16]PETSC ERROR: Configure options Configure options Libraries linked from MatConvert() line 3747 in src/mat/interface/matrix.c Configure options KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [18]PETSC ERROR: [8]PETSC ERROR: [20]PETSC ERROR: [22]PETSC ERROR: See docs/index.html for manual pages. 
[6]PETSC ERROR: [28]PETSC ERROR: [7]PETSC ERROR: [29]PETSC ERROR: Configure run at [27]PETSC ERROR: ------------------------------------------------------------------------ [11]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [9]PETSC ERROR: [17]PETSC ERROR: [24]PETSC ERROR: Libraries linked from [6]PETSC ERROR: Libraries linked from [30]PETSC ERROR: Configure run at ------------------------------------------------------------------------ [23]PETSC ERROR: Configure options ------------------------------------------------------------------------ [27]PETSC ERROR: ------------------------------------------------------------------------ PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [2]PETSC ERROR: Configure run at [3]PETSC ERROR: [28]PETSC ERROR: [25]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [30]PETSC ERROR: [12]PETSC ERROR: ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. [58]PETSC ERROR: [50]PETSC ERROR: [46]PETSC ERROR: ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 [48]PETSC ERROR: See docs/changes/index.html for recent updates. [60]PETSC ERROR: See docs/changes/index.html for recent updates. [51]PETSC ERROR: See docs/index.html for manual pages. [54]PETSC ERROR: ------------------------------------------------------------------------ [50]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [57]PETSC ERROR: [61]PETSC ERROR: [56]PETSC ERROR: Libraries linked from Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 See docs/changes/index.html for recent updates. [53]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [49]PETSC ERROR: [60]PETSC ERROR: [42]PETSC ERROR: ------------------------------------------------------------------------ [34]PETSC ERROR: [35]PETSC ERROR: [52]PETSC ERROR: Libraries linked from ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 [38]PETSC ERROR: See docs/index.html for manual pages. ------------------------------------------------------------------------ ------------------------------------------------------------------------ See docs/index.html for manual pages. [43]PETSC ERROR: [35]PETSC ERROR: Configure run at Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [53]PETSC ERROR: See docs/changes/index.html for recent updates. [46]PETSC ERROR: [37]PETSC ERROR: ------------------------------------------------------------------------ [54]PETSC ERROR: ------------------------------------------------------------------------ [44]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [61]PETSC ERROR: [55]PETSC ERROR: [47]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 ------------------------------------------------------------------------ [38]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [39]PETSC ERROR: [52]PETSC ERROR: [62]PETSC ERROR: [40]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [49]PETSC ERROR: See docs/changes/index.html for recent updates. 
[34]PETSC ERROR: Configure options ------------------------------------------------------------------------ [51]PETSC ERROR: Configure run at Libraries linked from Configure options [34]PETSC ERROR: [56]PETSC ERROR: [57]PETSC ERROR: See docs/index.html for manual pages. See docs/index.html for manual pages. Configure options [53]PETSC ERROR: ------------------------------------------------------------------------ Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [52]PETSC ERROR: [62]PETSC ERROR: [63]PETSC ERROR: [60]PETSC ERROR: [34]PETSC ERROR: [45]PETSC ERROR: [59]PETSC ERROR: See docs/faq.html for hints about trouble shooting. See docs/faq.html for hints about trouble shooting. ------------------------------------------------------------------------ See docs/changes/index.html for recent updates. [60]PETSC ERROR: See docs/changes/index.html for recent updates. [50]PETSC ERROR: See docs/faq.html for hints about trouble shooting. See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. [38]PETSC ERROR: [42]PETSC ERROR: [43]PETSC ERROR: ------------------------------------------------------------------------ See docs/changes/index.html for recent updates. [55]PETSC ERROR: See docs/changes/index.html for recent updates. [56]PETSC ERROR: [57]PETSC ERROR: See docs/changes/index.html for recent updates. See docs/faq.html for hints about trouble shooting. See docs/index.html for manual pages. [51]PETSC ERROR: [61]PETSC ERROR: [36]PETSC ERROR: ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 [54]PETSC ERROR: ------------------------------------------------------------------------ [63]PETSC ERROR: [62]PETSC ERROR: [34]PETSC ERROR: See docs/faq.html for hints about trouble shooting. See docs/index.html for manual pages. [50]PETSC ERROR: ------------------------------------------------------------------------ [37]PETSC ERROR: See docs/changes/index.html for recent updates. [48]PETSC ERROR: [35]PETSC ERROR: [39]PETSC ERROR: See docs/index.html for manual pages. See docs/faq.html for hints about trouble shooting. See docs/faq.html for hints about trouble shooting. MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c Configure run at ------------------------------------------------------------------------ ------------------------------------------------------------------------ [32]PETSC ERROR: [35]PETSC ERROR: [41]PETSC ERROR: [42]PETSC ERROR: [61]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [58]PETSC ERROR: [51]PETSC ERROR: ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. [52]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Libraries linked from ------------------------------------------------------------------------ [46]PETSC ERROR: [39]PETSC ERROR: [38]PETSC ERROR: ------------------------------------------------------------------------ [53]PETSC ERROR: [43]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense. ERROR: [103]PETSC ERROR: Libraries linked from See docs/changes/index.html for recent updates. [115]PETSC ERROR: See docs/changes/index.html for recent updates. 
------------------------------------------------------------------------ [109]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c [120]PETSC ERROR: ------------------------------------------------------------------------ [98]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [99]PETSC ERROR: See docs/index.html for manual pages. [100]PETSC ERROR: [97]PETSC ERROR: ./hit on a named nid21861 by Unknown Fri May 24 16:12:20 2013 [124]PETSC ERROR: Comm must be of size 1! ------------------------------------------------------------------------ ------------------------------------------------------------------------ [99]PETSC ERROR: See docs/changes/index.html for recent updates. Configure run at [101]PETSC ERROR: Invalid argument! See docs/index.html for manual pages. [123]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 MatConvert() line 3747 in src/mat/interface/matrix.c See docs/changes/index.html for recent updates. ./hit on a named nid21861 by Unknown Fri May 24 16:12:20 2013 [102]PETSC ERROR: [126]PETSC ERROR: [125]PETSC ERROR: [116]PETSC ERROR: [97]PETSC ERROR: [104]PETSC ERROR: ------------------------------------------------------------------------ ./hit on a named nid21861 by Unknown Fri May 24 16:12:20 2013 See docs/index.html for manual pages. [122]PETSC ERROR: [123]PETSC ERROR: [115]PETSC ERROR: [126]PETSC ERROR: [101]PETSC ERROR: [106]PETSC ERROR: [107]PETSC ERROR: Libraries linked from See docs/faq.html for hints about trouble shooting. [110]PETSC ERROR: [98]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [104]PETSC ERROR: [114]PETSC ERROR: [119]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 MatConvec Configure options [59]PETSC ERROR: See docs/index.html for manual pages. See docs/index.html for manual pages. [50]PETSC ERROR: [49]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [51]PETSC ERROR: ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 See docs/changes/index.html for recent updates. [57]PETSC ERROR: [56]PETSC ERROR: [48]PETSC ERROR: [58]PETSC ERROR: See docs/changes/index.html for recent updates. MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c ------------------------------------------------------------------------ ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. See docs/index.html for manual pages. [44]PETSC ERROR: [37]PETSC ERROR: [61]PETSC ERROR: See docs/index.html for manual pages. [63]PETSC ERROR: [62]PETSC ERROR: [32]PETSC ERROR: See docs/changes/index.html for recent updates. [38]PETSC ERROR: Libraries linked from ------------------------------------------------------------------------ [54]PETSC ERROR: ------------------------------------------------------------------------ See docs/index.html for manual pages. [46]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [50]PETSC ERROR: [35]PETSC ERROR: [34]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [61]PETSC ERROR: See docs/faq.html for hints about trouble shooting. 
[40]PETSC ERROR: See docs/changes/index.html for recent updates. [56]PETSC ERROR: [47]PETSC ERROR: Libraries linked from [32]PETSC ERROR: ------------------------------------------------------------------------ See docs/index.html for manual pages. [36]PETSC ERROR: ------------------------------------------------------------------------ See docs/index.html for manual pages. ./hit on a named nid21819 by Unknown Fri Mart() line 3747 in src/mat/interface/matrix.c See docs/index.html for manual pages. [105]PETSC ERROR: See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. [99]PETSC ERROR: See docs/index.html for manual pages. [97]PETSC ERROR: [121]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [105]PETSC ERROR: [115]PETSC ERROR: [114]PETSC ERROR: [111]PETSC ERROR: [100]PETSC ERROR: [113]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [109]PETSC ERROR: Configure run at [103]PETSC ERROR: [102]PETSC ERROR: [97]PETSC ERROR: See docs/index.html for manual pages. Configure options KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c See docs/changes/index.html for recent updates. [97]PETSC ERROR: [117]PETSC ERROR: ------------------------------------------------------------------------ [108]PETSC ERROR: [104]PETSC ERROR: See docs/faq.html for hints about trouble shooting. PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. [116]PETSC ERROR: [110]PETSC ERROR: Comm must be of size 1! See docs/faq.html for hints about trouble shooting. See docs/faq.html for hints about trouble shooting. PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [98]PETSC ERROR: [112]PETSC ERROR: [121]PETSC ERROR: ./hit on a named nid21861 by Unknown Fri May 24 16:12:20 2013 Libraries linked from See docs/faq.html for hints about trouble shooting. [118]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [119]PETSC ERROR: Libraries linked from [111]PETSC ERROR: See docs/index.html for manual pages. ------------------------------------------------------------------------ [116]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [117]PETSC ERROR: ------------------------------------------------------------------------ [97]PETSC ERROR: See docs/index.html for manual pages. PCSetUp_GAMG() line 984 in src/ky 24 16:12:20 2013 See docs/index.html for manual pages. [45]PETSC ERROR: [35]PETSC ERROR: [56]PETSC ERROR: ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 MatSetType() line 74 in src/mat/interface/matreg.c [46]PETSC ERROR: See docs/changes/index.html for recent updates. [34]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [36]PETSC ERROR: Configure run at [44]PETSC ERROR: [41]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [53]PETSC ERROR: Libraries linked from MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c See docs/index.html for manual pages. MatConvert() line 3747 in src/mat/interface/matrix.c [47]PETSC ERROR: [35]PETSC ERROR: [34]PETSC ERROR: [54]PETSC ERROR: [55]PETSC ERROR: See docs/changes/index.html for recent updates. See docs/faq.html for hints about trouble shooting. ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 See docs/changes/index.html for recent updates. See docs/faq.html for hints about trouble shooting. 
[The interleaved output of all MPI ranks (indices up to [127], running ./hit on nodes nid21818, nid21819, nid21860 and nid21861) repeated the same error trace; a single representative copy, from rank 1, follows.]

[1]PETSC ERROR: Invalid argument!
[1]PETSC ERROR: Comm must be of size 1!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013
[1]PETSC ERROR: Libraries linked from
[1]PETSC ERROR: Configure run at
[1]PETSC ERROR: Configure options
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c
[1]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c
[1]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c
[1]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c
[1]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c
[1]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c
[1]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
[65]PETSC ERROR: Configure run at [76]PETSC ERROR: [72]PETSC ERROR: [87]PETSC ERROR: ------------------------------------------------------------------------ [70]PETSC ERROR: See docs/changes/index.html for recent updates. [93]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [88]PETSC ERROR: [91]PETSC ERROR: [84]PETSC ERROR: ------------------------------------------------------------------------ ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [93]PETSC ERROR: ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. [77]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 ------------------------------------------------------------------------ ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. [76]PETSC ERROR: [68]PETSC ERROR: Libraries linked from See docs/index.html for manual pages. [84]PETSC ERROR: [94]PETSC ERROR: See docs/faq.html for hints about trouble shooting. PCSetUp() line 832 in src/ksp/pc/interface/precon.c ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [91]PETSC ERROR: [82]PETSC ERROR: [65]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [87]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [81]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c Libraries linked from Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Libraries linked from [89]PETSC ERROR: See docs/changes/index.html for recent updates. [71]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c See docs/index.html for manual pages. [66]PETSC ERROR: [67]PETSC ERROR: [65]PETSC ERROR: ------------------------------------------------------------------------ See docs/index.html for manual pages. 
./hit on a named nid21860 by Unknown Fri Ma[56]PETSC ERROR: [52]PETSC ERROR: [58]PETSC ERROR: [51]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [54]PETSC ERROR: [55]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [56]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c Configure run at MatConvert() line 3747 in src/mat/interface/matrix.c [44]PETSC ERROR: [53]PETSC ERROR: [58]PETSC ERROR: [48]PETSC ERROR: [57]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [33]PETSC ERROR: Configure options MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [47]PETSC ERROR: [40]PETSC ERROR: [33]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [48]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c MatSetType() line 74 in src/mat/interface/matreg.c [52]PETSC ERROR: [53]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [43]PETSC ERROR: [42]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c MatSetType() line 74 in src/mat/interface/matreg.c [51]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [33]PETSC ERROR: [52]PETSC ERROR: [55]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c [41]PETSC ERROR: [49]PETSC ERROR: ./hit on a named nid21819 by Unknown Fri May 24 16:12:20 2013 [48]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c MatSetType() line 74 in src/mat/interface/matreg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [127]PETSC ERROR: [107]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [127]PETSC ERROR: [117]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [108]PETSC ERROR: [109]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [112]PETSC ERROR: [124]PETSC ERROR: [125]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [115]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [113]PETSC ERROR: [127]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [117]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [125]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [106]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [111]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [121]PETSC ERROR: [120]PETSC ERROR: [124]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [115]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c 
KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [112]PETSC ERROR: [114]PETSC ERROR: [125]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [12y 24 16:12:20 2013 [85]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [86]PETSC ERROR: [91]PETSC ERROR: [77]PETSC ERROR: Libraries linked from ------------------------------------------------------------------------ [67]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [69]PETSC ERROR: Configure run at Configure options [94]PETSC ERROR: [76]PETSC ERROR: [65]PETSC ERROR: Configure run at See docs/index.html for manual pages. PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [71]PETSC ERROR: [70]PETSC ERROR: Libraries linked from [65]PETSC ERROR: [73]PETSC ERROR: [66]PETSC ERROR: [87]PETSC ERROR: See docs/changes/index.html for recent updates. [77]PETSC ERROR: [74]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [83]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c See docs/faq.html for hints about trouble shooting. Configure options [68]PETSC ERROR: [91]PETSC ERROR: See docs/index.html for manual pages. Configure run at [65]PETSC ERROR: ------------------------------------------------------------------------ [71]PETSC ERROR: [80]PETSC ERROR: See docs/index.html for manual pages. MatSetType() line 74 in src/mat/interface/matreg.c [94]PETSC ERROR: Configure run at [68]PETSC ERROR: [93]PETSC ERROR: Libraries linked from [76]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c Libraries linked from See docs/index.html for manual pages. 
------------------------------------------------------------------------ [74]PETSC ERROR: [57]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [53]PETSC ERROR: [60]PETSC ERROR: [33]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [56]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [48]PETSC ERROR: [46]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c Libraries linked from PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c MatConvert() line 3747 in src/mat/interface/matrix.c MatConvert() line 3747 in src/mat/interface/matrix.c [33]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [60]PETSC ERROR: [50]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c MatConvert() line 3747 in src/mat/interface/matrix.c [33]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [51]PETSC ERROR: Configure run at [49]PETSC ERROR: [53]PETSC ERROR: [57]PETSC ERROR: [60]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [49]PETSC ERROR: [40]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [53]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [33]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c Configure options [55]PETSC ERROR: [60]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c ------------------------------------------------------------------------ [57]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [49]PETSC ERROR: [50]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [42]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [44]0]PETSC ERROR: [124]PETSC ERROR: [121]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [108]PETSC ERROR: [115]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [106]PETSC ERROR: [127]PETSC ERROR: [112]PETSC ERROR: [85]PETSC ERROR: [84]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [86]PETSC ERROR: [83]PETSC ERROR: ------------------------------------------------------------------------ [69]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c [82]PETSC ERROR: ------------------------------------------------------------------------ Configure options [91]PETSC ERROR: [73]PETSC ERROR: [67]PETSC ERROR: [66]PETSC ERROR: Libraries linked from ------------------------------------------------------------------------ [93]PETSC ERROR: [72]PETSC ERROR: See docs/index.html for manual pages. [88]PETSC ERROR: [81]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 See docs/index.html for manual pages. ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 See docs/faq.html for hints about trouble shooting. 
[83]PETSC ERROR: [94]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [85]PETSC ERROR: [82]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [70]PETSC ERROR: [72]PETSC ERROR: [88]PETSC ERROR: [91]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 ------------------------------------------------------------------------ ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 Configure run at [69]PETSC ERROR: Configure run at Configure run at [89]PETSC ERROR: Libraries linked from Configure run at [66]PETSC ERROR: [72]PETSC ERROR: ------------------------------------------------------------------------ [74]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c Configure options [71]PETSC ERROR: [68]PETSC ERROR: [93]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c Libraries linked from [91]PETSC ERROR: ------------------------------------------------------------------------ PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [71]PETSC ERROR: ------------------PETSC ERROR: [51]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [53]PETSC ERROR: [60]PETSC ERROR: [33]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [43]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c Configure options [53]PETSC ERROR: [60]PETSC ERROR: [33]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c MatConvert() line 3747 in src/mat/interface/matrix.c [120]PETSC ERROR: [104]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [110]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [109]PETSC ERROR: [120]PETSC ERROR: [106]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [105]PETSC ERROR: [120]PETSC ERROR: [117]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [114]PETSC ERROR: [127]PETSC ERROR: [124]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [112]PETSC ERROR: [120]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [107]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [125]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [115]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [114]PETSC ERROR: [110]PETSC ERROR: [111]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [117]PETSC ERROR: [106]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [104]PETSC ERROR: 
PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [114]PETSC ERROR: [117]PETSC ERROR: [125]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gam------------------------------------------------------ [74]PETSC ERROR: Configure options ------------------------------------------------------------------------ Libraries linked from Configure options MatSetType() line 74 in src/mat/interface/matreg.c [84]PETSC ERROR: Configure run at [78]PETSC ERROR: Configure run at Libraries linked from --------------------- Error Message ------------------------------------ Libraries linked from [67]PETSC ERROR: [68]PETSC ERROR: [89]PETSC ERROR: [82]PETSC ERROR: [83]PETSC ERROR: [91]PETSC ERROR: [84]PETSC ERROR: [76]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [74]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [68]PETSC ERROR: Configure options See docs/index.html for manual pages. KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [80]PETSC ERROR: [81]PETSC ERROR: [71]PETSC ERROR: Configure run at Configure run at Configure options [87]PETSC ERROR: See docs/faq.html for hints about trouble shooting. Libraries linked from [93]PETSC ERROR: [69]PETSC ERROR: Configure options [89]PETSC ERROR: [94]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [78]PETSC ERROR: ------------------------------------------------------------------------ Libraries linked from [68]PETSC ERROR: [69]PETSC ERROR: [72]PETSC ERROR: [91]PETSC ERROR: [66]PETSC ERROR: Configure options Configure run at [77]PETSC ERROR: Configure options [83]PETSC ERROR: ------------------------------------------------------------------------ Configure options ------------------------------------------------------------------------ MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [70]PETSC ERROR: [74]PETSC ERROR: ------------------------------------------------------------------------ MatSetType() line 74 in src/mat/interface/matreg.c [93]PETSC ERROR: Configure options [86]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [67]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/[47]PETSC ERROR: [49]PETSC ERROR: [55]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c ------------------------------------------------------------------------ [60]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [54]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [60]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [44]PETSC ERROR: [45]PETSC ERROR: [33]PETSC ERROR: [48]PETSC ERROR: [51]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [54]PETSC ERROR: [55]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [47]PETSC ERROR: [50]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [55]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [54]PETSC ERROR: Configure options PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [50]PETSC ERROR: [60]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c 
MatSetType() line 74 in src/mat/interface/matreg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [54]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [41]PETSC ERROR: [49]PETSC ERROR: [48]PETSC ERROR: [42]PETSC ERROR: [47]PETSC ERROR: [60]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [45]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c [50]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [55]PETSC ERROR: [60]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp_Mdense.c ------------------------------------------------------------------------ MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 MatSetType() line 74 in src/mat/interface/matreg.c [67]PETSC ERROR: [93]PETSC ERROR: [91]PETSC ERROR: [94]PETSC ERROR: [78]PETSC ERROR: [66]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c MatSetType() line 74 in src/mat/interface/matreg.c [79]PETSC ERROR: See docs/index.html for manual pages. [76]PETSC ERROR: [69]PETSC ERROR: [72]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 Libraries linked from [88]PETSC ERROR: Configure run at MatSetType() line 74 in src/mat/interface/matreg.c [70]PETSC ERROR: ------------------------------------------------------------------------ MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c ------------------------------------------------------------------------ Configure run at MatConvert() line 3747 in src/mat/interface/matrix.c [93]PETSC ERROR: [91]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [89]PETSC ERROR: [70]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c [86]PETSC ERROR: [72]PETSC ERROR: [68]PETSC ERROR: [74]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c ------------------------------------------------------------------------ MatConvert() line 3747 in src/mat/interface/matrix.c [87]PETSC ERROR: [93]PETSC ERROR: [94]PETSC ERROR: [91]PETSC ERROR: [83]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [80]PETSC ERROR: [73]PETSC ERROR: Configure options [88]PETSC ERROR: [74]PETSC ERROR: Libraries linked from [67]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [85]PETSC ERROR: [73]PETSC ERROR: [82]PETSC ERROR: Configure options [94]PETSC ERROR: ------------------------------------------------------------------------ Configure run at [84]PETSC ERROR: [83]PETSC ERROR: ------------------------------------------------------------------------ [85]PETSC ERROR: [76]PETSC ERROR: [81]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [78]PETSC ERROR: [77]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 MatConvert() line 3747 in src/mat/interface/matrix.c [71]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [72]PETSC ERROR: [80]PETSC ERROR: ------------------------------------------------------------------------ [68]PETSC ERROR: [69]PETSC ERROR: [88]PETSC 
ERROR: [74]PETSC ERROR: Configure options Libraries linked from [66]PETSC ERROR: ------------------------------------------------------------------------ MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c MatConvert() line 3747 in src/mat/interface/matrix.c [83]PETSC ERROR: [82]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [93]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c Configure run at [94]PETSC ERROR: [67]PETSC ERROR: Libraries linked from PCSetUp() line 832 in src/ksp/pc/interface/precon.c [89]PETSC ERROR: Libraries linked from PCSetUp() line 832 in src/ksp/pc/interface/precon.c Invalid argument! [91]PETSC ERROR: [93]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [76]PETSC ERROR: See docs/index.html for manual pages. Configure options [5]PETSC ERROR: [21]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [5]PETSC ERROR: [21]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:12:20 2013 MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [5]PETSC ERROR: [21]PETSC ERROR: Libraries linked from MatSetType() line 74 in src/mat/interface/matreg.c [5]PETSC ERROR: [21]PETSC ERROR: Configure run at [5]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c Configure options [5]PETSC ERROR: [21]PETSC ERROR: ------------------------------------------------------------------------ PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [5]PETSC ERROR: [21]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [5]PETSC ERROR: [21]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [5]PETSC ERROR: [21]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [5]PETSC ERROR: [21]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [5]PETSC ERROR: [21]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [5]PETSC ERROR: [21]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [5]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [5]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [5]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [5]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [80]PETSC ERROR: Libraries linked from MatSetType() line 74 in src/mat/interface/matreg.c [68]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [78]PETSC ERROR: [79]PETSC ERROR: [84]PETSC ERROR: Configure run at MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [72]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [74]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [71]PETSC ERROR: ------------------------------------------------------------------------ [94]PETSC ERROR: [77]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [82]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c 
[91]PETSC ERROR: [88]PETSC ERROR: [67]PETSC ERROR: [70]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c [69]PETSC ERROR: [71]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c Configure run at [83]PETSC ERROR: Configure options MatSetType() line 74 in src/mat/interface/matreg.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [94]PETSC ERROR: [88]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c [69]PETSC ERROR: Configure options [86]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c ------------------------------------------------------------------------ [69]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c Configure run at [66]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c [76]PETSC ERROR: [81]PETSC ERROR: [80]PETSC ERROR: [93]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [67]PETSC ERROR: [71]PETSC ERROR: [87]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c MatSetType() line 74 in src/mat/interface/matreg.c [93]PETSC ERROR: Comm must be of size 1! [69]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [67]PETSC ERROR: [84]PETSC ERROR: [85]PETSC ERROR: Configure options [78]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c Configure run at KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [80]PETSC ERROR: [94]PETSC ERROR: [81]PETSC ERROR: [79]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c ------------------------------------------------------------------------ [73]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c MatConvert() line 3747 in src/mat/interface/matrix.c [74]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c [93]PETSC ERROR: [89]PETSC ERROR: ------------------------------------------------------------------------ MatConvert() line 3747 in src/mat/interface/matrix.c ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [69]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ MatSetType() line 74 in src/mat/interface/matreg.c [89]PETSC ERROR: Configure options [76]PETSC ERROR: [67]PETSC ERROR: Configure options PCSetUp() line 832 in src/ksp/pc/interface/precon.c [80]PETSC ERROR: [69]PETSC ERROR: [81]PETSC ERROR: [70]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [94]PETSC ERROR: [83]PETSC ERROR: Configure options [66]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [88]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c ------------------------------------------------------------------------ ------------------------------------------------------------------------ PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [88]PETSC ERROR: [71]PETSC ERROR: [66]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [85]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c KSPSetUp() line 278 
in src/ksp/ksp/interface/itfunc.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [78]PETSC ERROR: [66]PETSC ERROR: [93]PETSC ERROR: [94]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [70]PETSC ERROR: [91]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [86]PETSC ERROR: [74]PETSC ERROR: [67]PETSC ERROR: Libraries linked from [82]PETSC ERROR: [71]PETSC ERROR: [79]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [76]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [89]PETSC ERROR: [74]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [67]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c MatSetType() line 74 in src/mat/interface/matreg.c [84]PETSC ERROR: [72]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [74]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [88]PETSC ERROR: [86]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [81]PETSC ERROR: Configure run at [83]PETSC ERROR: [77]PETSC ERROR: Configure run at [91]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [70]PETSC ERROR: [66]PETSC ERROR: [73]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [89]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [87]PETSC ERROR: [86]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c MatSetType() line 74 in src/mat/interface/matreg.c [91]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [76]PETSC ERROR: [66]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [94]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [80]PETSC ERROR: [71]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c MatSetType() line 74 in src/mat/interface/matreg.c [89]PETSC ERROR: [83]PETSC ERROR: [88]PETSC ERROR: [94]PETSC ERROR: [82]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [84]PETSC ERROR: [72]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c Configure options Configure options PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [85]PETSC ERROR: [94]PETSC ERROR: [84]PETSC ERROR: ------------------------------------------------------------------------ [66]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c MatConvert() line 3747 in src/mat/interface/matrix.c [78]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [76]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c 
[72]PETSC ERROR: [74]PETSC ERROR: [91]PETSC ERROR: [82]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c MatSetType() line 74 in src/mat/interface/matreg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [66]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [94]PETSC ERROR: [83]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [80]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [86]PETSC ERROR: [74]PETSC ERROR: [76]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [70]PETSC ERROR: [72]PETSC ERROR: [81]PETSC ERROR: [89]PETSC ERROR: [79]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [73]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c See docs/changes/index.html for recent updates. [88]PETSC ERROR: [89]PETSC ERROR: ------------------------------------------------------------------------ KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [70]PETSC ERROR: [81]PETSC ERROR: [84]PETSC ERROR: [87]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [78]PETSC ERROR: [72]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [76]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c MatConvert() line 3747 in src/mat/interface/matrix.c [80]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [87]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [78]PETSC ERROR: [86]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [80]PETSC ERROR: [82]PETSC ERROR: [85]PETSC ERROR: [83]PETSC ERROR: [87]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [72]PETSC ERROR: [82]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [88]PETSC ERROR: [86]PETSC ERROR: [81]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [80]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [89]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [81]PETSC ERROR: [83]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c MatSetType() line 74 in src/mat/interface/matreg.c [79]PETSC ERROR: [76]PETSC ERROR: [73]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [82]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c See docs/faq.html for hints about trouble shooting. 
[87]PETSC ERROR: [80]PETSC ERROR: [83]PETSC ERROR: [81]PETSC ERROR: [79]PETSC ERROR: [86]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [88]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [81]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [73]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [77]PETSC ERROR: [86]PETSC ERROR: See docs/index.html for manual pages. PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c MatConvert() line 3747 in src/mat/interface/matrix.c [79]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [80]PETSC ERROR: ------------------------------------------------------------------------ PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [86]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [88]PETSC ERROR: [82]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [83]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [80]PETSC ERROR: [73]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c ------------------------------------------------------------------------ [78]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [77]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [87]PETSC ERROR: [81]PETSC ERROR: [79]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [86]PETSC ERROR: [72]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [77]PETSC ERROR: [82]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [73]PETSC ERROR: [84]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [80]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [85]PETSC ERROR: [81]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c MatSetType() line 74 in src/mat/interface/matreg.c [86]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [78]PETSC ERROR: [89]PETSC ERROR: [82]PETSC ERROR: [87]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [84]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [77]PETSC ERROR: [88]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [89]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [81]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [88]PETSC ERROR: [84]PETSC ERROR: [73]PETSC ERROR: [85]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [78]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [84]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c KSPSetUp() line 278 in 
src/ksp/ksp/interface/itfunc.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [87]PETSC ERROR: [79]PETSC ERROR: [85]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c Libraries linked from MatConvert() line 3747 in src/mat/interface/matrix.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [89]PETSC ERROR: [79]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [77]PETSC ERROR: [86]PETSC ERROR: [87]PETSC ERROR: [85]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [88]PETSC ERROR: Configure run at KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [79]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c Configure options [73]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [86]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [85]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [79]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [86]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c ------------------------------------------------------------------------ [85]PETSC ERROR: [73]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [79]PETSC ERROR: [85]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [77]PETSC ERROR: [79]PETSC ERROR: [73]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [79]PETSC ERROR: [77]PETSC ERROR: [73]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [79]PETSC ERROR: [77]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [79]PETSC ERROR: [77]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [79]PETSC ERROR: [77]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [79]PETSC ERROR: [77]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [79]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [79]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [79]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [75]PETSC ERROR: [90]PETSC ERROR: Invalid argument! Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [75]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 Comm must be of size 1! [64]PETSC ERROR: [90]PETSC ERROR: [75]PETSC ERROR: See docs/changes/index.html for recent updates. See docs/changes/index.html for recent updates. [90]PETSC ERROR: [64]PETSC ERROR: ------------------------------------------------------------------------ See docs/faq.html for hints about trouble shooting. See docs/faq.html for hints about trouble shooting. [90]PETSC ERROR: [75]PETSC ERROR: [64]PETSC ERROR: See docs/index.html for manual pages. 
Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [90]PETSC ERROR: See docs/index.html for manual pages. ------------------------------------------------------------------------ [75]PETSC ERROR: [64]PETSC ERROR: [90]PETSC ERROR: See docs/changes/index.html for recent updates. ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 ------------------------------------------------------------------------ [90]PETSC ERROR: [75]PETSC ERROR: [64]PETSC ERROR: See docs/faq.html for hints about trouble shooting. Libraries linked from [75]PETSC ERROR: [92]PETSC ERROR: [90]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 See docs/index.html for manual pages. See docs/index.html for manual pages. [75]PETSC ERROR: Configure run at [92]PETSC ERROR: [64]PETSC ERROR: [90]PETSC ERROR: ------------------------------------------------------------------------ Libraries linked from ------------------------------------------------------------------------ [64]PETSC ERROR: Configure options [92]PETSC ERROR: Configure run at [75]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [90]PETSC ERROR: [92]PETSC ERROR: [64]PETSC ERROR: ------------------------------------------------------------------------ Configure options ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013 [90]PETSC ERROR: [64]PETSC ERROR: Libraries linked from MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c ------------------------------------------------------------------------ [92]PETSC ERROR: [90]PETSC ERROR: [64]PETSC ERROR: Configure run at [75]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [92]PETSC ERROR: [90]PETSC ERROR: Configure options [64]PETSC ERROR: Libraries linked from [92]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c [75]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c ------------------------------------------------------------------------ [90]PETSC ERROR: Configure run at [92]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c [64]PETSC ERROR: [90]PETSC ERROR: [92]PETSC ERROR: [75]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c MatConvert() line 3747 in src/mat/interface/matrix.c Configure options PCSetUp() line 832 in src/ksp/pc/interface/precon.c [92]PETSC ERROR: [64]PETSC ERROR: [90]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [92]PETSC ERROR: [90]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c [64]PETSC ERROR: [75]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [92]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c ------------------------------------------------------------------------ [64]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [90]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c [92]PETSC ERROR: [90]PETSC ERROR: [75]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c PCSetUp() line 832 in src/ksp/pc/interface/precon.c [92]PETSC ERROR: [64]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c [90]PETSC ERROR: 
(The interleaved tracebacks above were printed by every rank; a representative copy, from rank 95:)

[95]PETSC ERROR: ------------------------------------------------------------------------
[95]PETSC ERROR: ./hit on a named nid21860 by Unknown Fri May 24 16:12:20 2013
[95]PETSC ERROR: Libraries linked from
[95]PETSC ERROR: Configure run at
[95]PETSC ERROR: Configure options
[95]PETSC ERROR: ------------------------------------------------------------------------
[95]PETSC ERROR: MatCreate_SeqDense() line 2189 in src/mat/impls/dense/seq/dense.c
[95]PETSC ERROR: MatSetType() line 74 in src/mat/interface/matreg.c
[95]PETSC ERROR: MatConvert() line 3747 in src/mat/interface/matrix.c
[95]PETSC ERROR: PCSetUp_SVD() line 51 in src/ksp/pc/impls/svd/svd.c
[95]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
[95]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
[95]PETSC ERROR: PCSetUp_MG() line 729 in src/ksp/pc/impls/mg/mg.c
[95]PETSC ERROR: PCSetUp_GAMG() line 984 in src/ksp/pc/impls/gamg/gamg.c
[95]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
[95]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
hit: gamg.c:568: PCSetUp_GAMG: Assertion `pc->setupcalled' failed.
(The assertion above was printed once per rank; duplicate copies omitted.)
_pmiu_daemon(SIGCHLD): [NID 21860] [c9-1c1s2n2] [Fri May 24 16:12:48 2013] PE RANK 64 exit signal Aborted
[NID 21860] 2013-05-24 16:12:48 Apid 1659986: initiated application termination
_pmiu_daemon(SIGCHLD): [NID 21861] [c9-1c1s2n3] [Fri May 24 16:12:48 2013] PE RANK 96 exit signal Aborted
Application 1659986 exit codes: 134
Application 1659986 exit signals: Killed
Application 1659986 resources: utime ~62s, stime ~34s

From jedbrown at mcs.anl.gov  Fri May 24 16:51:29 2013
From: jedbrown at mcs.anl.gov (Jed Brown)
Date: Fri, 24 May 2013 16:51:29 -0500
Subject: [petsc-users] Solving Poisson equation with multigrid
In-Reply-To: <519FDD10.3060900@uci.edu>
References: <519687DD.4050209@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu>
Message-ID: <87r4gwnj1a.fsf@mcs.anl.gov>

Michele Rosso writes:

>> With petsc-3.4 (which you should upgrade to), use
>> -mg_coarse_sub_pc_factor_shift_type NONZERO

Actually, use this with petsc-3.3 also (and please upgrade to
petsc-3.4).

The option you were passing was not being used.
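For reference, the options Jed gives can also be set from code before KSPSetFromOptions() runs. A minimal sketch, not taken from this thread itself; `ksp' is assumed to be the solver being configured:

    #include <petscksp.h>

    /* Seed the options database with the flags discussed above, then let
       the solver pick them up -- equivalent to the command-line form. */
    static PetscErrorCode SetCoarseShiftOptions(KSP ksp)
    {
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = PetscOptionsSetValue("-pc_type", "gamg");CHKERRQ(ierr);
      ierr = PetscOptionsSetValue("-pc_gamg_agg_nsmooths", "1");CHKERRQ(ierr);
      ierr = PetscOptionsSetValue("-mg_coarse_sub_pc_factor_shift_type", "NONZERO");CHKERRQ(ierr);
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }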
From knepley at gmail.com Fri May 24 17:17:20 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 24 May 2013 17:17:20 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <519FDFBF.3080405@uci.edu> References: <519687DD.4050209@uci.edu> <87r4h5pezo.fsf@mcs.anl.gov> <51969CF0.4030200@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <519FDFBF.3080405@uci.edu> Message-ID: On Fri, May 24, 2013 at 4:46 PM, Michele Rosso wrote: > In both cases I used -ksp_view and -option_left. > > For case 1 ( -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 -mg_coarse_sub_pc_factor_shift_nonzero > ) I posted the only output I had. > > For case 2 ( -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > -mg_coarse_pc_type svd) the output was too long to fit into an e-mail. > Please find it attached. > It looks like you are running in parallel (never do this until everything works in serial), which probably means you are using PCREDUNDANT for the coarse PC, so you need what Jed suggested: -mg_coarse_sub_pc_type svd Matt > Michele > > On 05/24/2013 02:37 PM, Matthew Knepley wrote: > > On Fri, May 24, 2013 at 4:35 PM, Michele Rosso wrote: > >> I tried >> >> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >> -mg_coarse_sub_pc_factor_shift_nonzero >> >> but I still get >> >> [0]PETSC ERROR: Detected zero pivot in LU factorization: >> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! >> [0]PETSC ERROR: Zero pivot row 280 value 6.58999e-17 tolerance >> 2.22045e-14! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 >> CDT 2012 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. 
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: ./hit on a named nid21818 by Unknown Fri May 24 16:08:33 2013
>> [0]PETSC ERROR: Libraries linked from
>> [0]PETSC ERROR: Configure run at
>> [0]PETSC ERROR: Configure options
>> [0]PETSC ERROR:
>> ------------------------------------------------------------------------
>> [0]PETSC ERROR: MatPivotCheck_none() line 583 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
>> [0]PETSC ERROR: MatPivotCheck() line 602 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
>> [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in src/mat/impls/aij/seq/aijfact.c
>> [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in src/mat/interface/matrix.c
>> [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c
>> [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
>> [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
>> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c
>> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c
>> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in src/ksp/ksp/interface/itfunc.c
>> [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
>> [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c
>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>> [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c
>> [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c
>> [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c
>> [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c
>>
>> If instead I use
>>
>> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
>> -mg_coarse_pc_type svd
>>
>> as Matthew suggested, I am told that there is an invalid argument.
>>

1) When you send these in, we need to see -ksp_view, so we know what is
being used

2) This is not enough information above. I use this all the time, or I
would not have suggested it

   Matt

>> Michele
>>
>> On 05/24/2013 01:04 PM, Matthew Knepley wrote:
>>
>> On Fri, May 24, 2013 at 2:55 PM, Jed Brown wrote:
>>
>>> Michele Rosso writes:
>>>
>>> > Hi Jed,
>>> >
>>> > I followed your suggestion by using:
>>> >
>>> > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
>>> >
>>> > This works perfectly if I have a non-singular matrix. When instead I use
>>> > periodic conditions for my system ( I set the nullspace removal
>>> > correctly ),
>>> > I receive an error saying a zero pivot is detected in the LU
>>> > factorization.
>>> > So, after some research, I found in the mailing list a fix:
>>> >
>>> > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
>>> > -mg_coarse_pc_factor_shift_nonzero
>>>
>>> It'll need to be -mg_coarse_sub_pc_factor_shift_nonzero
>>>
>>> With petsc-3.4 (which you should upgrade to), use
>>> -mg_coarse_sub_pc_factor_shift_type NONZERO
>>>
>>> The reason you need this "sub" prefix is that the code always restricts
>>> using block Jacobi (usually localized so that all the entries are in one
>>> block), before applying the direct coarse solver.
>>
>> I think this is less elegant than
>>
>> -mg_coarse_pc_type svd
>>
>>    Matt
>>
>>> > Still I am receiving the following error
>>> >
>>> > [0]PETSC ERROR: --------------------- Error Message
>>> > ------------------------------------
>>> > [0]PETSC ERROR: Detected zero pivot in LU factorization:
>>> > see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
>>> > [0]PETSC ERROR: Zero pivot row 280 value 6.5908e-17 tolerance 2.22045e-14!
>>> > [0]PETSC ERROR:
>>> > ------------------------------------------------------------------------
>>> > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29
>>> > 11:26:24 CDT 2012
>>> > [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> > [0]PETSC ERROR: See docs/index.html for manual pages.
>>> > [0]PETSC ERROR:
>>> > ------------------------------------------------------------------------
>>> > [0]PETSC ERROR: ./hit on a named nid09458 by Unknown Fri May 24
>>> > 14:40:48 2013
>>> > [0]PETSC ERROR: Libraries linked from
>>> > [0]PETSC ERROR: Configure run at
>>> > [0]PETSC ERROR: Configure options
>>> > [0]PETSC ERROR:
>>> > ------------------------------------------------------------------------
>>> > [0]PETSC ERROR: MatPivotCheck_none() line 583 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
>>> > [0]PETSC ERROR: MatPivotCheck() line 602 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
>>> > [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in src/mat/impls/aij/seq/aijfact.c
>>> > [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in src/mat/interface/matrix.c
>>> > [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c
>>> > [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
>>> > [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
>>> > [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c
>>> > [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c
>>> > [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in src/ksp/ksp/interface/itfunc.c
>>> > [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
>>> > [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c
>>> > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>>> > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>>> > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
>>> > [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c
>>> > [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c
>>> > [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c
>>> > [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c
>>> >
>>> > What could the reason be?
>>> > Thank you,
>>> >
>>> > Michele
>>> >
>>> > On 05/17/2013 07:35 PM, Michele Rosso wrote:
>>> >> Thank you very much. I will try and let you know.
>>> >>
>>> >> Michele
>>> >>
>>> >> On 05/17/2013 07:01 PM, Jed Brown wrote:
>>> >>> Michele Rosso writes:
>>> >>>
>>> >>>> I noticed that the problem appears even if I use CG with the default
>>> >>>> preconditioner: commenting KSPSetDM() solves the problem.
>>> >>> Okay, this issue can't show up if you use SNES, but it's a consequence
>>> >>> of making geometric multigrid work with a pure KSP interface. You can
>>> >>> either use KSPSetComputeOperators() to put your assembly in a function
>>> >>> (which will also be called on coarse levels if you use -pc_type mg
>>> >>> without Galerkin coarse operators) or you can provide the Jacobian
>>> >>> using KSPSetOperators() as usual, but also call KSPSetDMActive() so that
>>> >>> the DM is not used for computing/updating the Jacobian.
>>> >>>
>>> >>> The logic is cleaner in petsc-3.4 and I think it just does the right
>>> >>> thing in your case.
>>> >>>
>>> >>>> So basically without a proper grid (it seems no grid with an even
>>> >>>> number of nodes qualifies) and with my own system matrix, I cannot use
>>> >>>> any type of multigrid
>>> >>>> pre-conditioner?
>>> >>> You can use all the AMG methods without setting a DM.
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mrosso at uci.edu  Fri May 24 17:18:24 2013
From: mrosso at uci.edu (Michele Rosso)
Date: Fri, 24 May 2013 15:18:24 -0700
Subject: [petsc-users] Solving Poisson equation with multigrid
In-Reply-To: <87r4gwnj1a.fsf@mcs.anl.gov>
References: <519687DD.4050209@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov>
Message-ID: <519FE730.9000309@uci.edu>

Using

-pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
-mg_coarse_sub_pc_factor_shift_type NONZERO -option_left -ksp_view

still produces the same error:

[0]PCSetData_AGG bs=1 MM=131072
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Detected zero pivot in LU factorization:
see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
[0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 tolerance 2.22045e-14!
[0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri May 24 17:06:50 2013 [0]PETSC ERROR: Libraries linked from [0]PETSC ERROR: Configure run at [0]PETSC ERROR: Configure options [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: MatPivotCheck_none() line 583 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h [0]PETSC ERROR: MatPivotCheck() line 602 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in src/mat/impls/aij/seq/aijfact.c [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in src/mat/interface/matrix.c [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c On 05/24/2013 02:51 PM, Jed Brown wrote: > Michele Rosso writes: > >>> With petsc-3.4 (which you should upgrade to), use >>> -mg_coarse_sub_pc_factor_shift_type NONZERO > Actually, use this with petsc-3.3 also (and please upgrade to > petsc-3.4). > > The option you were passing was not being used. > -------------- next part -------------- An HTML attachment was scrubbed... 
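The "nullspace removal" Michele refers to earlier in the thread is typically set up as below. A minimal sketch using the petsc-3.3/3.4-era API; `ksp' is assumed to be the solver for the singular periodic problem:

    #include <petscksp.h>

    /* Tell the Krylov solver that the periodic Poisson operator has the
       constant vector in its null space, so it gets projected out. */
    static PetscErrorCode AttachConstantNullSpace(KSP ksp)
    {
      MatNullSpace   nullsp;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);CHKERRQ(ierr);
      ierr = KSPSetNullSpace(ksp, nullsp);CHKERRQ(ierr);
      ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }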
URL: From knepley at gmail.com Fri May 24 17:20:51 2013 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 24 May 2013 17:20:51 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <519FE730.9000309@uci.edu> References: <519687DD.4050209@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> Message-ID: On Fri, May 24, 2013 at 5:18 PM, Michele Rosso wrote: > Using > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > -mg_coarse_sub_pc_factor_shift_type NONZERO -option_left -ksp_view > This is what debugging is about. We are not running your problem. How could this be debugged? 1) Run the problem w/o a null space so that it finishes 2) Look at the output for -ksp_view 3) Does the coarse solver say that it is shifted? 4) Are there options which were unused? Matt > still produces the same error: > > [0]PCSetData_AGG bs=1 MM=131072 > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Detected zero pivot in LU factorization: > see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! > [0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 tolerance 2.22045e-14! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 > CDT 2012 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri May 24 17:06:50 > 2013 > [0]PETSC ERROR: Libraries linked from > [0]PETSC ERROR: Configure run at > [0]PETSC ERROR: Configure options > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: MatPivotCheck_none() line 583 in > src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h > [0]PETSC ERROR: MatPivotCheck() line 602 in > src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h > [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in > src/mat/impls/aij/seq/aijfact.c > [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in > src/mat/interface/matrix.c > [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c > [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in > src/ksp/pc/impls/bjacobi/bjacobi.c > [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in > src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in > src/ksp/pc/impls/bjacobi/bjacobi.c > [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in > src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c > [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c > > On 05/24/2013 02:51 PM, Jed Brown wrote: > > Michele Rosso writes: > > > With petsc-3.4 (which you should upgrade to), use > -mg_coarse_sub_pc_factor_shift_type NONZERO > > Actually, use this with petsc-3.3 also (and please upgrade to > petsc-3.4). > > The option you were passing was not being used. > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
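Collecting the pieces of this thread, the run being debugged amounts to something like the following (illustrative only; the executable name is taken from the logs above, and note the correct spelling -options_left):

    ./hit -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 \
          -mg_coarse_sub_pc_factor_shift_type NONZERO \
          -ksp_view -options_left

The option prefix composes as mg_coarse_ (the coarse-level KSP inside GAMG) + sub_ (the block-Jacobi sub-solver on each process) + pc_factor_shift_type, which is why the plain -mg_coarse_pc_factor_shift_* spellings earlier in the thread were silently ignored.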
URL: From jedbrown at mcs.anl.gov Fri May 24 17:22:46 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 24 May 2013 17:22:46 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: References: <519687DD.4050209@uci.edu> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <519FDFBF.3080405@uci.edu> Message-ID: <87ip28nhl5.fsf@mcs.anl.gov> Matthew Knepley writes: > It looks like you are running in parallel (never do this until everything > works in serial), which probably means you are using > PCREDUNDANT for the coarse PC, so you need what Jed suggested: > > -mg_coarse_sub_pc_type svd GAMG uses block Jacobi by default, not redundant. If it was redundant, you would use -mg_coarse_redundant_pc_type svd. From mirzadeh at gmail.com Fri May 24 21:08:08 2013 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Fri, 24 May 2013 19:08:08 -0700 Subject: [petsc-users] ViennaCL Message-ID: Hi guys, Speaking of interfaces, is there any plan to provide interfaces to ViennaCL solvers? Tnx Mohammad -------------- next part -------------- An HTML attachment was scrubbed... URL: From rupp at mcs.anl.gov Fri May 24 21:45:16 2013 From: rupp at mcs.anl.gov (Karl Rupp) Date: Fri, 24 May 2013 21:45:16 -0500 Subject: [petsc-users] ViennaCL In-Reply-To: References: Message-ID: <51A025BC.4030002@mcs.anl.gov> Hi Mohammad, there is a first interface to ViennaCL in the next-branch already. Configure with --download-viennacl --with-opencl-include=/path/to/OpenCL/includes --with-opencl-lib=/path/to/libOpenCL.so and then use -vec_type viennacl -mat_type aijviennacl as runtime options. As this resides in next, it is still work in progress. If you encounter any problems during installation, please let us know. Also, OpenCL typically shows larger latency than CUDA, so you should have at least 100k unknowns to see any performance gain. Best regards, Karli On 05/24/2013 09:08 PM, Mohammad Mirzadeh wrote: > Hi guys, > > Speaking of interfaces, is there any plan to provide interfaces to > ViennaCL solvers? > > Tnx > Mohammad From mirzadeh at gmail.com Fri May 24 22:04:49 2013 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Fri, 24 May 2013 20:04:49 -0700 Subject: [petsc-users] ViennaCL In-Reply-To: <51A025BC.4030002@mcs.anl.gov> References: <51A025BC.4030002@mcs.anl.gov> Message-ID: sweet. I was in the middle of writing my own interfaces; i'll try those as well. Do I still get to use the VIENNACL_WITH_XYZ macros? On Fri, May 24, 2013 at 7:45 PM, Karl Rupp wrote: > Hi Mohammad, > > there is a first interface to ViennaCL in the next-branch already. > Configure with > --download-viennacl > --with-opencl-include=/path/**to/OpenCL/includes > --with-opencl-lib=/path/to/**libOpenCL.so > and then use > -vec_type viennacl > -mat_type aijviennacl > as runtime options. > > As this resides in next, it is still work in progress. If you encounter > any problems during installation, please let us know. Also, OpenCL > typically shows larger latency than CUDA, so you should have at least 100k > unknowns to see any performance gain. 
> > Best regards, > Karli > > > > On 05/24/2013 09:08 PM, Mohammad Mirzadeh wrote: > >> Hi guys, >> >> Speaking of interfaces, is there any plan to provide interfaces to >> ViennaCL solvers? >> >> Tnx >> Mohammad >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rupp at mcs.anl.gov Fri May 24 22:36:11 2013 From: rupp at mcs.anl.gov (Karl Rupp) Date: Fri, 24 May 2013 22:36:11 -0500 Subject: [petsc-users] ViennaCL In-Reply-To: References: <51A025BC.4030002@mcs.anl.gov> Message-ID: <51A031AB.1070503@mcs.anl.gov> Hi, currently only the OpenCL backend is used in PETSc, as with CUDA there is already the cusp bindings and for threading there is threadcomm. Thus, you don't need to add any additional defines. Best regards, Karli On 05/24/2013 10:04 PM, Mohammad Mirzadeh wrote: > sweet. I was in the middle of writing my own interfaces; i'll try those > as well. Do I still get to use the VIENNACL_WITH_XYZ macros? > > > On Fri, May 24, 2013 at 7:45 PM, Karl Rupp > wrote: > > Hi Mohammad, > > there is a first interface to ViennaCL in the next-branch already. > Configure with > --download-viennacl > --with-opencl-include=/path/__to/OpenCL/includes > --with-opencl-lib=/path/to/__libOpenCL.so > and then use > -vec_type viennacl > -mat_type aijviennacl > as runtime options. > > As this resides in next, it is still work in progress. If you > encounter any problems during installation, please let us know. > Also, OpenCL typically shows larger latency than CUDA, so you should > have at least 100k unknowns to see any performance gain. > > Best regards, > Karli > > > > On 05/24/2013 09:08 PM, Mohammad Mirzadeh wrote: > > Hi guys, > > Speaking of interfaces, is there any plan to provide interfaces to > ViennaCL solvers? > > Tnx > Mohammad > > > From mirzadeh at gmail.com Fri May 24 23:25:22 2013 From: mirzadeh at gmail.com (Mohammad Mirzadeh) Date: Fri, 24 May 2013 21:25:22 -0700 Subject: [petsc-users] ViennaCL In-Reply-To: <51A031AB.1070503@mcs.anl.gov> References: <51A025BC.4030002@mcs.anl.gov> <51A031AB.1070503@mcs.anl.gov> Message-ID: I see. Thanks for the info Karl. On Fri, May 24, 2013 at 8:36 PM, Karl Rupp wrote: > Hi, > > currently only the OpenCL backend is used in PETSc, as with CUDA there is > already the cusp bindings and for threading there is threadcomm. Thus, you > don't need to add any additional defines. > > Best regards, > Karli > > > > On 05/24/2013 10:04 PM, Mohammad Mirzadeh wrote: > >> sweet. I was in the middle of writing my own interfaces; i'll try those >> as well. Do I still get to use the VIENNACL_WITH_XYZ macros? >> >> >> On Fri, May 24, 2013 at 7:45 PM, Karl Rupp > > wrote: >> >> Hi Mohammad, >> >> there is a first interface to ViennaCL in the next-branch already. >> Configure with >> --download-viennacl >> --with-opencl-include=/path/__**to/OpenCL/includes >> --with-opencl-lib=/path/to/__**libOpenCL.so >> >> and then use >> -vec_type viennacl >> -mat_type aijviennacl >> as runtime options. >> >> As this resides in next, it is still work in progress. If you >> encounter any problems during installation, please let us know. >> Also, OpenCL typically shows larger latency than CUDA, so you should >> have at least 100k unknowns to see any performance gain. >> >> Best regards, >> Karli >> >> >> >> On 05/24/2013 09:08 PM, Mohammad Mirzadeh wrote: >> >> Hi guys, >> >> Speaking of interfaces, is there any plan to provide interfaces to >> ViennaCL solvers? 
>> >> Tnx >> Mohammad >> >> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From u1028451 at gmail.com Sat May 25 06:27:13 2013 From: u1028451 at gmail.com (Agnostic Noname) Date: Sat, 25 May 2013 14:27:13 +0300 Subject: [petsc-users] Petsc configuration failure in windows 7 x64 Message-ID: Hello, I am trying to setup petsc under windows 7 x64, with VS2012 Win64. For now I left Fortran aside (I plan to do it however), so I am trying to build a fortran-free version of petsc. First I installed the latest version of cmake: 2.8.11 to make sure VS2012 Win64 and fortran XE 2013 are supported. The steps I follow are: * Open a "Open VS2012 x64 Native Tools Command Prompt" * Run "c:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\vcvarsall.bat" x64 * Run cygwin: C:\Cygwin\bin\bash.exe --login * cd /cygdrive/f/users/user1/petsc-3.4.0 * Configure with: ./configure --with-debugging=no --with-cc='win32fe cl -O2' --with-fc=0 --with-cxx='win32fe cl -O2' --download-f2cblaslapack=1 --download-blacs=1 --download-mups=1 --with-clanguage=cxx --with-scalar-type=complex --with-precision=double --with-mpi-dir='/cygdrive/c/Program Files/MPICH2' After these steps configuration process ends with a status number 256 and a message "falling back to legacy build". Of course I tried to run make, as suggested at the end of the script "make PETSC_DIR=/cygdrive/f/users/user1/petsc-3.4.0 PETSC_ARCH=arch-mswin-cxx-opt all" but this ended up failing. Please find attached the configure.log and the make.log files. Thanks in advance. Keep up the great work! -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 1820442 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: make.log Type: application/octet-stream Size: 55276 bytes Desc: not available URL: From knepley at gmail.com Sat May 25 08:01:13 2013 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 25 May 2013 08:01:13 -0500 Subject: [petsc-users] Petsc configuration failure in windows 7 x64 In-Reply-To: References: Message-ID: On Sat, May 25, 2013 at 6:27 AM, Agnostic Noname wrote: > Hello, > > I am trying to setup petsc under windows 7 x64, with VS2012 Win64. For now > I left Fortran aside (I plan to do it however), so I am trying to build a > fortran-free version of petsc. > > First I installed the latest version of cmake: 2.8.11 to make sure VS2012 > Win64 and fortran XE 2013 are supported. > > The steps I follow are: > * Open a "Open VS2012 x64 Native Tools Command Prompt" > * Run "c:\Program Files (x86)\Microsoft Visual Studio > 11.0\VC\vcvarsall.bat" x64 > * Run cygwin: C:\Cygwin\bin\bash.exe --login > * cd /cygdrive/f/users/user1/petsc-3.4.0 > * Configure with: ./configure --with-debugging=no --with-cc='win32fe cl > -O2' --with-fc=0 --with-cxx='win32fe cl -O2' --download-f2cblaslapack=1 > --download-blacs=1 --download-mups=1 --with-clanguage=cxx > --with-scalar-type=complex --with-precision=double > --with-mpi-dir='/cygdrive/c/Program Files/MPICH2' > > After these steps configuration process ends with a status number 256 and > a message "falling back to legacy build". > > Of course I tried to run make, as suggested at the end of the script "make > PETSC_DIR=/cygdrive/f/users/user1/petsc-3.4.0 PETSC_ARCH=arch-mswin-cxx-opt > all" but this ended up failing. 
> > Please find attached the configure.log and the make.log files.
> >
> > Thanks in advance. Keep up the great work!
>

1) Please send messages with logs to petsc-maint at mcs.anl.gov so everyone
does not get huge attachments

2) You have a buggy compiler

libfast in: /cygdrive/f/users/user1/petsc-3.4.0/src/mat/impls/baij/seq
baij.c
baij2.c
baijfact.c
baijfact2.c
f:\users\user1\petsc-3.4.0\src\mat\impls\baij\seq\baijfact2.c(2941) : fatal error C1001: An internal error has occurred in the compiler.
(compiler file 'f:\dd\vctools\compiler\utc\src\p2\main.c', line 211)
To work around this problem, try simplifying or changing the program near the locations listed above.
Please choose the Technical Support command on the Visual C++ Help menu, or open the Technical Support help file for more information
INTERNAL COMPILER ERROR in 'C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\BIN\amd64\cl.exe'
Please choose the Technical Support command on the Visual C++ Help menu, or open the Technical Support help file for more information
Microsoft (R) Library Manager Version 11.00.50727.1
Copyright (C) Microsoft Corporation. All rights reserved.
baijfact2.o : fatal error LNK1136: invalid or corrupt file

I recommend either upgrading to the latest version, or if that does not
work, backing off the optimization (you have -O2).

  Thanks,

     Matt

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From balay at mcs.anl.gov  Sat May 25 21:33:27 2013
From: balay at mcs.anl.gov (Satish Balay)
Date: Sat, 25 May 2013 21:33:27 -0500 (CDT)
Subject: [petsc-users] Petsc configuration failure in windows 7 x64
In-Reply-To:
References:
Message-ID:

On Sat, 25 May 2013, Matthew Knepley wrote:

> On Sat, May 25, 2013 at 6:27 AM, Agnostic Noname wrote:
>
> 1) Please send messages with logs to petsc-maint at mcs.anl.gov so everyone
> does not get huge attachments

It's now acceptable to send build logs to petsc-users. petsc-users is
now supported with public archives - and petsc-maint is, as usual,
private communication.

We should be doing automatic compression of attachments [which we
haven't figured out]. Currently the limit is set at 5MB - so if the
e-mail comes through - then it's acceptable. The alternative is to compress
and send logs [or we could generate logs in a compressed, ready-to-send
tarball]

Satish

From ztdepyahoo at 163.com  Sat May 25 23:01:05 2013
From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=)
Date: Sun, 26 May 2013 12:01:05 +0800 (CST)
Subject: [petsc-users] How to write the dmma mesh into tecplot format
Message-ID:

I write a code for the solution of a heat conduction problem with DMDA.
I now want to write the output file in Tecplot format from the root CPU.
How do I gather the coordinate and field information onto the root process?

Regards
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From u1028451 at gmail.com  Sun May 26 04:37:22 2013
From: u1028451 at gmail.com (Agnostic Noname)
Date: Sun, 26 May 2013 12:37:22 +0300
Subject: [petsc-users] Petsc configuration failure in windows 7 x64
In-Reply-To:
References:
Message-ID:

Dear Matt and Satish, thank you for the immediate reply.

I tried looking around for any updates to VS2012 but I didn't find any.
However, adding #include to src\mat\impls\baij\seq\baijfact2.c did the trick.
Although the make test failed, the petsc libs generated were libf2cblas.lib,
libf2clapack.lib and libpetsc.lib, so I think it should be ok. Hopefully I
will manage to link them to my application. I know it's an ugly hack, but
most of my colleagues that will try to compile petsc will be using VS2012.
I don't know the reason for this error, since on Mac and Linux petsc works
like a charm.

Thanks again for your help. It's greatly appreciated.

On Sun, May 26, 2013 at 5:33 AM, Satish Balay wrote:

> On Sat, 25 May 2013, Matthew Knepley wrote:
>
> > On Sat, May 25, 2013 at 6:27 AM, Agnostic Noname wrote:
> >
> > 1) Please send messages with logs to petsc-maint at mcs.anl.gov so everyone
> > does not get huge attachments
>
> It's now acceptable to send build logs to petsc-users. petsc-users is
> now supported with public archives - and petsc-maint is, as usual,
> private communication.
>
> We should be doing automatic compression of attachments [which we
> haven't figured out]. Currently the limit is set at 5MB - so if the
> e-mail comes through - then it's acceptable. The alternative is to compress
> and send logs [or we could generate logs in a compressed, ready-to-send
> tarball]
>
> Satish
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From knepley at gmail.com  Sun May 26 05:10:17 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sun, 26 May 2013 05:10:17 -0500
Subject: [petsc-users] How to write the dmma mesh into tecplot format
In-Reply-To:
References:
Message-ID:

On Sat, May 25, 2013 at 11:01 PM, ??? wrote:

> I write a code for the solution of a heat conduction problem with DMDA.
> I now want to write the output file in Tecplot format from the root CPU.
> How do I gather the coordinate and field information onto the root process?

Just do VecView() for the solution and coordinates. Then you can write a
small serial program that reads that in and processes it into TecPlot
format.

   Matt

> Regards

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
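A minimal sketch of what Matt suggests, assuming a DMDA `da' with solution vector `u', coordinates already set (e.g. via DMDASetUniformCoordinates()), and a hypothetical output file name:

    #include <petscdmda.h>

    /* Dump solution and coordinates in PETSc binary format; a small serial
       post-processor can then read them back and emit a Tecplot file. */
    static PetscErrorCode DumpForTecplot(DM da, Vec u)
    {
      PetscViewer    viewer;
      Vec            coords;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "solution.bin", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
      ierr = VecView(u, viewer);CHKERRQ(ierr);
      ierr = DMGetCoordinates(da, &coords);CHKERRQ(ierr);
      ierr = VecView(coords, viewer);CHKERRQ(ierr);
      ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }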
From choi240 at purdue.edu  Sun May 26 06:19:52 2013
From: choi240 at purdue.edu (Joon hee Choi)
Date: Sun, 26 May 2013 07:19:52 -0400 (EDT)
Subject: [petsc-users] Errors from large matrices
In-Reply-To: <554518184.189.1369565250061.JavaMail.root@mailhub028.itcs.purdue.edu>
Message-ID: <857243199.205.1369567192530.JavaMail.root@mailhub028.itcs.purdue.edu>

Hello all,

I need to multiply a large seqaij matrix (X1) by a maij (or baij) matrix (CC).
I set up X1 (size: 4273949 x 108965941330383, nonzeros: 143599552) and
C (size: 25495389 x 10, nonzeros: 254953890), and created a maij matrix CC
from C. However, I got errors such as "out of memory" and "Caught signal
number 11 SEGV: Segmentation Violation, probably memory access out of range".
Is this a memory problem? Do I have to change seqaij into mpiaij and use
multiple processes, or is there another way of fixing it? If you know the
method, then please let me know. Thank you.

Joon

Code:
...
ierr = MatCreate(PETSC_COMM_SELF, &X1); CHKERRQ(ierr);
ierr = MatSetSizes(X1, PETSC_DECIDE, PETSC_DECIDE, I, J*K); CHKERRQ(ierr);
ierr = MatSetBlockSizes(X1, I, J); CHKERRQ(ierr);
ierr = MatSetType(X1, MATSEQAIJ); CHKERRQ(ierr);
ierr = MatSeqAIJSetPreallocation(X1, 0, nnz); CHKERRQ(ierr);

for (int x=0; x<...; x++) {   /* upper bound lost when the HTML mail was scrubbed */
    i   = std::tr1::get<0>(tups[x]);
    j   = std::tr1::get<2>(tups[x]) + std::tr1::get<1>(tups[x])*J;
    val = std::tr1::get<3>(tups[x]);
    ierr = MatSetValues(X1, 1, &i, 1, &j, &val, INSERT_VALUES); CHKERRQ(ierr);
}
ierr = MatAssemblyBegin(X1, MAT_FINAL_ASSEMBLY);
ierr = MatAssemblyEnd(X1, MAT_FINAL_ASSEMBLY);
ierr = PetscGetTime(&v1); CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD, "Setup Time: %2.1e \n", v1-v); CHKERRQ(ierr);

// Create a matrix C (K x R) with all values 1
ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, K, R, R, NULL, &C); CHKERRQ(ierr);
for (k=0; k<K; k++) {
    for (r=0; r<R; r++) {
        ierr = MatSetValues(C, 1, &k, 1, &r, &one, INSERT_VALUES); CHKERRQ(ierr);
    }
}
ierr = MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
ierr = MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

ierr = MatCreateMAIJ(C, J, &MC); CHKERRQ(ierr);
ierr = MatConvert(MC, MATBAIJ, MAT_INITIAL_MATRIX, &CC); CHKERRQ(ierr);
ierr = MatMatMult(X1, CC, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &M); CHKERRQ(ierr);
...

References: <857243199.205.1369567192530.JavaMail.root@mailhub028.itcs.purdue.edu>
Message-ID:

On May 26, 2013, at 6:19 AM, Joon hee Choi wrote:
> > > Results and Errors with -info -mat-view-info: > [0] PetscInitialize(): PETSc successfully started: number of processors = 1 > [0] PetscInitialize(): Running on machine: rossmann-fe02.rcac.purdue.edu > [0] PetscFOpen(): Opening file /group/ml/data/tensor/nell/sparse.large.txt > [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374784 max tags = 2147483647 > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4273949 X 108965941330383; storage space: 83847 unneeded,143599552 used > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 504677 > [0] Mat_CheckInode(): Found 3499069 nodes out of 4273949 rows. Not using Inode routines > Matrix Object: 1 MPI processes > type: seqaij > rows=4273949, cols=108965941330383, bs=4273949 > total: nonzeros=143599552, allocated nonzeros=143683399 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 25495389 X 10; storage space: 0 unneeded,254953890 used > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 10 > [0] Mat_CheckInode(): Found 5099078 nodes of 25495389. Limit used: 5. Using Inode routines > Matrix Object: 1 MPI processes > type: seqaij > rows=25495389, cols=10 > total: nonzeros=254953890, allocated nonzeros=254953890 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 5099078 nodes, limit used is 5 > Matrix Object: 1 MPI processes > type: seqmaij > rows=108965941330383, cols=42739470, bs=4273947 > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Out of memory. This could be due to allocating > [0]PETSC ERROR: too large an object or bleeding by not properly > [0]PETSC ERROR: destroying unneeded objects. > [0]PETSC ERROR: Memory allocated 0 Memory used by process 11980140544 > [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. > [0]PETSC ERROR: Memory requested 871727530643064! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 6, Mon Feb 11 12:26:34 CST 2013 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: ./tensor on a linux-sta named rossmann-fe02.rcac.purdue.edu by choi240 Sun May 26 07:13:32 2013 > [0]PETSC ERROR: Libraries linked from /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/linux-static/lib > [0]PETSC ERROR: Configure run at Tue May 21 15:56:45 2013 > [0]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-scalar-type=real --with-shared-libraries=0 --with-pic=1 --with-clanguage=C++ --with-fortran --with-fortran-kernels=1 --with-64-bit-indices=1 --with-debugging=0 --with-blas-lapack-dir=/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --download-hdf5=no --download-metis=no --download-parmetis=no --download-superlu_dist=no --download-mumps=no --download-scalapack=yes --download-blacs=yes --download-hypre=no --download-spooles=no > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: PetscMallocAlign() line 49 in /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/src/sys/memory/mal.c > [0]PETSC ERROR: MatConvert_SeqMAIJ_SeqAIJ() line 3232 in /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/src/mat/impls/maij/maij.c > [0]PETSC ERROR: MatConvert() line 3778 in /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/src/mat/interface/matrix.c > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run > [0]PETSC ERROR: to get more information on the crash. > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Signal received! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 6, Mon Feb 11 12:26:34 CST 2013 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: ./tensor on a linux-sta named rossmann-fe02.rcac.purdue.edu by choi240 Sun May 26 07:13:32 2013 > [0]PETSC ERROR: Libraries linked from /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/linux-static/lib > [0]PETSC ERROR: Configure run at Tue May 21 15:56:45 2013 > [0]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-scalar-type=real --with-shared-libraries=0 --with-pic=1 --with-clanguage=C++ --with-fortran --with-fortran-kernels=1 --with-64-bit-indices=1 --with-debugging=0 --with-blas-lapack-dir=/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --download-hdf5=no --download-metis=no --download-parmetis=no --download-superlu_dist=no --download-mumps=no --download-scalapack=yes --download-blacs=yes --download-hypre=no --download-spooles=no > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 From choi240 at purdue.edu Sun May 26 13:10:09 2013 From: choi240 at purdue.edu (Choi240) Date: Sun, 26 May 2013 14:10:09 -0400 (EDT) Subject: [petsc-users] Errors from large matrices Message-ID: <8vk2k6gl3a52oilbn95mtp53.1369591594753@email.android.com> Thank you for your fast reply. However, I was already using 64 bit pointer. -------- Original message -------- Subject: Re: [petsc-users] Errors from large matrices From: Barry Smith To: Joon hee Choi CC: petsc-users at mcs.anl.gov On May 26, 2013, at 6:19 AM, Joon hee Choi wrote: > Hello all, > > I need to multiply a large seqaij matrix(X1) and a maij(or baij) matrix(CC). I set up X1 (size:4273949x108965941330383, nonzeros:143599552) and C (size:25495389x10, nonzeros:254953890) and created a maij matrix CC from C. However, I got errors such as out of memory and Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range. Is this memory problem, and do I have to change seqaij into mpiaij and use multi processors? Or do I have another methods fixing it? If you know the method, then please let me know it. Thank you. http://www.mcs.anl.gov/petsc/documentation/faq.html#with-64-bit-indices > > Joon > > > Code: >? ... >? ierr = MatCreate(PETSC_COMM_SELF, &X1); CHKERRQ(ierr); >? ierr = MatSetSizes(X1, PETSC_DECIDE, PETSC_DECIDE, I, J*K); CHKERRQ(ierr); >? ierr = MatSetBlockSizes(X1, I, J); CHKERRQ(ierr); >? ierr = MatSetType(X1, MATSEQAIJ); CHKERRQ(ierr); >? ierr = MatSeqAIJSetPreallocation(X1, 0, nnz); CHKERRQ(ierr); > >? for (int x=0; x?????? i = std::tr1::get<0>(tups[x]); >?????? j = std::tr1::get<2>(tups[x]) + std::tr1::get<1>(tups[x])*J; >?????? val = std::tr1::get<3>(tups[x]); >?????? ierr = MatSetValues(X1, 1, &i, 1, &j, &val, INSERT_VALUES); CHKERRQ(ierr); >? } >? ierr = MatAssemblyBegin(X1, MAT_FINAL_ASSEMBLY); >? ierr = MatAssemblyEnd(X1, MAT_FINAL_ASSEMBLY); >? ierr = PetscGetTime(&v1); CHKERRQ(ierr); >? ierr = PetscPrintf(PETSC_COMM_WORLD, "Setup Time: %2.1e \n", v1-v); CHKERRQ(ierr); > >? // Create a matrix C (K x R) with all values 1 >? ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, K, R, R, NULL, &C); CHKERRQ(ierr); >? for (k=0; k?????? for (r=0; r??????????? ierr = MatSetValues(C, 1, &k, 1, &r, &one, INSERT_VALUES); CHKERRQ(ierr); >?????? } >? } >? ierr = MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); >? 
ierr = MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > >? ierr = M...
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com Sun May 26 13:27:01 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Sun, 26 May 2013 14:27:01 -0400
Subject: Re: [petsc-users] Errors from large matrices
In-Reply-To: <8vk2k6gl3a52oilbn95mtp53.1369591594753@email.android.com>
References: <8vk2k6gl3a52oilbn95mtp53.1369591594753@email.android.com>
Message-ID: 

On Sun, May 26, 2013 at 2:10 PM, Choi240 wrote:

> Thank you for your fast reply. However, I was already using 64 bit pointer.
>
Then it sounds like you are really out of memory. Also, make sure you are checking all error codes with CHKERRQ so you do not get follow-on errors.

   Matt

> > -------- Original message -------- > Subject: Re: [petsc-users] Errors from large matrices > From: Barry Smith > To: Joon hee Choi > CC: petsc-users at mcs.anl.gov > > > > On May 26, 2013, at 6:19 AM, Joon hee Choi wrote: > > > Hello all, > > > > I need to multiply a large seqaij matrix(X1) and a maij(or baij) matrix(CC). I set up X1 (size:4273949x108965941330383, nonzeros:143599552) and C (size:25495389x10, nonzeros:254953890) and created a maij matrix CC from C. However, I got errors such as out of memory and Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range. Is this memory problem, and do I have to change seqaij into mpiaij and use multi processors? Or do I have another methods fixing it? If you know the method, then please let me know it. Thank you. > > http://www.mcs.anl.gov/petsc/documentation/faq.html#with-64-bit-indices > > > > Joon > > > > > > Code: > > ... > > ierr = MatCreate(PETSC_COMM_SELF, &X1); CHKERRQ(ierr); > > ierr = MatSetSizes(X1, PETSC_DECIDE, PETSC_DECIDE, I, J*K); CHKERRQ(ierr); > > ierr = MatSetBlockSizes(X1, I, J); CHKERRQ(ierr); > > ierr = MatSetType(X1, MATSEQAIJ); CHKERRQ(ierr); > > ierr = MatSeqAIJSetPreallocation(X1, 0, nnz); CHKERRQ(ierr); > > > > for (int x=0; x > > i = std::tr1::get<0>(tups[x]); > > j = std::tr1::get<2>(tups[x]) + std::tr1::get<1>(tups[x])*J; > > val = std::tr1::get<3>(tups[x]); > > ierr = MatSetValues(X1, 1, &i, 1, &j, &val, INSERT_VALUES); CHKERRQ(ierr); > > } > > ierr = MatAssemblyBegin(X1, MAT_FINAL_ASSEMBLY); > > ierr = MatAssemblyEnd(X1, MAT_FINAL_ASSEMBLY); > > ierr = PetscGetTime(&v1); CHKERRQ(ierr); > > ierr = PetscPrintf(PETSC_COMM_WORLD, "Setup Time: %2.1e \n", v1-v); CHKERRQ(ierr); > > > > // Create a matrix C (K x R) with all values 1 > > ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, K, R, R, NULL, &C); CHKERRQ(ierr); > > for (k=0; k > for (r=0; r > ierr = MatSetValues(C, 1, &k, 1, &r, &one, INSERT_VALUES); CHKERRQ(ierr); > > } > > } > > ierr = MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > > ierr = MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > > > > ierr = M...

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From choi240 at purdue.edu Sun May 26 15:44:34 2013
From: choi240 at purdue.edu (Joon hee Choi)
Date: Sun, 26 May 2013 16:44:34 -0400 (EDT)
Subject: Re: [petsc-users] Errors from large matrices
In-Reply-To: 
Message-ID: <1843255333.598.1369601074413.JavaMail.root@mailhub028.itcs.purdue.edu>

Thank you for your fast reply.
I was already checking all error codes with CHKERRQ and -info -mat-view-info, but I got those errors from "ierr = MatConvert(MC, MATBAIJ, MAT_INITIAL_MATRIX, &CC); CHKERRQ(ierr);". Then, what can I do? Sincerely, Joon ----- Original Message ----- From: "Matthew Knepley" To: "Choi240" Cc: "Barry Smith" , petsc-users at mcs.anl.gov Sent: Sunday, May 26, 2013 2:27:01 PM Subject: Re: [petsc-users] Errors from large matrices On Sun, May 26, 2013 at 2:10 PM, Choi240 < choi240 at purdue.edu > wrote: Thank you for your fast reply. However, I was already using 64 bit pointer. Then it sounds like you ar really out of memory. Also, make sure you are checking all error codes with CHKERRQ so you do not get follow-on errors. Matt -------- Original message -------- Subject: Re: [petsc-users] Errors from large matrices From: Barry Smith < bsmith at mcs.anl.gov > To: Joon hee Choi < choi240 at purdue.edu > CC: petsc-users at mcs.anl.gov On May 26, 2013, at 6:19 AM, Joon hee Choi < choi240 at purdue.edu > wrote: > Hello all, > > I need to multiply a large seqaij matrix(X1) and a maij(or baij) matrix(CC). I set up X1 (size:4273949x108965941330383, nonzeros:143599552) and C (size:25495389x10, nonzeros:254953890) and created a maij matrix CC from C. However, I got errors such as out of memory and Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range. Is this memory problem, and do I have to change seqaij into mpiaij and use multi processors? Or do I have another methods fixing it? If you know the method, then please let me know it. Thank you. http://www.mcs.anl.gov/petsc/documentation/faq.html#with-64-bit-indices > > Joon > > > Code: > ... > ierr = MatCreate(PETSC_COMM_SELF, &X1); CHKERRQ(ierr); > ierr = MatSetSizes(X1, PETSC_DECIDE, PETSC_DECIDE, I, J*K); CHKERRQ(ierr); > ierr = MatSetBlockSizes(X1, I, J); CHKERRQ(ierr); > ierr = MatSetType(X1, MATSEQAIJ); CHKERRQ(ierr); > ierr = MatSeqAIJSetPreallocation(X1, 0, nnz); CHKERRQ(ierr); > > for (int x=0; x i = std::tr1::get<0>(tups[x]); > j = std::tr1::get<2>(tups[x]) + std::tr1::get<1>(tups[x])*J; > val = std::tr1::get<3>(tups[x]); > ierr = MatSetValues(X1, 1, &i, 1, &j, &val, INSERT_VALUES); CHKERRQ(ierr); > } > ierr = MatAssemblyBegin(X1, MAT_FINAL_ASSEMBLY); > ierr = MatAssemblyEnd(X1, MAT_FINAL_ASSEMBLY); > ierr = PetscGetTime(&v1); CHKERRQ(ierr); > ierr = PetscPrintf(PETSC_COMM_WORLD, "Setup Time: %2.1e \n", v1-v); CHKERRQ(ierr); > > // Create a matrix C (K x R) with all values 1 > ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, K, R, R, NULL, &C); CHKERRQ(ierr); > for (k=0; k for (r=0; r ierr = MatSetValues(C, 1, &k, 1, &r, &one, INSERT_VALUES); CHKERRQ(ierr); > } > } > ierr = MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > ierr = MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > > ierr = M... -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener From bsmith at mcs.anl.gov Sun May 26 18:12:11 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 26 May 2013 18:12:11 -0500 Subject: [petsc-users] Errors from large matrices In-Reply-To: <8vk2k6gl3a52oilbn95mtp53.1369591594753@email.android.com> References: <8vk2k6gl3a52oilbn95mtp53.1369591594753@email.android.com> Message-ID: On May 26, 2013, at 1:10 PM, Choi240 wrote: > Thank you for your fast reply. However, I was already using 64 bit pointer. 64 bit pointers are very very different than --with-64-bit-indices! 
With 64 bit pointers integers are still 32 bits which means one cannot have an integer number greater than about 2 billion. So if your matrix dimension is more than about 2 billion or you have over 2 billion non zeros (sequential) then you need to configure PETSc with --with-64-bit-indices. Based on your statement size:4273949x108965941330383 you need to configure with --with-64-bit-indices. Now did you configure with 64 bit indices? Barry > > > -------- Original message -------- > Subject: Re: [petsc-users] Errors from large matrices > From: Barry Smith > To: Joon hee Choi > CC: petsc-users at mcs.anl.gov > > > > On May 26, 2013, at 6:19 AM, Joon hee Choi wrote: > > > Hello all, > > > > I need to multiply a large seqaij matrix(X1) and a maij(or baij) matrix(CC). I set up X1 (size:4273949x108965941330383, nonzeros:143599552) and C (size:25495389x10, nonzeros:254953890) and created a maij matrix CC from C. However, I got errors such as out of memory and Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range. Is this memory problem, and do I have to change seqaij into mpiaij and use multi processors? Or do I have another methods fixing it? If you know the method, then please let me know it. Thank you. > > http://www.mcs.anl.gov/petsc/documentation/faq.html#with-64-bit-indices > > > > > > Joon > > > > > > Code: > > ... > > ierr = MatCreate(PETSC_COMM_SELF, &X1); CHKERRQ(ierr); > > ierr = MatSetSizes(X1, PETSC_DECIDE, PETSC_DECIDE, I, J*K); CHKERRQ(ierr); > > ierr = MatSetBlockSizes(X1, I, J); CHKERRQ(ierr); > > ierr = MatSetType(X1, MATSEQAIJ); CHKERRQ(ierr); > > ierr = MatSeqAIJSetPreallocation(X1, 0, nnz); CHKERRQ(ierr); > > > > for (int x=0; x > i = std::tr1::get<0>(tups[x]); > > j = std::tr1::get<2>(tups[x]) + std::tr1::get<1>(tups[x])*J; > > val = std::tr1::get<3>(tups[x]); > > ierr = MatSetValues(X1, 1, &i, 1, &j, &val, INSERT_VALUES); CHKERRQ(ierr); > > } > > ierr = MatAssemblyBegin(X1, MAT_FINAL_ASSEMBLY); > > ierr = MatAssemblyEnd(X1, MAT_FINAL_ASSEMBLY); > > ierr = PetscGetTime(&v1); CHKERRQ(ierr); > > ierr = PetscPrintf(PETSC_COMM_WORLD, "Setup Time: %2.1e \n", v1-v); CHKERRQ(ierr); > > > > // Create a matrix C (K x R) with all values 1 > > ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, K, R, R, NULL, &C); CHKERRQ(ierr); > > for (k=0; k > for (r=0; r > ierr = MatSetValues(C, 1, &k, 1, &r, &one, INSERT_VALUES); CHKERRQ(ierr); > > } > > } > > ierr = MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > > ierr = MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > > > > ierr = M... From choi240 at purdue.edu Sun May 26 19:23:40 2013 From: choi240 at purdue.edu (Joon hee Choi) Date: Sun, 26 May 2013 20:23:40 -0400 (EDT) Subject: [petsc-users] Errors from large matrices In-Reply-To: Message-ID: <1728308087.740.1369614220334.JavaMail.root@mailhub028.itcs.purdue.edu> Thank you very much. I was confused between 64 bit pointers and 64 bit indices. The module I am using supports 64 bit indices. I succeeded in setting up the matrix with 4273949 x 108965941330383 size (non-zeros: 143599552), but failed to set up the maij matrix with 108965941330383 x 42739470 size and 4273947 block size using MatCreateMAIJ(Mat A, PetscInt dof, Mat *maij). The number of non-zeros are 25495389 x 10 x 4273947 = 1.1 x 10^15. Do I have the method to set up the matrix successfully? 
Thank you, Joon ----- Original Message ----- From: "Barry Smith" To: "Choi240" Cc: petsc-users at mcs.anl.gov Sent: Sunday, May 26, 2013 7:12:11 PM Subject: Re: [petsc-users] Errors from large matrices On May 26, 2013, at 1:10 PM, Choi240 wrote: > Thank you for your fast reply. However, I was already using 64 bit pointer. 64 bit pointers are very very different than --with-64-bit-indices! With 64 bit pointers integers are still 32 bits which means one cannot have an integer number greater than about 2 billion. So if your matrix dimension is more than about 2 billion or you have over 2 billion non zeros (sequential) then you need to configure PETSc with --with-64-bit-indices. Based on your statement size:4273949x108965941330383 you need to configure with --with-64-bit-indices. Now did you configure with 64 bit indices? Barry > > > -------- Original message -------- > Subject: Re: [petsc-users] Errors from large matrices > From: Barry Smith > To: Joon hee Choi > CC: petsc-users at mcs.anl.gov > > > > On May 26, 2013, at 6:19 AM, Joon hee Choi wrote: > > > Hello all, > > > > I need to multiply a large seqaij matrix(X1) and a maij(or baij) matrix(CC). I set up X1 (size:4273949x108965941330383, nonzeros:143599552) and C (size:25495389x10, nonzeros:254953890) and created a maij matrix CC from C. However, I got errors such as out of memory and Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range. Is this memory problem, and do I have to change seqaij into mpiaij and use multi processors? Or do I have another methods fixing it? If you know the method, then please let me know it. Thank you. > > http://www.mcs.anl.gov/petsc/documentation/faq.html#with-64-bit-indices > > > > > > Joon > > > > > > Code: > > ... > > ierr = MatCreate(PETSC_COMM_SELF, &X1); CHKERRQ(ierr); > > ierr = MatSetSizes(X1, PETSC_DECIDE, PETSC_DECIDE, I, J*K); CHKERRQ(ierr); > > ierr = MatSetBlockSizes(X1, I, J); CHKERRQ(ierr); > > ierr = MatSetType(X1, MATSEQAIJ); CHKERRQ(ierr); > > ierr = MatSeqAIJSetPreallocation(X1, 0, nnz); CHKERRQ(ierr); > > > > for (int x=0; x > i = std::tr1::get<0>(tups[x]); > > j = std::tr1::get<2>(tups[x]) + std::tr1::get<1>(tups[x])*J; > > val = std::tr1::get<3>(tups[x]); > > ierr = MatSetValues(X1, 1, &i, 1, &j, &val, INSERT_VALUES); CHKERRQ(ierr); > > } > > ierr = MatAssemblyBegin(X1, MAT_FINAL_ASSEMBLY); > > ierr = MatAssemblyEnd(X1, MAT_FINAL_ASSEMBLY); > > ierr = PetscGetTime(&v1); CHKERRQ(ierr); > > ierr = PetscPrintf(PETSC_COMM_WORLD, "Setup Time: %2.1e \n", v1-v); CHKERRQ(ierr); > > > > // Create a matrix C (K x R) with all values 1 > > ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, K, R, R, NULL, &C); CHKERRQ(ierr); > > for (k=0; k > for (r=0; r > ierr = MatSetValues(C, 1, &k, 1, &r, &one, INSERT_VALUES); CHKERRQ(ierr); > > } > > } > > ierr = MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > > ierr = MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr); > > > > ierr = M... From jedbrown at mcs.anl.gov Sun May 26 19:39:25 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Sun, 26 May 2013 19:39:25 -0500 Subject: [petsc-users] Errors from large matrices In-Reply-To: <1728308087.740.1369614220334.JavaMail.root@mailhub028.itcs.purdue.edu> References: <1728308087.740.1369614220334.JavaMail.root@mailhub028.itcs.purdue.edu> Message-ID: <87ppwdl0hu.fsf@mcs.anl.gov> Joon hee Choi writes: > Thank you very much. I was confused between 64 bit pointers and 64 bit > indices. The module I am using supports 64 bit indices. 
I succeeded in > setting up the matrix with 4273949 x 108965941330383 size (non-zeros: > 143599552), but failed to set up the maij matrix with 108965941330383 > x 42739470 size and 4273947 block size using MatCreateMAIJ(Mat A, > PetscInt dof, Mat *maij). The number of non-zeros are 25495389 x 10 x > 4273947 = 1.1 x 10^15. Do I have the method to set up the matrix > successfully?

What do you want to do with a matrix with this enormous number of columns? PETSc matrix data structures are not intended to be used in this way. You should reduce the dimension of the column space so that the corresponding dense vector can be stored.

From choi240 at purdue.edu Sun May 26 20:36:05 2013
From: choi240 at purdue.edu (Choi240)
Date: Sun, 26 May 2013 21:36:05 -0400 (EDT)
Subject: [petsc-users] Errors from large matrices
Message-ID: 

Dear Jed,

I wanted to multiply those large matrices. I thought it would be fast because I could get the results with just one multiplication. I think I was wrong, so I will instead compute the result by multiplying the matrix with each vector. Thank you very much.

Joon

-------- Original message --------
Subject: Re: [petsc-users] Errors from large matrices
From: Jed Brown 
To: Joon hee Choi ,Barry Smith 
CC: petsc-users at mcs.anl.gov

Joon hee Choi writes:
> Thank you very much. I was confused between 64 bit pointers and 64 bit
> indices. The module I am using supports 64 bit indices. I succeeded in
> setting up the matrix with 4273949 x 108965941330383 size (non-zeros:
> 143599552), but failed to set up the maij matrix with 108965941330383
> x 42739470 size and 4273947 block size using MatCreateMAIJ(Mat A,
> PetscInt dof, Mat *maij). The number of non-zeros are 25495389 x 10 x
> 4273947 = 1.1 x 10^15. Do I have the method to set up the matrix
> successfully?

What do you want to do with a matrix with this enormous number of columns? PETSc matrix data structures are not intended to be used in this way. You should reduce the dimension of the column space so that the corresponding dense vector can be stored.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ztdepyahoo at 163.com Sun May 26 23:18:36 2013
From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=)
Date: Mon, 27 May 2013 12:18:36 +0800 (CST)
Subject: [petsc-users] Some confusion about the vec tutorial ex8.c
Message-ID: <2ab9e07c.867e.13ee4359477.Coremail.ztdepyahoo@163.com>

In ex8.c, ISLocalToGlobalMappingCreate uses "ng" as the number of local elements. 'ng' is the local size plus 2 ghost positions. But we create the vector x with VecCreate, and x does not have ghost points. How do we explain this mapping?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com Mon May 27 04:40:54 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 27 May 2013 05:40:54 -0400
Subject: Re: [petsc-users] Some confusion about the vec tutorial ex8.c
In-Reply-To: <2ab9e07c.867e.13ee4359477.Coremail.ztdepyahoo@163.com>
References: <2ab9e07c.867e.13ee4359477.Coremail.ztdepyahoo@163.com>
Message-ID: 

On Mon, May 27, 2013 at 12:18 AM, ??? wrote:

> In ex8.c, ISLocalToGlobalMappingCreate uses "ng" as the number of
> local elements. 'ng' is the local size plus 2 ghost positions. But we create
> the vector x with VecCreate, and x does not have ghost points.
> How do we explain this mapping?
> 1) Global vectors have no ghost points 2) We declare that this vector will have 2 ghost points for the example Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.prabhakaran at grs-sim.de Mon May 27 09:03:53 2013 From: s.prabhakaran at grs-sim.de (Suraj Prabhakaran) Date: Mon, 27 May 2013 16:03:53 +0200 Subject: [petsc-users] MPI_Comm_spawn when using PETSc Message-ID: <61654236-C81A-4128-AEA8-C3017D602F63@grs-sim.de> Dear all, I am completely new to PETSc. I would like to know if it is possible to use PETSc after increasing the number of processes in an MPI program through MPI_Comm_spawn. The application is started with a set of MPI processes and the PETSC is initialized with this set. At a later point, the application could indeed increase the set of processes (through MPI_Comm_spawn) and create a intra communicator with the parent and the children. I would like to use these processes too with PETSC. Would this be possible (by providing the new communicator for instance)? Thanks! Regards, Suraj From jedbrown at mcs.anl.gov Mon May 27 09:19:50 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Mon, 27 May 2013 09:19:50 -0500 Subject: [petsc-users] MPI_Comm_spawn when using PETSc In-Reply-To: <61654236-C81A-4128-AEA8-C3017D602F63@grs-sim.de> References: <61654236-C81A-4128-AEA8-C3017D602F63@grs-sim.de> Message-ID: <8738t8ld2x.fsf@mcs.anl.gov> Suraj Prabhakaran writes: > Dear all, > > I am completely new to PETSc. I would like to know if it is possible > to use PETSc after increasing the number of processes in an MPI > program through MPI_Comm_spawn. The application is started with a set > of MPI processes and the PETSC is initialized with this set. At a > later point, the application could indeed increase the set of > processes (through MPI_Comm_spawn) and create a intra communicator > with the parent and the children. I would like to use these processes > too with PETSC. Would this be possible (by providing the new > communicator for instance)? The logging and profiling features will need PETSC_COMM_WORLD to contain all processes. So for that to work properly after spawning new processes, you would call PetscFinalize on the original comm, then set PETSC_COMM_WORLD = new_comm, then PetscInitialize again. It's possible that you can just call PetscInitialize on the spawned comm containing only the new members, then create an object on the union. I'm reluctant to recommend this, and it's not tested in the present test suite, but I can't think of anything that will truly break. If you wanted logging information in that setting, you should send it to different files so that it's doesn't arrive all jumbled up on stdout. From knepley at gmail.com Mon May 27 09:26:06 2013 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 27 May 2013 10:26:06 -0400 Subject: [petsc-users] MPI_Comm_spawn when using PETSc In-Reply-To: <61654236-C81A-4128-AEA8-C3017D602F63@grs-sim.de> References: <61654236-C81A-4128-AEA8-C3017D602F63@grs-sim.de> Message-ID: On Mon, May 27, 2013 at 10:03 AM, Suraj Prabhakaran < s.prabhakaran at grs-sim.de> wrote: > Dear all, > > I am completely new to PETSc. I would like to know if it is possible to > use PETSc after increasing the number of processes in an MPI program > through MPI_Comm_spawn. 
> The application is started with a set of MPI processes and the PETSC is > initialized with this set. At a later point, the application could indeed > increase the set of processes (through MPI_Comm_spawn) and create a intra > communicator with the parent and the children. I would like to use these > processes too with PETSC. Would this be possible (by providing the new > communicator for instance)?
>
You can give a PetscObject any communicator, however PETSC_COMM_WORLD is used for a bunch of global stuff, like logging, so you may get unexpected behavior.

   Matt

> Thanks!
> Regards,
> Suraj
>
-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bsmith at mcs.anl.gov Mon May 27 18:37:10 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Mon, 27 May 2013 18:37:10 -0500
Subject: Re: [petsc-users] MPI_Comm_spawn when using PETSc
In-Reply-To: 
References: <61654236-C81A-4128-AEA8-C3017D602F63@grs-sim.de> 
Message-ID: 

   I have added a new issue at https://bitbucket.org/petsc/petsc/issue/43/add-petscreinitialize to list this as a future enhancement to PETSc.

   Barry

On May 27, 2013, at 9:26 AM, Matthew Knepley wrote:

> On Mon, May 27, 2013 at 10:03 AM, Suraj Prabhakaran wrote:
> Dear all,
> 
> I am completely new to PETSc. I would like to know if it is possible to use PETSc after increasing the number of processes in an MPI program through MPI_Comm_spawn.
> The application is started with a set of MPI processes and the PETSC is initialized with this set. At a later point, the application could indeed increase the set of processes (through MPI_Comm_spawn) and create a intra communicator with the parent and the children. I would like to use these processes too with PETSC. Would this be possible (by providing the new communicator for instance)?
> 
> You can give a PetscObject any communicator, however PETSC_COMM_WORLD is used for a bunch of global stuff,
> like logging, so you may get unexpected behavior.
> 
>    Matt
> 
> Thanks!
> Regards,
> Suraj

From fande.kong at colorado.edu Mon May 27 22:22:58 2013
From: fande.kong at colorado.edu (Fande Kong)
Date: Tue, 28 May 2013 11:22:58 +0800
Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc?
Message-ID: 

Hi all,

How can I measure the memory usage of an application built on PETSc? I am now solving linear elasticity equations with fgmres preconditioned by a two-level method, that is, preconditioned by a multigrid method where on each level the additive Schwarz method is adopted. More than 1000 cores are adopted to solve this problem on the supercomputer. When the total freedom of the problem is about 60M, the application runs correctly and produces correct results. But when the total freedom increases to 600M, the application aborts and says there is not enough memory (the system administrator of the supercomputer told me that my application ran out of memory).

Thus, I want to monitor the memory usage dynamically while the application is running. Are there any functions or strategies that could be used for this purpose?

The error information is attached.
Regards,
-- 
Fande Kong
Department of Computer Science
University of Colorado at Boulder
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: solid3dcube2.o1603352
Type: application/octet-stream
Size: 103682 bytes
Desc: not available
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure and make log.zip
Type: application/zip
Size: 526635 bytes
Desc: not available
URL: 

From bsmith at mcs.anl.gov Mon May 27 22:48:48 2013
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Mon, 27 May 2013 22:48:48 -0500
Subject: Re: [petsc-users] How to measure the memory usage of the application built on the Petsc?
In-Reply-To: 
References: 
Message-ID: 

   There are several ways to monitor the memory usage. You can divide them into two categories: those that monitor how much memory has been malloced specifically by PETSc and those that monitor how much is used in total by the process.

   PetscMallocGetCurrentUsage() and PetscMallocGetMaximumUsage(), which only work with the command line option -malloc, report how much PETSc has malloced.

   PetscMemoryGetCurrentUsage() and PetscMemoryGetMaximumUsage() (call PetscMemorySetGetMaximumUsage() immediately after PetscInitialize() for this one to work) report total memory usage.

   These are called on each process, so use an MPI_Reduce() to gather the total memory across all processes to process 0 to print it out. Suggest calling them after the mesh has been set up, then again immediately before the XXXSolve() is called, and then after the XXXSolve() is called.

   Please let us know if you have any difficulties.

   As always we recommend you upgrade to PETSc 3.4

   Barry

On May 27, 2013, at 10:22 PM, Fande Kong wrote:

> Hi all,
> 
> How to measure the memory usage of the application built on the Petsc? I am now solving linear elasticity equations with fgmres preconditioned by two-level method, that is, preconditioned by multigrid method where on each level the additive Schwarz method is adopted. More than 1000 cores are adopted to solve this problem on the supercomputer. When the total freedom of the problem is about 60M, the application correctly run and produce correct results. But when the total freedom increases to 600M, the application abort and say there is not enough memory ( the system administrator of the supercomputer told me that my application run out memory).
> 
> Thus, I want to monitor the memory usage dynamically when the application running. Are there any functions or strategies that could be used for this purpose?
> 
> The error information is attached.
> 
> Regards,
> -- 
> Fande Kong
> Department of Computer Science
> University of Colorado at Boulder

From Fande.Kong at Colorado.EDU Tue May 28 04:54:07 2013
From: Fande.Kong at Colorado.EDU (Fande Kong)
Date: Tue, 28 May 2013 03:54:07 -0600
Subject: Re: [petsc-users] How to measure the memory usage of the application built on the Petsc?
In-Reply-To: 
References: 
Message-ID: 

Hi Smith,

Thank you very much. According to your suggestions and information, I added these functions into my code to measure the memory usage. Now I am confused, since this small problem needs a large amount of memory.

I added the function PetscMemorySetGetMaximumUsage() immediately after PetscInitialize().
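[For illustration, a minimal sketch of the measurement pattern Barry suggests above; this is not code from the thread, and the variable names and MPI_Reduce layout are assumptions:]

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscLogDouble current, maximum, totalcurrent, maxmaximum;
  PetscMPIInt    rank;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* must come immediately after PetscInitialize() so that
     PetscMemoryGetMaximumUsage() tracks the high-water mark */
  ierr = PetscMemorySetGetMaximumUsage(); CHKERRQ(ierr);

  /* ... set up the mesh, KSPSetUp(), KSPSolve() would go here ... */

  ierr = PetscMemoryGetCurrentUsage(&current); CHKERRQ(ierr);
  ierr = PetscMemoryGetMaximumUsage(&maximum); CHKERRQ(ierr);
  /* PetscLogDouble is a double, so MPI_DOUBLE is the matching datatype */
  ierr = MPI_Reduce(&current, &totalcurrent, 1, MPI_DOUBLE, MPI_SUM, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
  ierr = MPI_Reduce(&maximum, &maxmaximum, 1, MPI_DOUBLE, MPI_MAX, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
  if (!rank) {
    printf("total current usage %g M, largest per-process maximum %g M\n",
           totalcurrent/(1024.0*1024.0), maxmaximum/(1024.0*1024.0));
  }
  ierr = PetscFinalize();
  return 0;
}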
And then I added the following code into several positions in the code (before & after setting up the unstructured mesh, before & after KSPSetUp(), before & after KSPSolve(), and after destroying everything):

PetscLogDouble space = 0;
ierr = PetscMallocGetCurrentUsage(&space); CHKERRQ(ierr);
ierr = PetscPrintf(comm, "Current space PetscMalloc()ed %G M\n", space/(1024*1024)); CHKERRQ(ierr);
ierr = PetscMallocGetMaximumUsage(&space); CHKERRQ(ierr);
ierr = PetscPrintf(comm, "Max space PetscMalloced() %G M\n", space/(1024*1024)); CHKERRQ(ierr);
ierr = PetscMemoryGetCurrentUsage(&space); CHKERRQ(ierr);
ierr = PetscPrintf(comm, "Current process memory %G M\n", space/(1024*1024)); CHKERRQ(ierr);
ierr = PetscMemoryGetMaximumUsage(&space); CHKERRQ(ierr);
ierr = PetscPrintf(comm, "Max process memory %G M\n", space/(1024*1024)); CHKERRQ(ierr);

In order to measure the memory usage, I used only one core (mpirun -n 1 ./program) to solve a small problem with 12691 mesh nodes (the freedom is about 12691*3 = 4*10^4). I solve the linear elasticity problem using FGMRES preconditioned by the multigrid method (PCMG). I use all petsc standard routines except that I construct the coarse matrix and the interpolation matrix myself. I used the following run script to set up the solver and preconditioner:

mpirun -n 1 ./linearElasticity -ksp_type fgmres -pc_type mg -pc_mg_levels 2 -pc_mg_cycle_type v -pc_mg_type multiplicative -mg_levels_1_ksp_type richardson -mg_levels_1_ksp_max_it 1 -mg_levels_1_pc_type asm -mg_levels_1_sub_ksp_type preonly -mg_levels_1_sub_pc_type ilu -mg_levels_1_sub_pc_factor_levels 4 -mg_levels_1_sub_pc_factor_mat_ordering_type rcm -mg_coarse_ksp_type cg -mg_coarse_ksp_rtol 0.1 -mg_coarse_ksp_max_it 10 -mg_coarse_pc_type asm -mg_coarse_sub_ksp_type preonly -mg_coarse_sub_pc_type ilu -mg_coarse_sub_pc_factor_levels 2 -mg_coarse_sub_pc_factor_mat_ordering_type rcm -ksp_view -log_summary -pc_mg_log

I got the following results:

(1) before setting up mesh,

Current space PetscMalloc()ed 0.075882 M
Max space PetscMalloced() 0.119675 M
Current process memory 7.83203 M
Max process memory 0 M

(2) after setting up mesh,

Current space PetscMalloc()ed 16.8411 M
Max space PetscMalloced() 22.1353 M
Current process memory 28.4336 M
Max process memory 33.0547 M

(3) before calling KSPSetUp()

Current space PetscMalloc()ed 16.868 M
Max space PetscMalloced() 22.1353 M
Current process memory 28.6914 M
Max process memory 33.0547 M

(4) after calling KSPSetUp()

Current space PetscMalloc()ed 74.3354 M
Max space PetscMalloced() 74.3355 M
Current process memory 85.6953 M
Max process memory 84.9258 M

(5) before calling KSPSolve()

Current space PetscMalloc()ed 74.3354 M
Max space PetscMalloced() 74.3355 M
Current process memory 85.8711 M
Max process memory 84.9258 M

(6) after calling KSPSolve()

Current space PetscMalloc()ed 290.952 M
Max space PetscMalloced() 593.367 M
Current process memory 306.852 M
Max process memory 301.441 M

(7) after destroying everything

Current space PetscMalloc()ed 0.331482 M
Max space PetscMalloced() 593.367 M
Current process memory 67.2539 M
Max process memory 309.137 M

So my question is why (or whether) I need so much memory (306.852 M) for such a small problem (freedom: 4*10^4). Or is this a normal case? Or is the run script I use to set up the solver not reasonable?

Regards,

Fande Kong,

Department of Computer Science
University of Colorado Boulder

On Mon, May 27, 2013 at 9:48 PM, Barry Smith wrote: > > There are several ways to monitor the memory usage.
You can divide them > into two categories: those that monitor how much memory has been malloced > specifically by PETSc and how much is used totally be the process. > > PetscMallocGetCurrentUsage() and PetscMallocGetMaximumUsage() which only > work with the command line option -malloc provide how much PETSc has > malloced. > > PetscMemoryGetCurrentUsage() and PetscMemoryGetMaximumUsage() (call > PetscMemorySetGetMaximumUsage() immediately after PetscInitialize() for > this one to work) provide total memory usage. > > These are called on each process so use a MPI_Reduce() to gather the total > memory across all processes to process 0 to print it out. Suggest calling > it after the mesh as been set up, then call again immediately before the > XXXSolve() is called and then after the XXXSolve() is called. > > Please let us know if you have any difficulties. > > As always we recommend you upgrade to PETSc 3.4 > > Barry > > > > On May 27, 2013, at 10:22 PM, Fande Kong wrote: > > > Hi all, > > > > How to measure the memory usage of the application built on the Petsc? > I am now solving linear elasticity equations with fgmres preconditioned by > two-level method, that is, preconditioned by multigrid method where on each > level the additive Schwarz method is adopted. More than 1000 cores are > adopted to solve this problem on the supercomputer. When the total freedom > of the problem is about 60M, the application correctly run and produce > correct results. But when the total freedom increases to 600M, the > application abort and say there is not enough memory ( the system > administrator of the supercomputer told me that my application run out > memory). > > > > Thus, I want to monitor the memory usage dynamically when the > application running. Are there any functions or strategies that could be > used for this purpose? > > > > The error information is attached. > > > > Regards, > > -- > > Fande Kong > > Department of Computer Science > > University of Colorado at Boulder > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue May 28 05:05:04 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 28 May 2013 06:05:04 -0400 Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc? In-Reply-To: References: Message-ID: On Tue, May 28, 2013 at 5:54 AM, Fande Kong wrote: > Hi Smith, > > Thank you very much. According to your suggestions and information, I > added these functions into my code to measure the memory usage. Now I am > confused, since the small problem needs large memory. > > I added the function PetscMemorySetGetMaximumUsage() immediately after > PetscInitialize(). 
And then I added the following code into several > positions in the code (before & after setting up unstructured mesh, before > & after KSPSetUp(), before & after KSPSolve(), and Destroy all stuffs): > > PetscLogDouble space =0; > ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", > space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", > space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Current process memory %G M\n", > space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Max process memory %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > > In order to measure the memory usage, I just used only one core (mpirun -n > 1 ./program ) to solve a small problem with 12691 mesh nodes (the freedom > is about 12691*3= 4 *10^4 ). I solve the linear elasticity problem by using > FGMRES preconditioned by multigrid method (PCMG). I use all petsc standard > routines except that I construct coarse matrix and interpolation matrix by > myself. I used the following run script to set up solver and preconditioner: > > mpirun -n 1 ./linearElasticity -ksp_type fgmres -pc_type mg > -pc_mg_levels 2 -pc_mg_cycle_type v -pc_mg_type multiplicative > -mg_levels_1_ksp_type richardson -mg_levels_1_ksp_max_it 1 > -mg_levels_1_pc_type asm -mg_levels_1_sub_ksp_type preonly > -mg_levels_1_sub_pc_type ilu -mg_levels_1_sub_pc_factor_levels 4 > -mg_levels_1_sub_pc_factor_mat_ordering_type rcm -mg_coarse_ksp_type cg > -mg_coarse_ksp_rtol 0.1 -mg_coarse_ksp_max_it 10 -mg_coarse_pc_type asm > -mg_coarse_sub_ksp_type preonly -mg_coarse_sub_pc_type ilu > -mg_coarse_sub_pc_factor_levels 2 > -mg_coarse_sub_pc_factor_mat_ordering_type rcm -ksp_view -log_summary > -pc_mg_log > > > I got the following results: > > (1) before setting up mesh, > > Current space PetscMalloc()ed 0.075882 M > Max space PetscMalloced() 0.119675 M > Current process memory 7.83203 M > Max process memory 0 M > > (2) after setting up mesh, > > Current space PetscMalloc()ed 16.8411 M > Max space PetscMalloced() 22.1353 M > Current process memory 28.4336 M > Max process memory 33.0547 M > > (3) before calling KSPSetUp() > > Current space PetscMalloc()ed 16.868 M > Max space PetscMalloced() 22.1353 M > Current process memory 28.6914 M > Max process memory 33.0547 M > > > (4) after calling KSPSetUp() > > Current space PetscMalloc()ed 74.3354 M > Max space PetscMalloced() 74.3355 M > This makes sense. It is 20M for your mesh, 20M for the Krylov space on the fine level, and I am guessing 35M for the Jacobian and the ILU factors. > Current process memory 85.6953 M > Max process memory 84.9258 M > > (5) before calling KSPSolve() > > Current space PetscMalloc()ed 74.3354 M > Max space PetscMalloced() 74.3355 M > Current process memory 85.8711 M > Max process memory 84.9258 M > > (6) after calling KSPSolve() > The question is what was malloc'd here. There is no way we could tell without seeing the code and probably running it. I suggest using http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscMallocDump.html to see what was allocated. The solvers tend not to allocated during the solve, as that is slow. So I would be inclined to check user code first. 
Matt > Current space PetscMalloc()ed 290.952 M > Max space PetscMalloced() 593.367 M > Current process memory 306.852 M > Max process memory 301.441 M > > (7) After destroying all stuffs > > Current space PetscMalloc()ed 0.331482 M > Max space PetscMalloced() 593.367 M > Current process memory 67.2539 M > Max process memory 309.137 M > > > So my question is why/if I need so much memory (306.852 M) for so small > problem (freedom: 4*10^4). Or is it normal case? Or my run script used to > set up solver is not reasonable? > > > Regards, > > Fande Kong, > > Department of Computer Science > University of Colorado Boulder > > > > > > > > > > > On Mon, May 27, 2013 at 9:48 PM, Barry Smith wrote: > >> >> There are several ways to monitor the memory usage. You can divide >> them into two categories: those that monitor how much memory has been >> malloced specifically by PETSc and how much is used totally be the process. >> >> PetscMallocGetCurrentUsage() and PetscMallocGetMaximumUsage() which only >> work with the command line option -malloc provide how much PETSc has >> malloced. >> >> PetscMemoryGetCurrentUsage() and PetscMemoryGetMaximumUsage() (call >> PetscMemorySetGetMaximumUsage() immediately after PetscInitialize() for >> this one to work) provide total memory usage. >> >> These are called on each process so use a MPI_Reduce() to gather the >> total memory across all processes to process 0 to print it out. Suggest >> calling it after the mesh as been set up, then call again immediately >> before the XXXSolve() is called and then after the XXXSolve() is called. >> >> Please let us know if you have any difficulties. >> >> As always we recommend you upgrade to PETSc 3.4 >> >> Barry >> >> >> >> On May 27, 2013, at 10:22 PM, Fande Kong wrote: >> >> > Hi all, >> > >> > How to measure the memory usage of the application built on the Petsc? >> I am now solving linear elasticity equations with fgmres preconditioned by >> two-level method, that is, preconditioned by multigrid method where on each >> level the additive Schwarz method is adopted. More than 1000 cores are >> adopted to solve this problem on the supercomputer. When the total freedom >> of the problem is about 60M, the application correctly run and produce >> correct results. But when the total freedom increases to 600M, the >> application abort and say there is not enough memory ( the system >> administrator of the supercomputer told me that my application run out >> memory). >> > >> > Thus, I want to monitor the memory usage dynamically when the >> application running. Are there any functions or strategies that could be >> used for this purpose? >> > >> > The error information is attached. >> > >> > Regards, >> > -- >> > Fande Kong >> > Department of Computer Science >> > University of Colorado at Boulder >> > >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From fande.kong at colorado.edu Tue May 28 07:42:54 2013 From: fande.kong at colorado.edu (Fande Kong) Date: Tue, 28 May 2013 20:42:54 +0800 Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc? 
In-Reply-To: 
References: 
Message-ID: 

Hi Matthew,

Thanks, I added the function PetscMallocDump() into the code after calling KSPSolve():

(6) after calling KSPSolve()

ierr = PetscMallocGetCurrentUsage(&space); CHKERRQ(ierr);
ierr = PetscPrintf(comm, "Current space PetscMalloc()ed %G M\n", space/(1024*1024)); CHKERRQ(ierr);
ierr = PetscMallocGetMaximumUsage(&space); CHKERRQ(ierr);
ierr = PetscPrintf(comm, "Max space PetscMalloced() %G M\n", space/(1024*1024)); CHKERRQ(ierr);
ierr = PetscMemoryGetCurrentUsage(&space); CHKERRQ(ierr);
ierr = PetscPrintf(comm, "Current process memory %G M\n", space/(1024*1024)); CHKERRQ(ierr);
ierr = PetscMemoryGetMaximumUsage(&space); CHKERRQ(ierr);
ierr = PetscPrintf(comm, "Max process memory %G M\n", space/(1024*1024)); CHKERRQ(ierr);
ierr = PetscMallocDump(PETSC_NULL); CHKERRQ(ierr);

Current space PetscMalloc()ed 290.952 M
Max space PetscMalloced() 593.367 M
Current process memory 306.852 M
Max process memory 301.441 M

The printed detailed petscmalloc information is attached. The output has too many lines for me to understand. How should I interpret this information?

On Tue, May 28, 2013 at 6:05 PM, Matthew Knepley wrote: > On Tue, May 28, 2013 at 5:54 AM, Fande Kong wrote: > >> Hi Smith, >> >> Thank you very much. According to your suggestions and information, I >> added these functions into my code to measure the memory usage. Now I am >> confused, since the small problem needs large memory. >> >> I added the function PetscMemorySetGetMaximumUsage() immediately after >> PetscInitialize(). And then I added the following code into several >> positions in the code (before & after setting up unstructured mesh, before >> & after KSPSetUp(), before & after KSPSolve(), and Destroy all stuffs): >> >> PetscLogDouble space =0; >> ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr); >> ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", >> space/(1024*1024));CHKERRQ(ierr); >> ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr); >> ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", >> space/(1024*1024));CHKERRQ(ierr); >> ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr); >> ierr = PetscPrintf(comm,"Current process memory %G M\n", >> space/(1024*1024));CHKERRQ(ierr); >> ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr); >> ierr = PetscPrintf(comm,"Max process memory %G M\n", >> space/(1024*1024));CHKERRQ(ierr); >> >> >> In order to measure the memory usage, I just used only one core (mpirun >> -n 1 ./program ) to solve a small problem with 12691 mesh nodes (the >> freedom is about 12691*3= 4 *10^4 ). I solve the linear elasticity problem >> by using FGMRES preconditioned by multigrid method (PCMG). I use all petsc >> standard routines except that I construct coarse matrix and interpolation >> matrix by myself.
I used the following run script to set up solver and >> preconditioner: >> >> mpirun -n 1 ./linearElasticity -ksp_type fgmres -pc_type mg >> -pc_mg_levels 2 -pc_mg_cycle_type v -pc_mg_type multiplicative >> -mg_levels_1_ksp_type richardson -mg_levels_1_ksp_max_it 1 >> -mg_levels_1_pc_type asm -mg_levels_1_sub_ksp_type preonly >> -mg_levels_1_sub_pc_type ilu -mg_levels_1_sub_pc_factor_levels 4 >> -mg_levels_1_sub_pc_factor_mat_ordering_type rcm -mg_coarse_ksp_type cg >> -mg_coarse_ksp_rtol 0.1 -mg_coarse_ksp_max_it 10 -mg_coarse_pc_type asm >> -mg_coarse_sub_ksp_type preonly -mg_coarse_sub_pc_type ilu >> -mg_coarse_sub_pc_factor_levels 2 >> -mg_coarse_sub_pc_factor_mat_ordering_type rcm -ksp_view -log_summary >> -pc_mg_log >> >> >> I got the following results: >> >> (1) before setting up mesh, >> >> Current space PetscMalloc()ed 0.075882 M >> Max space PetscMalloced() 0.119675 M >> Current process memory 7.83203 M >> Max process memory 0 M >> >> (2) after setting up mesh, >> >> Current space PetscMalloc()ed 16.8411 M >> Max space PetscMalloced() 22.1353 M >> Current process memory 28.4336 M >> Max process memory 33.0547 M >> >> (3) before calling KSPSetUp() >> >> Current space PetscMalloc()ed 16.868 M >> Max space PetscMalloced() 22.1353 M >> Current process memory 28.6914 M >> Max process memory 33.0547 M >> >> >> (4) after calling KSPSetUp() >> >> Current space PetscMalloc()ed 74.3354 M >> Max space PetscMalloced() 74.3355 M >> > > This makes sense. It is 20M for your mesh, 20M > for the Krylov space on the fine level, and I am guessing > 35M for the Jacobian and the ILU factors. > > >> Current process memory 85.6953 M >> Max process memory 84.9258 M >> >> (5) before calling KSPSolve() >> >> Current space PetscMalloc()ed 74.3354 M >> Max space PetscMalloced() 74.3355 M >> Current process memory 85.8711 M >> Max process memory 84.9258 M >> >> (6) after calling KSPSolve() >> > > The question is what was malloc'd here. There is no way we could > tell without seeing the code and probably running it. I suggest > using > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscMallocDump.html > to see what was allocated. The solvers tend not to allocated during > the solve, as that is slow. So I would be inclined to check user code > first. > > Matt > > >> Current space PetscMalloc()ed 290.952 M >> Max space PetscMalloced() 593.367 M >> Current process memory 306.852 M >> Max process memory 301.441 M >> >> (7) After destroying all stuffs >> >> Current space PetscMalloc()ed 0.331482 M >> Max space PetscMalloced() 593.367 M >> Current process memory 67.2539 M >> Max process memory 309.137 M >> >> >> So my question is why/if I need so much memory (306.852 M) for so small >> problem (freedom: 4*10^4). Or is it normal case? Or my run script used to >> set up solver is not reasonable? >> >> >> Regards, >> >> Fande Kong, >> >> Department of Computer Science >> University of Colorado Boulder >> >> >> >> >> >> >> >> >> >> >> On Mon, May 27, 2013 at 9:48 PM, Barry Smith wrote: >> >>> >>> There are several ways to monitor the memory usage. You can divide >>> them into two categories: those that monitor how much memory has been >>> malloced specifically by PETSc and how much is used totally be the process. >>> >>> PetscMallocGetCurrentUsage() and PetscMallocGetMaximumUsage() which only >>> work with the command line option -malloc provide how much PETSc has >>> malloced. 
>>> >>> PetscMemoryGetCurrentUsage() and PetscMemoryGetMaximumUsage() (call >>> PetscMemorySetGetMaximumUsage() immediately after PetscInitialize() for >>> this one to work) provide total memory usage. >>> >>> These are called on each process so use a MPI_Reduce() to gather the >>> total memory across all processes to process 0 to print it out. Suggest >>> calling it after the mesh as been set up, then call again immediately >>> before the XXXSolve() is called and then after the XXXSolve() is called. >>> >>> Please let us know if you have any difficulties. >>> >>> As always we recommend you upgrade to PETSc 3.4 >>> >>> Barry >>> >>> >>> >>> On May 27, 2013, at 10:22 PM, Fande Kong >>> wrote: >>> >>> > Hi all, >>> > >>> > How to measure the memory usage of the application built on the Petsc? >>> I am now solving linear elasticity equations with fgmres preconditioned by >>> two-level method, that is, preconditioned by multigrid method where on each >>> level the additive Schwarz method is adopted. More than 1000 cores are >>> adopted to solve this problem on the supercomputer. When the total freedom >>> of the problem is about 60M, the application correctly run and produce >>> correct results. But when the total freedom increases to 600M, the >>> application abort and say there is not enough memory ( the system >>> administrator of the supercomputer told me that my application run out >>> memory). >>> > >>> > Thus, I want to monitor the memory usage dynamically when the >>> application running. Are there any functions or strategies that could be >>> used for this purpose? >>> > >>> > The error information is attached. >>> > >>> > Regards, >>> > -- >>> > Fande Kong >>> > Department of Computer Science >>> > University of Colorado at Boulder >>> > >>> >>> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- Fande Kong Department of Computer Science University of Colorado at Boulder -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: PetscMallocDum_printed.zip Type: application/zip Size: 39462 bytes Desc: not available URL: From knepley at gmail.com Tue May 28 11:01:04 2013 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 28 May 2013 12:01:04 -0400 Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc? 
In-Reply-To: References: Message-ID: On Tue, May 28, 2013 at 8:42 AM, Fande Kong wrote: > Hi Matthew, > > Thanks, > > I added the function PetscMallocDump() into the code after calling > KSPSolve(): > > (6) after calling KSPSolve() > > ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", > space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", > space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Current process memory %G M\n", > space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Max process memory %G M\n", > space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMallocDump(PETSC_NULL);CHKERRQ(ierr); > > Current space PetscMalloc()ed 290.952 M > Max space PetscMalloced() 593.367 M > Current process memory 306.852 M > Max process memory 301.441 M > > The printed detailed petscmalloc information is attached. The output seems > too many lines to understand. How to understand this information? > 1) Why would you ever start with a complex, parallel example for debugging? This is crazy, and not how anyone would ever start a scientific investigation. You simplify the problem until you understand everything, and then slowly add complexity. 2) The idea is to dump once before and once after the solve, and diff Matt > On Tue, May 28, 2013 at 6:05 PM, Matthew Knepley wrote: > >> On Tue, May 28, 2013 at 5:54 AM, Fande Kong wrote: >> >>> Hi Smith, >>> >>> Thank you very much. According to your suggestions and information, I >>> added these functions into my code to measure the memory usage. Now I am >>> confused, since the small problem needs large memory. >>> >>> I added the function PetscMemorySetGetMaximumUsage() immediately after >>> PetscInitialize(). And then I added the following code into several >>> positions in the code (before & after setting up unstructured mesh, before >>> & after KSPSetUp(), before & after KSPSolve(), and Destroy all stuffs): >>> >>> PetscLogDouble space =0; >>> ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr); >>> ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", >>> space/(1024*1024));CHKERRQ(ierr); >>> ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr); >>> ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", >>> space/(1024*1024));CHKERRQ(ierr); >>> ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr); >>> ierr = PetscPrintf(comm,"Current process memory %G M\n", >>> space/(1024*1024));CHKERRQ(ierr); >>> ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr); >>> ierr = PetscPrintf(comm,"Max process memory %G M\n", >>> space/(1024*1024));CHKERRQ(ierr); >>> >>> >>> In order to measure the memory usage, I just used only one core (mpirun >>> -n 1 ./program ) to solve a small problem with 12691 mesh nodes (the >>> freedom is about 12691*3= 4 *10^4 ). I solve the linear elasticity problem >>> by using FGMRES preconditioned by multigrid method (PCMG). I use all petsc >>> standard routines except that I construct coarse matrix and interpolation >>> matrix by myself. 
I used the following run script to set up solver and >>> preconditioner: >>> >>> mpirun -n 1 ./linearElasticity -ksp_type fgmres -pc_type mg >>> -pc_mg_levels 2 -pc_mg_cycle_type v -pc_mg_type multiplicative >>> -mg_levels_1_ksp_type richardson -mg_levels_1_ksp_max_it 1 >>> -mg_levels_1_pc_type asm -mg_levels_1_sub_ksp_type preonly >>> -mg_levels_1_sub_pc_type ilu -mg_levels_1_sub_pc_factor_levels 4 >>> -mg_levels_1_sub_pc_factor_mat_ordering_type rcm -mg_coarse_ksp_type cg >>> -mg_coarse_ksp_rtol 0.1 -mg_coarse_ksp_max_it 10 -mg_coarse_pc_type asm >>> -mg_coarse_sub_ksp_type preonly -mg_coarse_sub_pc_type ilu >>> -mg_coarse_sub_pc_factor_levels 2 >>> -mg_coarse_sub_pc_factor_mat_ordering_type rcm -ksp_view -log_summary >>> -pc_mg_log >>> >>> >>> I got the following results: >>> >>> (1) before setting up mesh, >>> >>> Current space PetscMalloc()ed 0.075882 M >>> Max space PetscMalloced() 0.119675 M >>> Current process memory 7.83203 M >>> Max process memory 0 M >>> >>> (2) after setting up mesh, >>> >>> Current space PetscMalloc()ed 16.8411 M >>> Max space PetscMalloced() 22.1353 M >>> Current process memory 28.4336 M >>> Max process memory 33.0547 M >>> >>> (3) before calling KSPSetUp() >>> >>> Current space PetscMalloc()ed 16.868 M >>> Max space PetscMalloced() 22.1353 M >>> Current process memory 28.6914 M >>> Max process memory 33.0547 M >>> >>> >>> (4) after calling KSPSetUp() >>> >>> Current space PetscMalloc()ed 74.3354 M >>> Max space PetscMalloced() 74.3355 M >>> >> >> This makes sense. It is 20M for your mesh, 20M >> for the Krylov space on the fine level, and I am guessing >> 35M for the Jacobian and the ILU factors. >> >> >>> Current process memory 85.6953 M >>> Max process memory 84.9258 M >>> >>> (5) before calling KSPSolve() >>> >>> Current space PetscMalloc()ed 74.3354 M >>> Max space PetscMalloced() 74.3355 M >>> Current process memory 85.8711 M >>> Max process memory 84.9258 M >>> >>> (6) after calling KSPSolve() >>> >> >> The question is what was malloc'd here. There is no way we could >> tell without seeing the code and probably running it. I suggest >> using >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscMallocDump.html >> to see what was allocated. The solvers tend not to allocated during >> the solve, as that is slow. So I would be inclined to check user code >> first. >> >> Matt >> >> >>> Current space PetscMalloc()ed 290.952 M >>> Max space PetscMalloced() 593.367 M >>> Current process memory 306.852 M >>> Max process memory 301.441 M >>> >>> (7) After destroying all stuffs >>> >>> Current space PetscMalloc()ed 0.331482 M >>> Max space PetscMalloced() 593.367 M >>> Current process memory 67.2539 M >>> Max process memory 309.137 M >>> >>> >>> So my question is why/if I need so much memory (306.852 M) for so small >>> problem (freedom: 4*10^4). Or is it normal case? Or my run script used to >>> set up solver is not reasonable? >>> >>> >>> Regards, >>> >>> Fande Kong, >>> >>> Department of Computer Science >>> University of Colorado Boulder >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> >>> On Mon, May 27, 2013 at 9:48 PM, Barry Smith wrote: >>> >>>> >>>> There are several ways to monitor the memory usage. You can divide >>>> them into two categories: those that monitor how much memory has been >>>> malloced specifically by PETSc and how much is used totally be the process. >>>> >>>> PetscMallocGetCurrentUsage() and PetscMallocGetMaximumUsage() which >>>> only work with the command line option -malloc provide how much PETSc has >>>> malloced. 
>>>> >>>> PetscMemoryGetCurrentUsage() and PetscMemoryGetMaximumUsage() (call >>>> PetscMemorySetGetMaximumUsage() immediately after PetscInitialize() for >>>> this one to work) provide total memory usage. >>>> >>>> These are called on each process so use a MPI_Reduce() to gather the >>>> total memory across all processes to process 0 to print it out. Suggest >>>> calling it after the mesh as been set up, then call again immediately >>>> before the XXXSolve() is called and then after the XXXSolve() is called. >>>> >>>> Please let us know if you have any difficulties. >>>> >>>> As always we recommend you upgrade to PETSc 3.4 >>>> >>>> Barry >>>> >>>> >>>> >>>> On May 27, 2013, at 10:22 PM, Fande Kong >>>> wrote: >>>> >>>> > Hi all, >>>> > >>>> > How to measure the memory usage of the application built on the >>>> Petsc? I am now solving linear elasticity equations with fgmres >>>> preconditioned by two-level method, that is, preconditioned by multigrid >>>> method where on each level the additive Schwarz method is adopted. More >>>> than 1000 cores are adopted to solve this problem on the supercomputer. >>>> When the total freedom of the problem is about 60M, the application >>>> correctly run and produce correct results. But when the total freedom >>>> increases to 600M, the application abort and say there is not enough memory >>>> ( the system administrator of the supercomputer told me that my >>>> application run out memory). >>>> > >>>> > Thus, I want to monitor the memory usage dynamically when the >>>> application running. Are there any functions or strategies that could be >>>> used for this purpose? >>>> > >>>> > The error information is attached. >>>> > >>>> > Regards, >>>> > -- >>>> > Fande Kong >>>> > Department of Computer Science >>>> > University of Colorado at Boulder >>>> > >>>> >>>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > > -- > Fande Kong > Department of Computer Science > University of Colorado at Boulder > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue May 28 11:21:38 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 28 May 2013 11:21:38 -0500 Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc? 
In-Reply-To: References: Message-ID: First find the big items in the allocations with /Downloads$ grep bytes PetscMallocDump | sed "s?\[ 0\]??g" | sort -n -r | more 111132656 bytes MatILUFactorSymbolic_SeqAIJ() line 1867 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c 111132656 bytes MatILUFactorSymbolic_SeqAIJ() line 1840 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c 10980656 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 10980656 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 10980656 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 10980656 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 2239488 bytes SpmcsDMMeshSymmetrize() line 983 in spmcsdmmesh.cpp 2239488 bytes DMSetUp_SpmcsDMMesh() line 139 in spmcsdmmeshcreate.cpp 2239488 bytes DMSetUp_SpmcsDMMesh() line 138 in spmcsdmmeshcreate.cpp 1493504 bytes MatILUFactorSymbolic_SeqAIJ() line 1867 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c 1493504 bytes MatILUFactorSymbolic_SeqAIJ() line 1840 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c 1031520 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 1031520 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 1031520 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 1031520 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp 661408 bytes SpmcsDMMeshPreallocateSieveLabel() line 1935 in spmcsdmmesh.cpp 493856 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 493856 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 493856 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c 493856 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c by far the biggest hogs are the first two MatILUFactorSymbolic_SeqAIJ followed by the original matrix storage. 
Then find the large one in the file

[ 0]111132656 bytes MatILUFactorSymbolic_SeqAIJ() line 1840 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c
 [0] MatILUFactorSymbolic() line 6114 in /home/fdkong/math/petsc-3.3-p7/src/mat/interface/matrix.c
 [0] PCSetUp_ILU() line 173 in /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/impls/factor/ilu/ilu.c
 [0] PCSetUp() line 810 in /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/interface/precon.c
 [0] KSPSetUp() line 182 in /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/interface/itfunc.c
 [0] PCSetUpOnBlocks_ASM() line 416 in /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/impls/asm/asm.c
 [0] PCSetUpOnBlocks() line 861 in /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/interface/precon.c
 [0] KSPSetUpOnBlocks() line 151 in /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/interface/itfunc.c
 [0] KSPSolve() line 351 in /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/interface/itfunc.c
 [0] PCGMGMCycle_Private() line 20 in gmg.cpp
 [0] PCApply_GMG() line 329 in gmg.cpp
 [0] PCApply() line 373 in /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/interface/precon.c
 [0] KSPFGMRESCycle() line 114 in /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
 [0] KSPSolve_FGMRES() line 277 in /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/impls/gmres/fgmres/fgmres.c
 [0] KSPSolve() line 351 in /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/interface/itfunc.c

So it is the ILU on each block (process) that is taking all the space. Since the ILU is taking much more memory than the matrix itself, I'm guessing you are running ILU(k > 0), which will require a lot of memory. You can try ILU(0), the default, and it should require much less memory. Or you can use something like -sub_pc_type sor so that it does not need to allocate any factored matrices at all.

   Barry

On May 28, 2013, at 7:42 AM, Fande Kong wrote:

> Hi Matthew,
>
> Thanks,
>
> I added the function PetscMallocDump() into the code after calling KSPSolve():
>
> (6) after calling KSPSolve()
>
> ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr);
> ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", space/(1024*1024));CHKERRQ(ierr);
> ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr);
> ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", space/(1024*1024));CHKERRQ(ierr);
> ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr);
> ierr = PetscPrintf(comm,"Current process memory %G M\n", space/(1024*1024));CHKERRQ(ierr);
> ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr);
> ierr = PetscPrintf(comm,"Max process memory %G M\n", space/(1024*1024));CHKERRQ(ierr);
> ierr = PetscMallocDump(PETSC_NULL);CHKERRQ(ierr);
>
> Current space PetscMalloc()ed 290.952 M
> Max space PetscMalloced() 593.367 M
> Current process memory 306.852 M
> Max process memory 301.441 M
>
> The printed detailed petscmalloc information is attached. The output seems too many lines to understand. How to understand this information?
>
>
>
> On Tue, May 28, 2013 at 6:05 PM, Matthew Knepley wrote:
> On Tue, May 28, 2013 at 5:54 AM, Fande Kong wrote:
> Hi Smith,
>
> Thank you very much. According to your suggestions and information, I added these functions into my code to measure the memory usage. Now I am confused, since the small problem needs large memory.
>
> I added the function PetscMemorySetGetMaximumUsage() immediately after PetscInitialize().
And then I added the following code into several positions in the code (before & after setting up unstructured mesh, before & after KSPSetUp(), before & after KSPSolve(), and Destroy all stuffs): > > PetscLogDouble space =0; > ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Current process memory %G M\n", space/(1024*1024));CHKERRQ(ierr); > ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr); > ierr = PetscPrintf(comm,"Max process memory %G M\n", space/(1024*1024));CHKERRQ(ierr); > > > In order to measure the memory usage, I just used only one core (mpirun -n 1 ./program ) to solve a small problem with 12691 mesh nodes (the freedom is about 12691*3= 4 *10^4 ). I solve the linear elasticity problem by using FGMRES preconditioned by multigrid method (PCMG). I use all petsc standard routines except that I construct coarse matrix and interpolation matrix by myself. I used the following run script to set up solver and preconditioner: > > mpirun -n 1 ./linearElasticity -ksp_type fgmres -pc_type mg -pc_mg_levels 2 -pc_mg_cycle_type v -pc_mg_type multiplicative -mg_levels_1_ksp_type richardson -mg_levels_1_ksp_max_it 1 -mg_levels_1_pc_type asm -mg_levels_1_sub_ksp_type preonly -mg_levels_1_sub_pc_type ilu -mg_levels_1_sub_pc_factor_levels 4 -mg_levels_1_sub_pc_factor_mat_ordering_type rcm -mg_coarse_ksp_type cg -mg_coarse_ksp_rtol 0.1 -mg_coarse_ksp_max_it 10 -mg_coarse_pc_type asm -mg_coarse_sub_ksp_type preonly -mg_coarse_sub_pc_type ilu -mg_coarse_sub_pc_factor_levels 2 -mg_coarse_sub_pc_factor_mat_ordering_type rcm -ksp_view -log_summary -pc_mg_log > > > I got the following results: > > (1) before setting up mesh, > > Current space PetscMalloc()ed 0.075882 M > Max space PetscMalloced() 0.119675 M > Current process memory 7.83203 M > Max process memory 0 M > > (2) after setting up mesh, > > Current space PetscMalloc()ed 16.8411 M > Max space PetscMalloced() 22.1353 M > Current process memory 28.4336 M > Max process memory 33.0547 M > > (3) before calling KSPSetUp() > > Current space PetscMalloc()ed 16.868 M > Max space PetscMalloced() 22.1353 M > Current process memory 28.6914 M > Max process memory 33.0547 M > > > (4) after calling KSPSetUp() > > Current space PetscMalloc()ed 74.3354 M > Max space PetscMalloced() 74.3355 M > > This makes sense. It is 20M for your mesh, 20M > for the Krylov space on the fine level, and I am guessing > 35M for the Jacobian and the ILU factors. > > Current process memory 85.6953 M > Max process memory 84.9258 M > > (5) before calling KSPSolve() > > Current space PetscMalloc()ed 74.3354 M > Max space PetscMalloced() 74.3355 M > Current process memory 85.8711 M > Max process memory 84.9258 M > > (6) after calling KSPSolve() > > The question is what was malloc'd here. There is no way we could > tell without seeing the code and probably running it. I suggest > using http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscMallocDump.html > to see what was allocated. The solvers tend not to allocated during > the solve, as that is slow. So I would be inclined to check user code first. 
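Concretely, the suggested dump-and-diff can be as simple as bracketing the solve (a minimal sketch; the log file names here are arbitrary, and PetscMallocDump() only has data to report when the program is run with -malloc, which is on by default in debug builds):

  FILE *fd;

  /* Snapshot of every block currently PetscMalloc()ed, taken before the solve */
  fd = fopen("malloc_before.log", "w");
  ierr = PetscMallocDump(fd);CHKERRQ(ierr);
  fclose(fd);

  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  /* Second snapshot; anything allocated during the solve shows up only here */
  fd = fopen("malloc_after.log", "w");
  ierr = PetscMallocDump(fd);CHKERRQ(ierr);
  fclose(fd);

A plain "diff malloc_before.log malloc_after.log" then lists exactly the allocations made inside KSPSolve().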
> > Matt > > Current space PetscMalloc()ed 290.952 M > Max space PetscMalloced() 593.367 M > Current process memory 306.852 M > Max process memory 301.441 M > > (7) After destroying all stuffs > > Current space PetscMalloc()ed 0.331482 M > Max space PetscMalloced() 593.367 M > Current process memory 67.2539 M > Max process memory 309.137 M > > > So my question is why/if I need so much memory (306.852 M) for so small problem (freedom: 4*10^4). Or is it normal case? Or my run script used to set up solver is not reasonable? > > > Regards, > > Fande Kong, > > Department of Computer Science > University of Colorado Boulder > > > > > > > > > > > On Mon, May 27, 2013 at 9:48 PM, Barry Smith wrote: > > There are several ways to monitor the memory usage. You can divide them into two categories: those that monitor how much memory has been malloced specifically by PETSc and how much is used totally be the process. > > PetscMallocGetCurrentUsage() and PetscMallocGetMaximumUsage() which only work with the command line option -malloc provide how much PETSc has malloced. > > PetscMemoryGetCurrentUsage() and PetscMemoryGetMaximumUsage() (call PetscMemorySetGetMaximumUsage() immediately after PetscInitialize() for this one to work) provide total memory usage. > > These are called on each process so use a MPI_Reduce() to gather the total memory across all processes to process 0 to print it out. Suggest calling it after the mesh as been set up, then call again immediately before the XXXSolve() is called and then after the XXXSolve() is called. > > Please let us know if you have any difficulties. > > As always we recommend you upgrade to PETSc 3.4 > > Barry > > > > On May 27, 2013, at 10:22 PM, Fande Kong wrote: > > > Hi all, > > > > How to measure the memory usage of the application built on the Petsc? I am now solving linear elasticity equations with fgmres preconditioned by two-level method, that is, preconditioned by multigrid method where on each level the additive Schwarz method is adopted. More than 1000 cores are adopted to solve this problem on the supercomputer. When the total freedom of the problem is about 60M, the application correctly run and produce correct results. But when the total freedom increases to 600M, the application abort and say there is not enough memory ( the system administrator of the supercomputer told me that my application run out memory). > > > > Thus, I want to monitor the memory usage dynamically when the application running. Are there any functions or strategies that could be used for this purpose? > > > > The error information is attached. > > > > Regards, > > -- > > Fande Kong > > Department of Computer Science > > University of Colorado at Boulder > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > > > -- > Fande Kong > Department of Computer Science > University of Colorado at Boulder > From ucemckl at ucl.ac.uk Wed May 29 05:20:48 2013 From: ucemckl at ucl.ac.uk (Christian Klettner) Date: Wed, 29 May 2013 11:20:48 +0100 Subject: [petsc-users] Compile flags for a Blue Gene\Q Message-ID: Dear PETSc group, We have been given access to a Blue Gene\Q system to run code we normally run on a traditional cluster architecture. We have ported the code successfully and run some jobs using 16 cores per node however the performance is roughly four times slower to that of a Xeon processor. 
We expect less performance (due to the slower chips); however, this seems a bit excessive. One problem, we think, is that we are not using all four hardware threads per core. To achieve this, do we need to use a threaded version of PETSc? Could someone suggest the additional arguments required to make use of the pthreads when launching an MPI job? We are using the Blue Gene MPI libraries but are currently unable to use the ESSL BLAS libraries. We found the example compile flags for a Blue Gene\P in the PETSc package but were wondering if anyone had compile flags they would recommend for a Blue Gene\Q?

Best regards,
Christian

From choi240 at purdue.edu Wed May 29 05:43:37 2013
From: choi240 at purdue.edu (Joon hee Choi)
Date: Wed, 29 May 2013 06:43:37 -0400 (EDT)
Subject: [petsc-users] The multiplication of large matrices
In-Reply-To: <1350629403.2682.1369816687803.JavaMail.root@mailhub028.itcs.purdue.edu>
Message-ID: <1567053414.2837.1369824217089.JavaMail.root@mailhub028.itcs.purdue.edu>

Hello all,

I am trying to compute the multiplication of large matrices using PETSc. First,

I=2.6*10^7, J=4.8*10^7.
The matrix X is as follows: size I*(I*J), block size I, and 1.4*10^8 non-zeros.
-> X=[X1 X2 ...]
The matrix B is as follows: size I*1, dense.
The matrix M is as follows: size I*J. M is the multiplication of each block of the matrix X with the matrix B.
-> M=[X1*B X2*B ...]

I have to get the matrix M, given X and B. I successfully set up the very large AIJ matrix X. However, I don't know what I have to do next. I tried two ways, but both failed.

First, I tried to get each block X1, X2,... using ISCreate and MatGetLocalSubMatrix, compute X1*B, X2*B,... using MatMatMult, and then set up the matrix M using ISLocalToGlobalMappingCreate and MatSetLocalToGlobalMapping. However, I got a "No support for this operation for this object type" error. Also, this code was very slow because of the 48 million loop iterations.

Second, I tried to set up a new matrix BB from B. The matrix BB has size (I*J)*J and block size I*1. Every diagonal block of BB is B and all other blocks are 0 matrices. That is,

     | B 0 0 .. 0 |
BB = | 0 B 0 .. 0 |
     |    ...     |
     | 0 0 0 .. B |

This is not a block diagonal matrix because B is not square. Anyway, if I set up BB, I can get M easily because M=X*BB. However, I got an "out of memory" error from MatCreateSeqAIJ. I think this is because BB has I*J=10^15 non-zeros.

I don't have any other ideas. If someone can point out what is wrong with these approaches or has new ideas, then please let me know. Thank you very much.

Joon

From knepley at gmail.com Wed May 29 06:01:26 2013
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 29 May 2013 07:01:26 -0400
Subject: [petsc-users] The multiplication of large matrices
In-Reply-To: <1567053414.2837.1369824217089.JavaMail.root@mailhub028.itcs.purdue.edu>
References: <1350629403.2682.1369816687803.JavaMail.root@mailhub028.itcs.purdue.edu> <1567053414.2837.1369824217089.JavaMail.root@mailhub028.itcs.purdue.edu>
Message-ID:

On Wed, May 29, 2013 at 6:43 AM, Joon hee Choi wrote:

> Hello all,
>
> I am trying to compute the multiplication of large matrices using PETSc.
> First,
>
> I=2.6*10^7, J=4.8*10^7.
> The matrix X is as follows: size I*(I*J), block size I, and
> 1.4*10^8 non-zeros.
> -> X=[X1 X2 ...]
> The matrix B is as follows: size I*1, dense.
> The matrix M is as follows: size I*J. M is the multiplication of each
> block of the matrix X with the matrix B.
> -> M=[X1*B X2*B ...]
> > I have to get the matrix M, given X and B. I successfully set up very > large aij matrix X. However, I don't know what I have to do next. I tried > two ways, but I failed. > > First, I tried to get each block X1, X2,... using ISCreate and > MatGetLocalSubMatrix, compute X1*B, X2*B,... using MatMatMult, and then set > up the matrix M using ISLocalToGlobalMappingCreate and > MatSetLocalToGlobalMapping. However, I got "No support for this operation > for this object type" error. Also, this code was very slow because of 48 > million loops. > Look, we have rules on this list. Without them, we cannot help you. You MUST send the ENTIRE error output. Without that, we are just guessing. > Second, I tried to set up the new matrix BB from B. The matrix BB has the > size of (I*J)*J and the block size of I*1. And every diagonal block of BB > is B and all other blocks are 0 matrices. That is, > > | B 0 0 .. 0 | > BB = | 0 B 0 .. 0 | > | ... | > | 0 0 0 .. B | > > This is not a block diagonal matrix because B is not square. Anyway, if I > set up BB, I can get M easily because M=X*BB. However, I got "out of > memory" error from MatCreateSeqAIJ. I think this is because BB has > non-zeros of I*J=10^15. > If you get an Out Of Memory error (which we need to see), it means that you are out of memory. Matt > I never have any other ideas. If someone can fix my wrong ways correctly > or has new ideas, then please please let me know. Thank you very much. > > Joon > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From fande.kong at colorado.edu Wed May 29 07:45:10 2013 From: fande.kong at colorado.edu (Fande Kong) Date: Wed, 29 May 2013 06:45:10 -0600 Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc? In-Reply-To: References: Message-ID: Hi Barry, Thanks. Now I know where it takes most of memory. But I have another question. I found that we need many MatSeqAIJSetPreallocations in code. I do not know why. Then I switch to test a standard example built in petsc: petsc-3.3-p7/src/ksp/ksp/examples/tutorials/ex29.c. 
First, I added the following code into ex29.c after calling KSPSolve():

PetscLogDouble space = 0;
ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr);
ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", space/(1024*1024));CHKERRQ(ierr);
ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr);
ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", space/(1024*1024));CHKERRQ(ierr);
ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr);
ierr = PetscPrintf(comm,"Current process memory %G M\n", space/(1024*1024));CHKERRQ(ierr);
ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr);
ierr = PetscPrintf(comm,"Max process memory %G M\n", space/(1024*1024));CHKERRQ(ierr);
ierr = PetscMallocDump(PETSC_NULL);CHKERRQ(ierr);

Second, I ran the following script:

mpirun -n 1 ./ex29 -pc_type mg -pc_mg_type full -ksp_type fgmres -ksp_monitor_short -da_refine 1 >printresult

Last, I got:

5279056 bytes MatLUFactorSymbolic_SeqAIJ() line 380 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c
5279040 bytes MatLUFactorSymbolic_SeqAIJ() line 392 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c
2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
1167392 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
1167392 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c

Now, my question is why we call MatSeqAIJSetPreallocation so many times. Here we use a two-level preconditioner, so we should only need three MatSeqAIJSetPreallocations: one for the fine matrix, one for the coarse matrix, and one for the interpolation matrix (the restriction is the transpose of the interpolation). But here we have more than three MatSeqAIJSetPreallocations. It is similar for MatLUFactorSymbolic.
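A quick way to tally how often each allocation site appears in such a dump, along the lines of the grep pipeline quoted below (this assumes the dump was redirected to the printresult file):

grep bytes printresult | sort | uniq -c | sort -rn

This prefixes every distinct allocation line with the number of times it occurs, which makes the repeated MatSeqAIJSetPreallocation_SeqAIJ and VecCreate_Seq entries easy to count.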
On Tue, May 28, 2013 at 10:21 AM, Barry Smith wrote: > > First find the big items in the allocations with > > /Downloads$ grep bytes PetscMallocDump | sed "s?\[ 0\]??g" | sort -n -r | > more > > 111132656 bytes MatILUFactorSymbolic_SeqAIJ() line 1867 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > 111132656 bytes MatILUFactorSymbolic_SeqAIJ() line 1840 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > 10980656 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 10980656 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 10980656 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 10980656 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 2239488 bytes SpmcsDMMeshSymmetrize() line 983 in spmcsdmmesh.cpp > 2239488 bytes DMSetUp_SpmcsDMMesh() line 139 in spmcsdmmeshcreate.cpp > 2239488 bytes DMSetUp_SpmcsDMMesh() line 138 in spmcsdmmeshcreate.cpp > 1493504 bytes MatILUFactorSymbolic_SeqAIJ() line 1867 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > 1493504 bytes MatILUFactorSymbolic_SeqAIJ() line 1840 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > 1031520 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 1031520 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 1031520 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 1031520 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsSectionSetChart() line 183 in spmcssectionimpl.cpp > 661408 bytes SpmcsDMMeshPreallocateSieveLabel() line 1935 in > spmcsdmmesh.cpp > 493856 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 493856 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 493856 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 493856 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > > by far the biggest hogs are the first two MatILUFactorSymbolic_SeqAIJ > followed by the original matrix storage. 
Then find the large on in the file > > [ 0]111132656 bytes MatILUFactorSymbolic_SeqAIJ() line 1840 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > [0] MatILUFactorSymbolic() line 6114 in > /home/fdkong/math/petsc-3.3-p7/src/mat/interface/matrix.c > [0] PCSetUp_ILU() line 173 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/impls/factor/ilu/ilu.c > [0] PCSetUp() line 810 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/interface/precon.c > [0] KSPSetUp() line 182 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/interface/itfunc.c > [0] PCSetUpOnBlocks_ASM() line 416 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/impls/asm/asm.c > [0] PCSetUpOnBlocks() line 861 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/interface/precon.c > [0] KSPSetUpOnBlocks() line 151 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/interface/itfunc.c > [0] KSPSolve() line 351 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/interface/itfunc.c > [0] PCGMGMCycle_Private() line 20 in gmg.cpp > [0] PCApply_GMG() line 329 in gmg.cpp > [0] PCApply() line 373 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/pc/interface/precon.c > [0] KSPFGMRESCycle() line 114 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/impls/gmres/fgmres/fgmres.c > [0] KSPSolve_FGMRES() line 277 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/impls/gmres/fgmres/fgmres.c > [0] KSPSolve() line 351 in > /home/fdkong/math/petsc-3.3-p7/src/ksp/ksp/interface/itfunc.c > > so it is doing the ILU on each block (process) that is taking all the > space. Since the ILU is taking much more than the matrix I'm guessing you > are running ILU(k > 0) which will require a lot of memory. You can try > ILU(0), the default, and it should require much less memory. Or you can use > something like -sub_pc_type sor so that it does not need to allocate any > factored matrices at all. > > Barry > > > On May 28, 2013, at 7:42 AM, Fande Kong wrote: > > > Hi Matthew, > > > > Thanks, > > > > I added the function PetscMallocDump() into the code after calling > KSPSolve(): > > > > (6) after calling KSPSolve() > > > > ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr); > > ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr); > > ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr); > > ierr = PetscPrintf(comm,"Current process memory %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr); > > ierr = PetscPrintf(comm,"Max process memory %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > ierr = PetscMallocDump(PETSC_NULL);CHKERRQ(ierr); > > > > Current space PetscMalloc()ed 290.952 M > > Max space PetscMalloced() 593.367 M > > Current process memory 306.852 M > > Max process memory 301.441 M > > > > The printed detailed petscmalloc information is attached. The output > seems too many lines to understand. How to understand this information? > > > > > > > > On Tue, May 28, 2013 at 6:05 PM, Matthew Knepley > wrote: > > On Tue, May 28, 2013 at 5:54 AM, Fande Kong > wrote: > > Hi Smith, > > > > Thank you very much. According to your suggestions and information, I > added these functions into my code to measure the memory usage. Now I am > confused, since the small problem needs large memory. > > > > I added the function PetscMemorySetGetMaximumUsage() immediately after > PetscInitialize(). 
And then I added the following code into several > positions in the code (before & after setting up unstructured mesh, before > & after KSPSetUp(), before & after KSPSolve(), and Destroy all stuffs): > > > > PetscLogDouble space =0; > > ierr = PetscMallocGetCurrentUsage(&space);CHKERRQ(ierr); > > ierr = PetscPrintf(comm,"Current space PetscMalloc()ed %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > ierr = PetscMallocGetMaximumUsage(&space);CHKERRQ(ierr); > > ierr = PetscPrintf(comm,"Max space PetscMalloced() %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > ierr = PetscMemoryGetCurrentUsage(&space);CHKERRQ(ierr); > > ierr = PetscPrintf(comm,"Current process memory %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > ierr = PetscMemoryGetMaximumUsage(&space);CHKERRQ(ierr); > > ierr = PetscPrintf(comm,"Max process memory %G M\n", > space/(1024*1024));CHKERRQ(ierr); > > > > > > In order to measure the memory usage, I just used only one core (mpirun > -n 1 ./program ) to solve a small problem with 12691 mesh nodes (the > freedom is about 12691*3= 4 *10^4 ). I solve the linear elasticity problem > by using FGMRES preconditioned by multigrid method (PCMG). I use all petsc > standard routines except that I construct coarse matrix and interpolation > matrix by myself. I used the following run script to set up solver and > preconditioner: > > > > mpirun -n 1 ./linearElasticity -ksp_type fgmres -pc_type mg > -pc_mg_levels 2 -pc_mg_cycle_type v -pc_mg_type multiplicative > -mg_levels_1_ksp_type richardson -mg_levels_1_ksp_max_it 1 > -mg_levels_1_pc_type asm -mg_levels_1_sub_ksp_type preonly > -mg_levels_1_sub_pc_type ilu -mg_levels_1_sub_pc_factor_levels 4 > -mg_levels_1_sub_pc_factor_mat_ordering_type rcm -mg_coarse_ksp_type cg > -mg_coarse_ksp_rtol 0.1 -mg_coarse_ksp_max_it 10 -mg_coarse_pc_type asm > -mg_coarse_sub_ksp_type preonly -mg_coarse_sub_pc_type ilu > -mg_coarse_sub_pc_factor_levels 2 > -mg_coarse_sub_pc_factor_mat_ordering_type rcm -ksp_view -log_summary > -pc_mg_log > > > > > > I got the following results: > > > > (1) before setting up mesh, > > > > Current space PetscMalloc()ed 0.075882 M > > Max space PetscMalloced() 0.119675 M > > Current process memory 7.83203 M > > Max process memory 0 M > > > > (2) after setting up mesh, > > > > Current space PetscMalloc()ed 16.8411 M > > Max space PetscMalloced() 22.1353 M > > Current process memory 28.4336 M > > Max process memory 33.0547 M > > > > (3) before calling KSPSetUp() > > > > Current space PetscMalloc()ed 16.868 M > > Max space PetscMalloced() 22.1353 M > > Current process memory 28.6914 M > > Max process memory 33.0547 M > > > > > > (4) after calling KSPSetUp() > > > > Current space PetscMalloc()ed 74.3354 M > > Max space PetscMalloced() 74.3355 M > > > > This makes sense. It is 20M for your mesh, 20M > > for the Krylov space on the fine level, and I am guessing > > 35M for the Jacobian and the ILU factors. > > > > Current process memory 85.6953 M > > Max process memory 84.9258 M > > > > (5) before calling KSPSolve() > > > > Current space PetscMalloc()ed 74.3354 M > > Max space PetscMalloced() 74.3355 M > > Current process memory 85.8711 M > > Max process memory 84.9258 M > > > > (6) after calling KSPSolve() > > > > The question is what was malloc'd here. There is no way we could > > tell without seeing the code and probably running it. I suggest > > using > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscMallocDump.html > > to see what was allocated. 
The solvers tend not to allocated during > > the solve, as that is slow. So I would be inclined to check user code > first. > > > > Matt > > > > Current space PetscMalloc()ed 290.952 M > > Max space PetscMalloced() 593.367 M > > Current process memory 306.852 M > > Max process memory 301.441 M > > > > (7) After destroying all stuffs > > > > Current space PetscMalloc()ed 0.331482 M > > Max space PetscMalloced() 593.367 M > > Current process memory 67.2539 M > > Max process memory 309.137 M > > > > > > So my question is why/if I need so much memory (306.852 M) for so small > problem (freedom: 4*10^4). Or is it normal case? Or my run script used to > set up solver is not reasonable? > > > > > > Regards, > > > > Fande Kong, > > > > Department of Computer Science > > University of Colorado Boulder > > > > > > > > > > > > > > > > > > > > > > On Mon, May 27, 2013 at 9:48 PM, Barry Smith wrote: > > > > There are several ways to monitor the memory usage. You can divide > them into two categories: those that monitor how much memory has been > malloced specifically by PETSc and how much is used totally be the process. > > > > PetscMallocGetCurrentUsage() and PetscMallocGetMaximumUsage() which only > work with the command line option -malloc provide how much PETSc has > malloced. > > > > PetscMemoryGetCurrentUsage() and PetscMemoryGetMaximumUsage() (call > PetscMemorySetGetMaximumUsage() immediately after PetscInitialize() for > this one to work) provide total memory usage. > > > > These are called on each process so use a MPI_Reduce() to gather the > total memory across all processes to process 0 to print it out. Suggest > calling it after the mesh as been set up, then call again immediately > before the XXXSolve() is called and then after the XXXSolve() is called. > > > > Please let us know if you have any difficulties. > > > > As always we recommend you upgrade to PETSc 3.4 > > > > Barry > > > > > > > > On May 27, 2013, at 10:22 PM, Fande Kong > wrote: > > > > > Hi all, > > > > > > How to measure the memory usage of the application built on the Petsc? > I am now solving linear elasticity equations with fgmres preconditioned by > two-level method, that is, preconditioned by multigrid method where on each > level the additive Schwarz method is adopted. More than 1000 cores are > adopted to solve this problem on the supercomputer. When the total freedom > of the problem is about 60M, the application correctly run and produce > correct results. But when the total freedom increases to 600M, the > application abort and say there is not enough memory ( the system > administrator of the supercomputer told me that my application run out > memory). > > > > > > Thus, I want to monitor the memory usage dynamically when the > application running. Are there any functions or strategies that could be > used for this purpose? > > > > > > The error information is attached. > > > > > > Regards, > > > -- > > > Fande Kong > > > Department of Computer Science > > > University of Colorado at Boulder > > > > > > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > > > > > > -- > > Fande Kong > > Department of Computer Science > > University of Colorado at Boulder > > > > > -- Fande Kong Department of Computer Science University of Colorado at Boulder -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jedbrown at mcs.anl.gov Wed May 29 08:08:52 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 29 May 2013 08:08:52 -0500 Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc? In-Reply-To: References: Message-ID: <8761y2yluj.fsf@mcs.anl.gov> Fande Kong writes: > Last, I got: > > > 5279056 bytes MatLUFactorSymbolic_SeqAIJ() line 380 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > 5279040 bytes MatLUFactorSymbolic_SeqAIJ() line 392 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > 2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 1167392 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 1167392 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > 651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c Fine-grid matrix, interpolation, coarse grid matrix, and LU factors for coarse grid matrix. Each of these has two large allocations when running in debug mode: ierr = PetscMalloc3(nz,PetscScalar,&b->a,nz,PetscInt,&b->j,B->rmap->n+1,PetscInt,&b->i);CHKERRQ(ierr); In optimized mode (which you should *always* use when measuring performance), PetscMalloc3 is implemented with only one malloc. Add '-malloc' to get a tracing malloc when running with an optimized build of PETSc. From ztdepyahoo at 163.com Wed May 29 08:35:31 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Wed, 29 May 2013 21:35:31 +0800 (CST) Subject: [petsc-users] some problem about VecCreateGhost Message-ID: <6d8a0d.12aa6.13ef08028c2.Coremail.ztdepyahoo@163.com> How to set different number of ghost points for different cpu. -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed May 29 08:46:29 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 29 May 2013 09:46:29 -0400 Subject: [petsc-users] some problem about VecCreateGhost In-Reply-To: <6d8a0d.12aa6.13ef08028c2.Coremail.ztdepyahoo@163.com> References: <6d8a0d.12aa6.13ef08028c2.Coremail.ztdepyahoo@163.com> Message-ID: On Wed, May 29, 2013 at 9:35 AM, ??? wrote: > How to set different number of ghost points for different cpu. > This is always true. Can you give an example? Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Wed May 29 15:02:33 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 29 May 2013 15:02:33 -0500 (CDT) Subject: [petsc-users] Petsc configuration failure in windows 7 x64 In-Reply-To: References: Message-ID: This piece of code does stress compilers in complex/optimized mode. When I tried previously - I couldn't reproduce the problem [with VS2012] Now - when was trying out something else - I see the breakage. But subsequent invocation of the compiler went through fine. 
[without any code changes]

Appears to be a bug in the compiler triggered randomly.

Note: configure defaults to -O2 for this compiler for --debugging=0

Satish

On Sun, 26 May 2013, Agnostic Noname wrote:

> Dear Matt and Satish,
>
> thank you for the immediate reply. I tried looking around for any updates
> to VS2012 but I didn't find any. However, adding
> #include
> to src\mat\impls\baij\seq\baijfact2.c did the trick. Although the make test
> failed, the petsc libs generated were libf2cblas.lib, libf2clapack.lib
> and libpetsc.lib so I think it should be ok.
> Hopefully I will manage to link them to my application.
> I know it's an ugly hack but most of my colleagues that will try to compile
> petsc will be using VS2012. I don't know the reason for this error, since
> in mac and linux petsc works like a charm.
>
> Thanks again for your help. It's greatly appreciated.
>
>
>
> On Sun, May 26, 2013 at 5:33 AM, Satish Balay wrote:
>
> > On Sat, 25 May 2013, Matthew Knepley wrote:
> >
> > > On Sat, May 25, 2013 at 6:27 AM, Agnostic Noname
> > wrote:
> > >
> > > >
> > > 1) Please send messages with logs to petsc-maint at mcs.anl.gov so everyone
> > > does not get huge attachments
> >
> > Its now acceptable to send build logs to petsc-users. petsc-users is
> > now support with public archives - and petsc-maint is as usualy
> > private communiation.
> >
> > We should be doing automatic compression of attachments [which we
> > haven't figured out]. Currently the limit is set at 5MB - so if the
> > e-mail comes through - then its acceptable. Alternative is to compress
> > and send logs [or we generate logs in a compressed-ready-to-send
> > tarball]
> >
> > Satish
> >
>

From choi240 at purdue.edu Wed May 29 15:14:59 2013
From: choi240 at purdue.edu (Joon hee Choi)
Date: Wed, 29 May 2013 16:14:59 -0400 (EDT)
Subject: [petsc-users] The multiplication of large matrices
In-Reply-To: 
Message-ID: <608577468.3451.1369858499778.JavaMail.root@mailhub028.itcs.purdue.edu>

Dear Matthew,

Thank you for your fast reply.
I am attaching the actual error output and my code below. The error output is repeated on each iteration of the for-loop. Could you let me know what might be wrong?

Thank you,
Joon

[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: No support for this operation for this object type!
[0]PETSC ERROR: MatMatMult not supported for B of type localref!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 6, Mon Feb 11 12:26:34 CST 2013
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./tensor on a linux-sta named rossmann-a009.rcac.purdue.edu by choi240 Wed May 29 05:53:46 2013 [0]PETSC ERROR: Libraries linked from /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/linux-static/lib [0]PETSC ERROR: Configure run at Tue May 21 15:56:45 2013 [0]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-scalar-type=real --with-shared-libraries=0 --with-pic=1 --with-clanguage=C++ --with-fortran --with-fortran-kernels=1 --with-64-bit-indices=1 --with-debugging=0 --with-blas-lapack-dir=/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --download-hdf5=no --download-metis=no --download-parmetis=no --download-superlu_dist=no --download-mumps=no --download-scalapack=yes --download-blacs=yes --download-hypre=no --download-spooles=no [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: MatMatMult() line 8603 in /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/src/mat/interface/matrix.c [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: No support for this operation for this object type! [0]PETSC ERROR: MatMatMult not supported for B of type localref! for (i=0; i To: "Joon hee Choi" Cc: petsc-users at mcs.anl.gov Sent: Wednesday, May 29, 2013 7:01:26 AM Subject: Re: [petsc-users] The multiplication of large matrices On Wed, May 29, 2013 at 6:43 AM, Joon hee Choi < choi240 at purdue.edu > wrote: Hello all, I am trying to compute the multiplication of large matrices using petsc. First, I=2.6*10^7, J=4.8*10^7. The matrix X is as follows: size of I*(I*J), block size of I, and non-zeros of 1.4*10^8. -> X=[X1 X2 ...] The matrix B is as follows: size of I*1, and dense. The matrix M is as follows: size of I*J. M is the multiplication of each block of the matrix X and the matrix B. -> M=[X1*B X2*B ...] I have to get the matrix M, given X and B. I successfully set up very large aij matrix X. However, I don't know what I have to do next. I tried two ways, but I failed. First, I tried to get each block X1, X2,... using ISCreate and MatGetLocalSubMatrix, compute X1*B, X2*B,... using MatMatMult, and then set up the matrix M using ISLocalToGlobalMappingCreate and MatSetLocalToGlobalMapping. However, I got "No support for this operation for this object type" error. Also, this code was very slow because of 48 million loops. Look, we have rules on this list. Without them, we cannot help you. You MUST send the ENTIRE error output. Without that, we are just guessing. Second, I tried to set up the new matrix BB from B. The matrix BB has the size of (I*J)*J and the block size of I*1. And every diagonal block of BB is B and all other blocks are 0 matrices. That is, | B 0 0 .. 0 | BB = | 0 B 0 .. 0 | | ... | | 0 0 0 .. B | This is not a block diagonal matrix because B is not square. Anyway, if I set up BB, I can get M easily because M=X*BB. However, I got "out of memory" error from MatCreateSeqAIJ. I think this is because BB has non-zeros of I*J=10^15. If you get an Out Of Memory error (which we need to see), it means that you are out of memory. Matt I never have any other ideas. If someone can fix my wrong ways correctly or has new ideas, then please please let me know. Thank you very much. 
Joon -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener From jedbrown at mcs.anl.gov Wed May 29 15:38:52 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 29 May 2013 15:38:52 -0500 Subject: [petsc-users] Petsc configuration failure in windows 7 x64 In-Reply-To: References: Message-ID: <874ndly10j.fsf@mcs.anl.gov> Satish Balay writes: > This piece of code does stress compilers in complex/optimized > mode. When I tried previously - I couldn't reproduce the problem [with > VS2012] We should shorten the file. From jedbrown at mcs.anl.gov Wed May 29 15:42:16 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 29 May 2013 15:42:16 -0500 Subject: [petsc-users] The multiplication of large matrices In-Reply-To: <608577468.3451.1369858499778.JavaMail.root@mailhub028.itcs.purdue.edu> References: <608577468.3451.1369858499778.JavaMail.root@mailhub028.itcs.purdue.edu> Message-ID: <871u8py0uv.fsf@mcs.anl.gov> Joon hee Choi writes: > Dear Matthew, > > Thank you for your fast reply. > I am attaching the actual error output and my code below. The error output is repeated by for-loop. Could you let me know what might be wrong? > > Thank you, > Joon > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: No support for this operation for this object type! > [0]PETSC ERROR: MatMatMult not supported for B of type localref! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 6, Mon Feb 11 12:26:34 CST 2013 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: ./tensor on a linux-sta named rossmann-a009.rcac.purdue.edu by choi240 Wed May 29 05:53:46 2013 > [0]PETSC ERROR: Libraries linked from /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/linux-static/lib > [0]PETSC ERROR: Configure run at Tue May 21 15:56:45 2013 > [0]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-scalar-type=real --with-shared-libraries=0 --with-pic=1 --with-clanguage=C++ --with-fortran --with-fortran-kernels=1 --with-64-bit-indices=1 --with-debugging=0 --with-blas-lapack-dir=/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --download-hdf5=no --download-metis=no --download-parmetis=no --download-superlu_dist=no --download-mumps=no --download-scalapack=yes --download-blacs=yes --download-hypre=no --download-spooles=no > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: MatMatMult() line 8603 in /apps/rhel5/petsc-3.3-p6/64/impi-4.1.0.024_intel-13.0.1.117_ind64/src/mat/interface/matrix.c > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: No support for this operation for this object type! > [0]PETSC ERROR: MatMatMult not supported for B of type localref! 
> for (i=0; i<I; i++) ...;
> for (i=0; i<I; i++) ...;
> for (j=0; j<J; j++) ...;
> ierr = ISCreateGeneral(PETSC_COMM_SELF, I, idxXrow, PETSC_COPY_VALUES, &isXrow); CHKERRQ(ierr);
> ierr = ISCreateGeneral(PETSC_COMM_SELF, I, idxBrow, PETSC_COPY_VALUES, &isBrow); CHKERRQ(ierr);
> ierr = ISCreateGeneral(PETSC_COMM_SELF, J, idxCrow, PETSC_COPY_VALUES, &isCrow); CHKERRQ(ierr);
>
> for (r=0; r<...; r++) {
> ierr = ISCreateGeneral(PETSC_COMM_SELF, 1, &r, PETSC_COPY_VALUES, &isBCcol); CHKERRQ(ierr);
> ierr = MatGetLocalSubMatrix(B, isBrow, isBCcol, &tempB); CHKERRQ(ierr);
> ierr = MatGetLocalSubMatrix(C, isCrow, isBCcol, &tempC); CHKERRQ(ierr);

From the man page: Depending on the format of mat, the returned submat may not implement MatMult(). Its communicator may be the same as mat, it may be PETSC_COMM_SELF, or some other subcomm of mat's. The submat always implements MatSetValuesLocal(). If isrow and iscol have the same block size, then MatSetValuesBlockedLocal() will also be implemented.

> ierr = MatAssemblyBegin(tempB, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
> ierr = MatAssemblyEnd(tempB, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
> ierr = MatAssemblyBegin(tempC, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
> ierr = MatAssemblyEnd(tempC, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
>
> for (j=0; j<J; j++) {
> for (i=0; i<I; i++) ...;
> ierr = ISCreateGeneral(PETSC_COMM_SELF, I, idxXcol, PETSC_COPY_VALUES, &isXcol); CHKERRQ(ierr);
> ierr = MatGetLocalSubMatrix(X1, isXrow, isXcol, &tempX); CHKERRQ(ierr);
> ierr = MatAssemblyBegin(tempX, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
> ierr = MatAssemblyEnd(tempX, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
> ierr = MatMatMult(tempX, tempB, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &tempM);

It does not implement MatMult. I don't see enough context in this email about what you are trying to do, but MatGetLocalSubMatrix is not intended for what you are doing. MatGetSubMatrices() might do what you want.

> ierr = MatRestoreLocalSubMatrix(X1, isXrow, isXcol, &tempX); CHKERRQ(ierr);
> }
> ierr = MatRestoreLocalSubMatrix(B, isBrow, isBCcol, &tempB); CHKERRQ(ierr);
> ierr = MatRestoreLocalSubMatrix(C, isCrow, isBCcol, &tempC); CHKERRQ(ierr);
> }
>
> ----- Original Message ----- > From: "Matthew Knepley" > To: "Joon hee Choi" > Cc: petsc-users at mcs.anl.gov > Sent: Wednesday, May 29, 2013 7:01:26 AM > Subject: Re: [petsc-users] The multiplication of large matrices > > > On Wed, May 29, 2013 at 6:43 AM, Joon hee Choi < choi240 at purdue.edu > wrote: > > > > > Hello all, > > I am trying to compute the multiplication of large matrices using petsc. First, > > I=2.6*10^7, J=4.8*10^7. > The matrix X is as follows: size of I*(I*J), block size of I, and non-zeros of 1.4*10^8. > -> X=[X1 X2 ...] > The matrix B is as follows: size of I*1, and dense. > The matrix M is as follows: size of I*J. M is the multiplication of each block of the matrix X and the matrix B. > -> M=[X1*B X2*B ...] > > I have to get the matrix M, given X and B. I successfully set up very large aij matrix X. However, I don't know what I have to do next. I tried two ways, but I failed. > > First, I tried to get each block X1, X2,... using ISCreate and MatGetLocalSubMatrix, compute X1*B, X2*B,... using MatMatMult, and then set up the matrix M using ISLocalToGlobalMappingCreate and MatSetLocalToGlobalMapping. However, I got "No support for this operation for this object type" error. Also, this code was very slow because of 48 million loops. > > > > Look, we have rules on this list. Without them, we cannot help you. You MUST send the ENTIRE error output. > Without that, we are just guessing. > > > > > Second, I tried to set up the new matrix BB from B. The matrix BB has the size of (I*J)*J and the block size of I*1. And every diagonal block of BB is B and all other blocks are 0 matrices. That is,
> | B 0 0 .. 0 |
> BB = | 0 B 0 .. 0 |
> |    ...     |
> | 0 0 0 .. B |
> This is not a block diagonal matrix because B is not square. Anyway, if I set up BB, I can get M easily because M=X*BB. However, I got "out of memory" error from MatCreateSeqAIJ. I think this is because BB has non-zeros of I*J=10^15. > > > > If you get an Out Of Memory error (which we need to see), it means that you are out of memory. > > > Matt > > > I never have any other ideas. If someone can fix my wrong ways correctly or has new ideas, then please please let me know. Thank you very much. > > Joon > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener
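A rough sketch of the MatGetSubMatrices() route Jed points to, against the petsc-3.3 API (this code is not from the thread: the block index blk, the stride index sets, and the assumption of a sequential run with a SeqDense B are illustrative only):

IS rows, cols;
Mat *sub, Mblk;
/* all I rows of X1, and the I columns that make up block number blk */
ierr = ISCreateStride(PETSC_COMM_SELF, I, 0, 1, &rows); CHKERRQ(ierr);
ierr = ISCreateStride(PETSC_COMM_SELF, I, blk*I, 1, &cols); CHKERRQ(ierr);
/* extracts a free-standing assembled submatrix, which does support MatMatMult() */
ierr = MatGetSubMatrices(X1, 1, &rows, &cols, MAT_INITIAL_MATRIX, &sub); CHKERRQ(ierr);
ierr = MatMatMult(sub[0], B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &Mblk); CHKERRQ(ierr); /* X_blk * B */
ierr = MatDestroyMatrices(1, &sub); CHKERRQ(ierr);
ierr = ISDestroy(&rows); CHKERRQ(ierr);
ierr = ISDestroy(&cols); CHKERRQ(ierr);

Unlike the localref matrices returned by MatGetLocalSubMatrix(), the submatrices returned here are ordinary matrices on PETSC_COMM_SELF, at the cost of copying the extracted entries.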
We > found the example compile flags for a Blue Gene \P in the PETSc > package but were wondering if anyone had compile flags which they > would recommend for a Blue Gene\Q? Best regards, Christian You could start with something like this (fixing paths as appropriate; choose your optimization flags). ./configure --with-blas-lapack-lib=/soft/libraries/essl/5.1.1-0.beta/lib64/libesslbg.a --with-mpi-dir=/bgsys/drivers/ppcfloor/comm/xl/ --with-mpi-dir=/bgsys/drivers/ppcfloor/comm/xl.legacy.ndebug --with-debugging=0 COPTFLAGS="-O3 -qnohot -qsimd=noauto -qsmp=omp:noauto" FOPTFLAGS="-O3 -qnohot -qsimd=noauto -qsmp=omp:noauto" PETSC_ARCH=xl-opt From mrosso at uci.edu Wed May 29 16:25:59 2013 From: mrosso at uci.edu (Michele Rosso) Date: Wed, 29 May 2013 14:25:59 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: References: <519687DD.4050209@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> Message-ID: <51A67267.4010507@uci.edu> Hi, I proceeded as Matt suggested. I am running without nullspace (Dirichlet's BCs) with the following options: -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view -options_left The results are fine and the run finishes. The output of -ksp_view does not say anything about the coarse solver being shifted. In fact the option -mg_coarse_sub_pc_factor_shift_type NONZERO is not used: #PETSc Option Table entries: -ksp_view -mg_coarse_sub_pc_factor_shift_type NONZERO -options_left -pc_gamg_agg_nsmooths 1 -pc_mg_cycle_type v -pc_type gamg #End of PETSc Option Table entries There is one unused database option. It is: Option left: name:-mg_coarse_sub_pc_factor_shift_type value: NONZERO Michele On 05/24/2013 03:20 PM, Matthew Knepley wrote: > On Fri, May 24, 2013 at 5:18 PM, Michele Rosso > wrote: > > Using > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > -mg_coarse_sub_pc_factor_shift_type NONZERO -option_left -ksp_view > > > This is what debugging is about. We are not running your problem. How > could this be debugged? > > 1) Run the problem w/o a null space so that it finishes > > 2) Look at the output for -ksp_view > > 3) Does the coarse solver say that it is shifted? > > 4) Are there options which were unused? > > Matt > > still produces the same error: > > [0]PCSetData_AGG bs=1 MM=131072 > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Detected zero pivot in LU factorization: > see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! > [0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 tolerance > 2.22045e-14! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 > 11:26:24 CDT 2012 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
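As an aside on Jed's configure line above, a hedged sketch of how threading might then be requested at run time with the experimental petsc-dev thread communicator (option names as in the petsc-dev of that period; treat them as assumptions and check --help on your build):

mpirun -n 16 ./app -threadcomm_type openmp -threadcomm_nthreads 4 -log_summary

The intent is one MPI rank per group of hardware threads, with the thread count chosen so ranks-per-node times threads-per-rank matches the 16 cores x 4 hardware threads of a BG/Q node; whether this helps is exactly the bandwidth question Jed raises.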
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri May 24 > 17:06:50 2013 > [0]PETSC ERROR: Libraries linked from > [0]PETSC ERROR: Configure run at > [0]PETSC ERROR: Configure options > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: MatPivotCheck_none() line 583 in > src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h > [0]PETSC ERROR: MatPivotCheck() line 602 in > src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h > [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in > src/mat/impls/aij/seq/aijfact.c > [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in > src/mat/interface/matrix.c > [0]PETSC ERROR: PCSetUp_LU() line 160 in > src/ksp/pc/impls/factor/lu/lu.c > [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in > src/ksp/pc/impls/bjacobi/bjacobi.c > [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in > src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in > src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in > src/ksp/pc/impls/bjacobi/bjacobi.c > [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in > src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in > src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: PCMGMCycle_Private() line 20 in > src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in > src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in > src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCMGMCycle_Private() line 49 in > src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c > [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c > > On 05/24/2013 02:51 PM, Jed Brown wrote: >> Michele Rosso writes: >> >>>> With petsc-3.4 (which you should upgrade to), use >>>> -mg_coarse_sub_pc_factor_shift_type NONZERO >> Actually, use this with petsc-3.3 also (and please upgrade to >> petsc-3.4). >> >> The option you were passing was not being used. >> > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Wed May 29 16:29:45 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 29 May 2013 17:29:45 -0400 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <51A67267.4010507@uci.edu> References: <519687DD.4050209@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> Message-ID: On Wed, May 29, 2013 at 5:25 PM, Michele Rosso wrote: > Hi, > > I proceeded as Matt suggested. I am running without nullspace (Dirichlet's > BCs) with the following options: > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view -options_left > > The results are fine and the run finishes. The output of -ksp_view does > not say anything about the coarse solver being shifted. > In fact the option -mg_coarse_sub_pc_factor_shift_type NONZERO is not > used: > Please show us the COMPLETE output. Thanks, Matt > #PETSc Option Table entries: > -ksp_view > -mg_coarse_sub_pc_factor_shift_type NONZERO > -options_left > -pc_gamg_agg_nsmooths 1 > -pc_mg_cycle_type v > -pc_type gamg > #End of PETSc Option Table entries > There is one unused database option. It is: > Option left: name:-mg_coarse_sub_pc_factor_shift_type value: NONZERO > > Michele > > > On 05/24/2013 03:20 PM, Matthew Knepley wrote: > > On Fri, May 24, 2013 at 5:18 PM, Michele Rosso wrote: > >> Using >> >> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >> -mg_coarse_sub_pc_factor_shift_type NONZERO -option_left -ksp_view >> > > This is what debugging is about. We are not running your problem. How > could this be debugged? > > 1) Run the problem w/o a null space so that it finishes > > 2) Look at the output for -ksp_view > > 3) Does the coarse solver say that it is shifted? > > 4) Are there options which were unused? > > Matt > > >> still produces the same error: >> >> [0]PCSetData_AGG bs=1 MM=131072 >> [0]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: Detected zero pivot in LU factorization: >> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! >> [0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 tolerance >> 2.22045e-14! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 >> CDT 2012 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. 
>> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri May 24 17:06:50 >> 2013 >> [0]PETSC ERROR: Libraries linked from >> [0]PETSC ERROR: Configure run at >> [0]PETSC ERROR: Configure options >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: MatPivotCheck_none() line 583 in >> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >> [0]PETSC ERROR: MatPivotCheck() line 602 in >> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >> [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in >> src/mat/impls/aij/seq/aijfact.c >> [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in >> src/mat/interface/matrix.c >> [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c >> [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in >> src/ksp/pc/impls/bjacobi/bjacobi.c >> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >> src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >> src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in >> src/ksp/pc/impls/bjacobi/bjacobi.c >> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >> src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >> src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c >> [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c >> >> On 05/24/2013 02:51 PM, Jed Brown wrote: >> >> Michele Rosso writes: >> >> >> With petsc-3.4 (which you should upgrade to), use >> -mg_coarse_sub_pc_factor_shift_type NONZERO >> >> Actually, use this with petsc-3.3 also (and please upgrade to >> petsc-3.4). >> >> The option you were passing was not being used. >> >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mrosso at uci.edu Wed May 29 18:22:49 2013 From: mrosso at uci.edu (Michele Rosso) Date: Wed, 29 May 2013 16:22:49 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: References: <519687DD.4050209@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> Message-ID: <51A68DC9.4040602@uci.edu> I attached the complete output. I used 2 processors for this run. I wanted to use only one to have a cleaner output but I could not because of this error: [0]PCSetData_AGG bs=1 MM=32768 [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Arguments are incompatible! [0]PETSC ERROR: MatMatMult requires A, mpiaij, to be compatible with B, seqaij! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./hit on a named nid15343 by Unknown Wed May 29 18:15:24 2013 [0]PETSC ERROR: Libraries linked from [0]PETSC ERROR: Configure run at [0]PETSC ERROR: Configure options [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: MatMatMult() line 8614 in src/mat/interface/matrix.c [0]PETSC ERROR: PCGAMGOptprol_AGG() line 1358 in src/ksp/pc/impls/gamg/agg.c [0]PETSC ERROR: PCSetUp_GAMG() line 673 in src/ksp/pc/impls/gamg/gamg.c [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c hit: gamg.c:568: PCSetUp_GAMG: Assertion `pc->setupcalled' failed. On 05/29/2013 02:29 PM, Matthew Knepley wrote: > On Wed, May 29, 2013 at 5:25 PM, Michele Rosso > wrote: > > Hi, > > I proceeded as Matt suggested. I am running without nullspace > (Dirichlet's BCs) with the following options: > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view -options_left > > The results are fine and the run finishes. The output of -ksp_view > does not say anything about the coarse solver being shifted. > In fact the option -mg_coarse_sub_pc_factor_shift_type NONZERO is > not used: > > > Please show us the COMPLETE output. > > Thanks, > > Matt > > #PETSc Option Table entries: > -ksp_view > -mg_coarse_sub_pc_factor_shift_type NONZERO > -options_left > -pc_gamg_agg_nsmooths 1 > -pc_mg_cycle_type v > -pc_type gamg > #End of PETSc Option Table entries > There is one unused database option. 
It is: > Option left: name:-mg_coarse_sub_pc_factor_shift_type value: NONZERO > > Michele > > > On 05/24/2013 03:20 PM, Matthew Knepley wrote: >> On Fri, May 24, 2013 at 5:18 PM, Michele Rosso > > wrote: >> >> Using >> >> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >> -mg_coarse_sub_pc_factor_shift_type NONZERO -option_left >> -ksp_view >> >> >> This is what debugging is about. We are not running your problem. >> How could this be debugged? >> >> 1) Run the problem w/o a null space so that it finishes >> >> 2) Look at the output for -ksp_view >> >> 3) Does the coarse solver say that it is shifted? >> >> 4) Are there options which were unused? >> >> Matt >> >> still produces the same error: >> >> [0]PCSetData_AGG bs=1 MM=131072 >> [0]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: Detected zero pivot in LU factorization: >> see >> http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! >> [0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 >> tolerance 2.22045e-14! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug >> 29 11:26:24 CDT 2012 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble >> shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri May >> 24 17:06:50 2013 >> [0]PETSC ERROR: Libraries linked from >> [0]PETSC ERROR: Configure run at >> [0]PETSC ERROR: Configure options >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: MatPivotCheck_none() line 583 in >> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >> [0]PETSC ERROR: MatPivotCheck() line 602 in >> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >> [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in >> src/mat/impls/aij/seq/aijfact.c >> [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in >> src/mat/interface/matrix.c >> [0]PETSC ERROR: PCSetUp_LU() line 160 in >> src/ksp/pc/impls/factor/lu/lu.c >> [0]PETSC ERROR: PCSetUp() line 832 in >> src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSetUp() line 278 in >> src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line >> 715 in src/ksp/pc/impls/bjacobi/bjacobi.c >> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >> src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >> src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: KSPSolve() line 403 in >> src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line >> 715 in src/ksp/pc/impls/bjacobi/bjacobi.c >> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >> src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >> src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: KSPSolve() line 403 in >> src/ksp/ksp/interface/itfunc.c >> [0]PETSC ERROR: PCMGMCycle_Private() line 20 in >> src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in >> src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in >> src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in >> 
src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c >> [0]PETSC ERROR: PCApply() line 384 in >> src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSolve_CG() line 139 in >> src/ksp/ksp/impls/cg/cg.c >> [0]PETSC ERROR: KSPSolve() line 446 in >> src/ksp/ksp/interface/itfunc.c >> >> On 05/24/2013 02:51 PM, Jed Brown wrote: >>> Michele Rosso writes: >>> >>>>> With petsc-3.4 (which you should upgrade to), use >>>>> -mg_coarse_sub_pc_factor_shift_type NONZERO >>> Actually, use this with petsc-3.3 also (and please upgrade to >>> petsc-3.4). >>> >>> The option you were passing was not being used. >>> >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to >> which their experiments lead. >> -- Norbert Wiener > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- [0]PCSetData_AGG bs=1 MM=16384 KSP Object: 2 MPI processes type: cg maximum iterations=10000 tolerances: relative=1e-14, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using PRECONDITIONED norm type for convergence test PC Object: 2 MPI processes type: gamg MG: type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using Galerkin computed coarse grid matrices Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 2 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 2 MPI processes type: bjacobi block Jacobi: number of blocks = 2 Local solve info for each block is in the following KSP and PC objects: [0] number of local blocks = 1, first local block number = 0 [0] local block number 0 KSP Object: KSP Object: (mg_coarse_sub_) (mg_coarse_sub_) 1 MPI processes 1 MPI processes type: preonly type: preonly maximum iterations=10000, initial guess is zero maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning left preconditioning using NONE norm type for convergence test using NONE norm type for convergence test PC Object:PC Object: (mg_coarse_sub_)(mg_coarse_sub_) 1 MPI processes 1 MPI processes type: lu type: lu LU: out-of-place factorization LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 tolerance for zero pivot 2.22045e-14 matrix ordering: nd matrix ordering: nd factor fill ratio given 5, needed 1.07862 factor fill ratio given 5, needed 0 Factored matrix follows: Factored matrix follows: Matrix Object:Matrix Object: 1 MPI processes 1 MPI processes type: seqaij type: seqaij rows=61, cols=61 rows=0, cols=0 package used to perform factorization: petsc package used to perform factorization: petsc total: nonzeros=3677, allocated nonzeros=3677 total: nonzeros=1, allocated nonzeros=1 total number of mallocs used during MatSetValues calls =0 total number of mallocs used during MatSetValues calls =0 not using I-node routines 
not using I-node routines linear system matrix = precond matrix: linear system matrix = precond matrix: Matrix Object:Matrix Object: 1 MPI processes 1 MPI processes type: seqaij type: seqaij rows=61, cols=61 rows=0, cols=0 total: nonzeros=3409, allocated nonzeros=3409 total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 total number of mallocs used during MatSetValues calls =0 not using I-node routines not using I-node routines - - - - - - - - - - - - - - - - - - [1] number of local blocks = 1, first local block number = 1 [1] local block number 0 - - - - - - - - - - - - - - - - - - linear system matrix = precond matrix: Matrix Object: 2 MPI processes type: mpiaij rows=61, cols=61 total: nonzeros=3409, allocated nonzeros=3409 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 2 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.0266892, max = 1.42507 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_1_) 2 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 2 MPI processes type: mpiaij rows=3102, cols=3102 total: nonzeros=108330, allocated nonzeros=108330 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 2 MPI processes type: chebyshev Chebyshev: eigenvalue estimates: min = 0.182939, max = 2.0291 maximum iterations=2 tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using nonzero initial guess using NONE norm type for convergence test PC Object: (mg_levels_2_) 2 MPI processes type: jacobi linear system matrix = precond matrix: Matrix Object: 2 MPI processes type: mpiaij rows=32768, cols=32768 total: nonzeros=229376, allocated nonzeros=229376 total number of mallocs used during MatSetValues calls =0 Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Matrix Object: 2 MPI processes type: mpiaij rows=32768, cols=32768 total: nonzeros=229376, allocated nonzeros=229376 total number of mallocs used during MatSetValues calls =0 #PETSc Option Table entries: -ksp_view -mg_coarse_sub_pc_factor_shift_type NONZERO -options_left -pc_gamg_agg_nsmooths 1 -pc_mg_cycle_type v -pc_type gamg #End of PETSc Option Table entries There is one unused database option. 
It is: Option left: name:-mg_coarse_sub_pc_factor_shift_type value: NONZERO From knepley at gmail.com Wed May 29 19:17:07 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 29 May 2013 20:17:07 -0400 Subject: Re: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <51A68DC9.4040602@uci.edu> References: <519687DD.4050209@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> <51A68DC9.4040602@uci.edu> Message-ID: On Wed, May 29, 2013 at 7:22 PM, Michele Rosso wrote: > I attached the complete output. > I used 2 processors for this run. I wanted to use only one to have a > cleaner output but I could not because of this error: > There are 2 problems here: 1) GAMG is not accepting options for the coarse grid solver 2) You do not appear to be using PETSc 3.4, or the shift you are trying to set would be there by default Matt > [0]PCSetData_AGG bs=1 MM=32768 > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Arguments are incompatible! > [0]PETSC ERROR: MatMatMult requires A, mpiaij, to be compatible with B, > seqaij! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 > CDT 2012 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./hit on a named nid15343 by Unknown Wed May 29 18:15:24 > 2013 > [0]PETSC ERROR: Libraries linked from > [0]PETSC ERROR: Configure run at > [0]PETSC ERROR: Configure options > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: MatMatMult() line 8614 in src/mat/interface/matrix.c > [0]PETSC ERROR: PCGAMGOptprol_AGG() line 1358 in > src/ksp/pc/impls/gamg/agg.c > [0]PETSC ERROR: PCSetUp_GAMG() line 673 in src/ksp/pc/impls/gamg/gamg.c > [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c > hit: gamg.c:568: PCSetUp_GAMG: Assertion `pc->setupcalled' failed. > > On 05/29/2013 02:29 PM, Matthew Knepley wrote: > > On Wed, May 29, 2013 at 5:25 PM, Michele Rosso wrote: > >> Hi, >> >> I proceeded as Matt suggested. I am running without nullspace >> (Dirichlet's BCs) with the following options: >> >> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >> -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view -options_left >> >> The results are fine and the run finishes. The output of -ksp_view does >> not say anything about the coarse solver being shifted. >> In fact the option -mg_coarse_sub_pc_factor_shift_type NONZERO is not >> used: >> > > Please show us the COMPLETE output. 
> > Thanks, > > Matt > > >> #PETSc Option Table entries: >> -ksp_view >> -mg_coarse_sub_pc_factor_shift_type NONZERO >> -options_left >> -pc_gamg_agg_nsmooths 1 >> -pc_mg_cycle_type v >> -pc_type gamg >> #End of PETSc Option Table entries >> There is one unused database option. It is: >> Option left: name:-mg_coarse_sub_pc_factor_shift_type value: NONZERO >> >> Michele >> >> >> On 05/24/2013 03:20 PM, Matthew Knepley wrote: >> >> On Fri, May 24, 2013 at 5:18 PM, Michele Rosso wrote: >> >>> Using >>> >>> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >>> -mg_coarse_sub_pc_factor_shift_type NONZERO -option_left -ksp_view >>> >> >> This is what debugging is about. We are not running your problem. How >> could this be debugged? >> >> 1) Run the problem w/o a null space so that it finishes >> >> 2) Look at the output for -ksp_view >> >> 3) Does the coarse solver say that it is shifted? >> >> 4) Are there options which were unused? >> >> Matt >> >> >>> still produces the same error: >>> >>> [0]PCSetData_AGG bs=1 MM=131072 >>> [0]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [0]PETSC ERROR: Detected zero pivot in LU factorization: >>> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! >>> [0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 tolerance >>> 2.22045e-14! >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 >>> 11:26:24 CDT 2012 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri May 24 >>> 17:06:50 2013 >>> [0]PETSC ERROR: Libraries linked from >>> [0]PETSC ERROR: Configure run at >>> [0]PETSC ERROR: Configure options >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: MatPivotCheck_none() line 583 in >>> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >>> [0]PETSC ERROR: MatPivotCheck() line 602 in >>> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >>> [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in >>> src/mat/impls/aij/seq/aijfact.c >>> [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in >>> src/mat/interface/matrix.c >>> [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c >>> [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in >>> src/ksp/pc/impls/bjacobi/bjacobi.c >>> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >>> src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >>> src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in >>> src/ksp/pc/impls/bjacobi/bjacobi.c >>> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >>> src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >>> src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: KSPSolve() line 403 in 
src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c >>> [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c >>> >>> On 05/24/2013 02:51 PM, Jed Brown wrote: >>> >>> Michele Rosso writes: >>> >>> >>> With petsc-3.4 (which you should upgrade to), use >>> -mg_coarse_sub_pc_factor_shift_type NONZERO >>> >>> Actually, use this with petsc-3.3 also (and please upgrade to >>> petsc-3.4). >>> >>> The option you were passing was not being used. >>> >>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL:
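An aside for readers of the archive: until the coarse-grid options prefix works, the same shift can be requested through the API once the GAMG hierarchy exists. A hedged sketch, not from the thread, assuming the default bjacobi/LU coarse solver shown in the -ksp_view output above and a fully configured KSP named ksp:

KSP coarse, *subksp;
PC pcmg, coarsepc, subpc;
PetscInt nlocal, first;
ierr = KSPSetUp(ksp); CHKERRQ(ierr); /* builds the GAMG hierarchy and its coarse KSP */
ierr = KSPGetPC(ksp, &pcmg); CHKERRQ(ierr);
ierr = PCMGGetCoarseSolve(pcmg, &coarse); CHKERRQ(ierr);
ierr = KSPGetPC(coarse, &coarsepc); CHKERRQ(ierr);
/* PCBJacobiGetSubKSP requires the block-Jacobi PC to have been set up already */
ierr = PCBJacobiGetSubKSP(coarsepc, &nlocal, &first, &subksp); CHKERRQ(ierr);
ierr = KSPGetPC(subksp[0], &subpc); CHKERRQ(ierr);
ierr = PCFactorSetShiftType(subpc, MAT_SHIFT_NONZERO); CHKERRQ(ierr);

Whether the sub-PC is reachable this way before the first solve depends on when the coarse solver gets set up, so treat this as a starting point rather than a guaranteed recipe.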
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 > 11:26:24 CDT 2012 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./hit on a named nid15343 by Unknown Wed May 29 > 18:15:24 2013 > [0]PETSC ERROR: Libraries linked from > [0]PETSC ERROR: Configure run at > [0]PETSC ERROR: Configure options > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: MatMatMult() line 8614 in src/mat/interface/matrix.c > [0]PETSC ERROR: PCGAMGOptprol_AGG() line 1358 in > src/ksp/pc/impls/gamg/agg.c > [0]PETSC ERROR: PCSetUp_GAMG() line 673 in > src/ksp/pc/impls/gamg/gamg.c > [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c > [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c > hit: gamg.c:568: PCSetUp_GAMG: Assertion `pc->setupcalled' failed. > > On 05/29/2013 02:29 PM, Matthew Knepley wrote: >> On Wed, May 29, 2013 at 5:25 PM, Michele Rosso > > wrote: >> >> Hi, >> >> I proceeded as Matt suggested. I am running without nullspace >> (Dirichlet's BCs) with the following options: >> >> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >> -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view >> -options_left >> >> The results are fine and the run finishes. The output of >> -ksp_view does not say anything about the coarse solver being >> shifted. >> In fact the option -mg_coarse_sub_pc_factor_shift_type >> NONZERO is not used: >> >> >> Please show us the COMPLETE output. >> >> Thanks, >> >> Matt >> >> #PETSc Option Table entries: >> -ksp_view >> -mg_coarse_sub_pc_factor_shift_type NONZERO >> -options_left >> -pc_gamg_agg_nsmooths 1 >> -pc_mg_cycle_type v >> -pc_type gamg >> #End of PETSc Option Table entries >> There is one unused database option. It is: >> Option left: name:-mg_coarse_sub_pc_factor_shift_type value: >> NONZERO >> >> Michele >> >> >> On 05/24/2013 03:20 PM, Matthew Knepley wrote: >>> On Fri, May 24, 2013 at 5:18 PM, Michele Rosso >>> > wrote: >>> >>> Using >>> >>> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths >>> 1 -mg_coarse_sub_pc_factor_shift_type NONZERO >>> -option_left -ksp_view >>> >>> >>> This is what debugging is about. We are not running your >>> problem. How could this be debugged? >>> >>> 1) Run the problem w/o a null space so that it finishes >>> >>> 2) Look at the output for -ksp_view >>> >>> 3) Does the coarse solver say that it is shifted? >>> >>> 4) Are there options which were unused? >>> >>> Matt >>> >>> still produces the same error: >>> >>> [0]PCSetData_AGG bs=1 MM=131072 >>> [0]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [0]PETSC ERROR: Detected zero pivot in LU factorization: >>> see >>> http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! >>> [0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 >>> tolerance 2.22045e-14! >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, >>> Wed Aug 29 11:26:24 CDT 2012 >>> [0]PETSC ERROR: See docs/changes/index.html for recent >>> updates. 
>>> [0]PETSC ERROR: See docs/faq.html for hints about >>> trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri >>> May 24 17:06:50 2013 >>> [0]PETSC ERROR: Libraries linked from >>> [0]PETSC ERROR: Configure run at >>> [0]PETSC ERROR: Configure options >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: MatPivotCheck_none() line 583 in >>> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >>> [0]PETSC ERROR: MatPivotCheck() line 602 in >>> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >>> [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in >>> src/mat/impls/aij/seq/aijfact.c >>> [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in >>> src/mat/interface/matrix.c >>> [0]PETSC ERROR: PCSetUp_LU() line 160 in >>> src/ksp/pc/impls/factor/lu/lu.c >>> [0]PETSC ERROR: PCSetUp() line 832 in >>> src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSetUp() line 278 in >>> src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() >>> line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c >>> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >>> src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >>> src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: KSPSolve() line 403 in >>> src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() >>> line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c >>> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >>> src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >>> src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: KSPSolve() line 403 in >>> src/ksp/ksp/interface/itfunc.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 20 in >>> src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in >>> src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in >>> src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in >>> src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCApply_MG() line 326 in >>> src/ksp/pc/impls/mg/mg.c >>> [0]PETSC ERROR: PCApply() line 384 in >>> src/ksp/pc/interface/precon.c >>> [0]PETSC ERROR: KSPSolve_CG() line 139 in >>> src/ksp/ksp/impls/cg/cg.c >>> [0]PETSC ERROR: KSPSolve() line 446 in >>> src/ksp/ksp/interface/itfunc.c >>> >>> On 05/24/2013 02:51 PM, Jed Brown wrote: >>>> Michele Rosso writes: >>>> >>>>>> With petsc-3.4 (which you should upgrade to), use >>>>>> -mg_coarse_sub_pc_factor_shift_type NONZERO >>>> Actually, use this with petsc-3.3 also (and please upgrade to >>>> petsc-3.4). >>>> >>>> The option you were passing was not being used. >>>> >>> >>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin >>> their experiments is infinitely more interesting than any >>> results to which their experiments lead. >>> -- Norbert Wiener >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to >> which their experiments lead. >> -- Norbert Wiener > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. 
> -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed May 29 20:08:05 2013 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 29 May 2013 21:08:05 -0400 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <51A69D74.9030401@uci.edu> References: <519687DD.4050209@uci.edu> <87sj1lnw89.fsf@mcs.anl.gov> <5196AB98.2050800@uci.edu> <87a9ntntfg.fsf@mcs.anl.gov> <5196B421.8070302@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> <51A68DC9.4040602@uci.edu> <51A69D74.9030401@uci.edu> Message-ID: On Wed, May 29, 2013 at 8:29 PM, Michele Rosso wrote: > 1) How do I solve this? > We will fix this, but any fix will necessitate upgrading. 2) So upgrading to version 3.4 would take care of the shift without the > need to specify any option? > Yes Matt > Michele > > On 05/29/2013 05:17 PM, Matthew Knepley wrote: > > On Wed, May 29, 2013 at 7:22 PM, Michele Rosso wrote: > >> I attached the complete output. >> I used 2 processors for this run. I wanted to use only one to have a >> cleaner output but I could not because of this error: >> > > There are 2 problems here: > > 1) GAMG is not accepting options for the coarse grid solver > > 2) You do not appear to using PETSc 3.4, or the shift you are trying > to set would be there by default > > Matt > > >> [0]PCSetData_AGG bs=1 MM=32768 >> [0]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: Arguments are incompatible! >> [0]PETSC ERROR: MatMatMult requires A, mpiaij, to be compatible with B, >> seqaij! >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 >> CDT 2012 >> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0]PETSC ERROR: See docs/index.html for manual pages. >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: ./hit on a named nid15343 by Unknown Wed May 29 18:15:24 >> 2013 >> [0]PETSC ERROR: Libraries linked from >> [0]PETSC ERROR: Configure run at >> [0]PETSC ERROR: Configure options >> [0]PETSC ERROR: >> ------------------------------------------------------------------------ >> [0]PETSC ERROR: MatMatMult() line 8614 in src/mat/interface/matrix.c >> [0]PETSC ERROR: PCGAMGOptprol_AGG() line 1358 in >> src/ksp/pc/impls/gamg/agg.c >> [0]PETSC ERROR: PCSetUp_GAMG() line 673 in src/ksp/pc/impls/gamg/gamg.c >> [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c >> [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c >> hit: gamg.c:568: PCSetUp_GAMG: Assertion `pc->setupcalled' failed. >> >> On 05/29/2013 02:29 PM, Matthew Knepley wrote: >> >> On Wed, May 29, 2013 at 5:25 PM, Michele Rosso wrote: >> >>> Hi, >>> >>> I proceeded as Matt suggested. 
I am running without nullspace >>> (Dirichlet's BCs) with the following options: >>> >>> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >>> -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view -options_left >>> >>> The results are fine and the run finishes. The output of -ksp_view does >>> not say anything about the coarse solver being shifted. >>> In fact the option -mg_coarse_sub_pc_factor_shift_type NONZERO is not >>> used: >>> >> >> Please show us the COMPLETE output. >> >> Thanks, >> >> Matt >> >> >>> #PETSc Option Table entries: >>> -ksp_view >>> -mg_coarse_sub_pc_factor_shift_type NONZERO >>> -options_left >>> -pc_gamg_agg_nsmooths 1 >>> -pc_mg_cycle_type v >>> -pc_type gamg >>> #End of PETSc Option Table entries >>> There is one unused database option. It is: >>> Option left: name:-mg_coarse_sub_pc_factor_shift_type value: NONZERO >>> >>> Michele >>> >>> >>> On 05/24/2013 03:20 PM, Matthew Knepley wrote: >>> >>> On Fri, May 24, 2013 at 5:18 PM, Michele Rosso wrote: >>> >>>> Using >>>> >>>> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >>>> -mg_coarse_sub_pc_factor_shift_type NONZERO -option_left -ksp_view >>>> >>> >>> This is what debugging is about. We are not running your problem. How >>> could this be debugged? >>> >>> 1) Run the problem w/o a null space so that it finishes >>> >>> 2) Look at the output for -ksp_view >>> >>> 3) Does the coarse solver say that it is shifted? >>> >>> 4) Are there options which were unused? >>> >>> Matt >>> >>> >>>> still produces the same error: >>>> >>>> [0]PCSetData_AGG bs=1 MM=131072 >>>> [0]PETSC ERROR: --------------------- Error Message >>>> ------------------------------------ >>>> [0]PETSC ERROR: Detected zero pivot in LU factorization: >>>> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot! >>>> [0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 tolerance >>>> 2.22045e-14! >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 >>>> 11:26:24 CDT 2012 >>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri May 24 >>>> 17:06:50 2013 >>>> [0]PETSC ERROR: Libraries linked from >>>> [0]PETSC ERROR: Configure run at >>>> [0]PETSC ERROR: Configure options >>>> [0]PETSC ERROR: >>>> ------------------------------------------------------------------------ >>>> [0]PETSC ERROR: MatPivotCheck_none() line 583 in >>>> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >>>> [0]PETSC ERROR: MatPivotCheck() line 602 in >>>> src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h >>>> [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in >>>> src/mat/impls/aij/seq/aijfact.c >>>> [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in >>>> src/mat/interface/matrix.c >>>> [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c >>>> [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c >>>> [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c >>>> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in >>>> src/ksp/pc/impls/bjacobi/bjacobi.c >>>> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >>>> src/ksp/pc/interface/precon.c >>>> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >>>> src/ksp/ksp/interface/itfunc.c >>>> [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c >>>> [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in >>>> src/ksp/pc/impls/bjacobi/bjacobi.c >>>> [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in >>>> src/ksp/pc/interface/precon.c >>>> [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in >>>> src/ksp/ksp/interface/itfunc.c >>>> [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c >>>> [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c >>>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >>>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >>>> [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c >>>> [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c >>>> [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c >>>> [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c >>>> [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c >>>> >>>> On 05/24/2013 02:51 PM, Jed Brown wrote: >>>> >>>> Michele Rosso writes: >>>> >>>> >>>> With petsc-3.4 (which you should upgrade to), use >>>> -mg_coarse_sub_pc_factor_shift_type NONZERO >>>> >>>> Actually, use this with petsc-3.3 also (and please upgrade to >>>> petsc-3.4). >>>> >>>> The option you were passing was not being used. >>>> >>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Wed May 29 23:05:01 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Wed, 29 May 2013 23:05:01 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <51A67267.4010507@uci.edu> References: <519687DD.4050209@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> Message-ID: <8761y1w1si.fsf@mcs.anl.gov> Michele Rosso writes: > Hi, > > I proceeded as Matt suggested. I am running without nullspace > (Dirichlet's BCs) with the following options: > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view -options_left > > The results are fine and the run finishes. The output of -ksp_view does > not say anything about the coarse solver being shifted. With petsc-3.4, it has these lines: tolerance for zero pivot 2.22045e-14 using diagonal shift on blocks to prevent zero pivot You can try a proposed fix in the branch 'jed/fix-gamg-coarse'. Hong, I think this diagnostic output is unintuitive for a user that doesn't know this relation. Shall we add the enum names to the output lines below? if (factor->info.shifttype==(PetscReal)MAT_SHIFT_POSITIVE_DEFINITE) { ierr = PetscViewerASCIIPrintf(viewer," using Manteuffel shift\n");CHKERRQ(ierr); } if (factor->info.shifttype==(PetscReal)MAT_SHIFT_NONZERO) { ierr = PetscViewerASCIIPrintf(viewer," using diagonal shift to prevent zero pivot\n");CHKERRQ(ierr); } if (factor->info.shifttype==(PetscReal)MAT_SHIFT_INBLOCKS) { ierr = PetscViewerASCIIPrintf(viewer," using diagonal shift on blocks to prevent zero pivot\n");CHKERRQ(ierr); } From fande.kong at colorado.edu Thu May 30 01:10:55 2013 From: fande.kong at colorado.edu (Fande Kong) Date: Thu, 30 May 2013 00:10:55 -0600 Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc? In-Reply-To: <8761y2yluj.fsf@mcs.anl.gov> References: <8761y2yluj.fsf@mcs.anl.gov> Message-ID: Thanks Jed, I understood what you said. If I switch to use another run script which is used to set up two-level preconditioner, I have another question. 
Without changing the source code, I used the following script:

mpirun -n 1 ./ex29 -ksp_type fgmres -pc_type mg -pc_mg_levels 2 -pc_mg_cycle_type v -pc_mg_type multiplicative -mg_levels_1_ksp_type richardson -mg_levels_1_ksp_max_it 1 -mg_levels_1_pc_type asm -mg_levels_1_sub_ksp_type preonly -mg_levels_1_sub_pc_type ilu -mg_levels_1_sub_pc_factor_levels 2 -mg_levels_1_sub_pc_factor_mat_ordering_type rcm -mg_coarse_ksp_type gmres -mg_coarse_ksp_rtol 0.1 -mg_coarse_ksp_max_it 10 -mg_coarse_pc_type asm -mg_coarse_sub_ksp_type preonly -mg_coarse_sub_pc_type ilu -mg_coarse_sub_pc_factor_levels 2 -mg_coarse_sub_pc_factor_mat_ordering_type rcm -ksp_view -pc_mg_log -da_refine 1 -log_summary >printresult

then I got:

4653296 bytes MatILUFactorSymbolic_SeqAIJ() line 1867 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c
4653296 bytes MatILUFactorSymbolic_SeqAIJ() line 1840 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c
2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
1167392 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
1167392 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
1165376 bytes MatILUFactorSymbolic_SeqAIJ() line 1840 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c
1165360 bytes MatILUFactorSymbolic_SeqAIJ() line 1867 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c
651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c
520208 bytes VecScatterCreate() line 1008 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/utils/vscat.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c
520208 bytes VecCreate_Seq() line 40 in /home/fdkong/math/petsc-3.3-p7/src/vec/vec/impls/seq/bvec3.c

In this case, we have four MatSeqAIJSetPreallocation calls (for the fine matrix and the coarse matrix, respectively) rather than two. I want to set up the preconditioner from options. Is there something wrong with the script?
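For reference, here is a minimal sketch of what I mean by measuring the setup phase (illustrative only - it assumes a KSP object named ksp that was already configured, e.g. via KSPSetFromOptions, and with an optimized build it should be run with -malloc so the counter is active):

#include <petscksp.h>

/* Report how many PetscMalloc'd bytes KSPSetUp() adds; the MG levels,
   ASM overlap matrices, and ILU factors are all built during setup. */
PetscErrorCode ReportSetupMemory(KSP ksp)
{
  PetscLogDouble before,after;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PetscMallocGetCurrentUsage(&before);CHKERRQ(ierr);
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);
  ierr = PetscMallocGetCurrentUsage(&after);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"KSPSetUp() allocated %g bytes\n",(double)(after-before));CHKERRQ(ierr);
  PetscFunctionReturn(0);
}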
On Wed, May 29, 2013 at 7:08 AM, Jed Brown wrote: > Fande Kong writes: > > > Last, I got: > > > > > > 5279056 bytes MatLUFactorSymbolic_SeqAIJ() line 380 in > > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > > 5279040 bytes MatLUFactorSymbolic_SeqAIJ() line 392 in > > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aijfact.c > > 2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > > 2592848 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > > 1167392 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > > 1167392 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > > 651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > > 651264 bytes MatSeqAIJSetPreallocation_SeqAIJ() line 3439 in > > /home/fdkong/math/petsc-3.3-p7/src/mat/impls/aij/seq/aij.c > > Fine-grid matrix, interpolation, coarse grid matrix, and LU factors for > coarse grid matrix. Each of these has two large allocations when > running in debug mode: > > ierr = > PetscMalloc3(nz,PetscScalar,&b->a,nz,PetscInt,&b->j,B->rmap->n+1,PetscInt,&b->i);CHKERRQ(ierr); > > In optimized mode (which you should *always* use when measuring > performance), PetscMalloc3 is implemented with only one malloc. > > Add '-malloc' to get a tracing malloc when running with an optimized > build of PETSc. > > -- Fande Kong Department of Computer Science University of Colorado at Boulder -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Thu May 30 07:31:34 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Thu, 30 May 2013 07:31:34 -0500 Subject: [petsc-users] How to measure the memory usage of the application built on the Petsc? In-Reply-To: References: <8761y2yluj.fsf@mcs.anl.gov> Message-ID: <87wqqgvec9.fsf@mcs.anl.gov> Fande Kong writes: > In this case, we have four MatSeqAIJSetPreallocations ( for fine matrix and > coarse matrix respectively) rather than two. I want to set up > preconditioner from options. Are there some things wrong for the script? This is expected since ASM also needs the overlapping matrices. From ztdepyahoo at 163.com Thu May 30 10:24:21 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Thu, 30 May 2013 23:24:21 +0800 (CST) Subject: [petsc-users] VecGhostGetLocalForm(gx,&lx) Message-ID: <8a084cd.27c0f.13ef60a2c4b.Coremail.ztdepyahoo@163.com> in the VecGhostGetLocalForm(gx,&lx), does the lx reduplicate the memory of global vector gx? -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu May 30 10:27:37 2013 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 30 May 2013 11:27:37 -0400 Subject: [petsc-users] VecGhostGetLocalForm(gx,&lx) In-Reply-To: <8a084cd.27c0f.13ef60a2c4b.Coremail.ztdepyahoo@163.com> References: <8a084cd.27c0f.13ef60a2c4b.Coremail.ztdepyahoo@163.com> Message-ID: On Thu, May 30, 2013 at 11:24 AM, ??? wrote: > in the VecGhostGetLocalForm(gx,&lx), does the lx reduplicate the memory of > global vector gx? > No Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Thu May 30 13:43:22 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 30 May 2013 13:43:22 -0500 (CDT) Subject: [petsc-users] Petsc configuration failure in windows 7 x64 In-Reply-To: <6E78EA6A-0550-481C-B76B-654BE3ED8E9E@mcs.anl.gov> References: <874ndly10j.fsf@mcs.anl.gov> <6E78EA6A-0550-481C-B76B-654BE3ED8E9E@mcs.anl.gov> Message-ID: Its not clear if the issue is one of these subroutines or - all of them being together. But spliting up the files might be good anyway. $ wc -l *.c | sort -nr 20605 total 6309 baijfact2.c 3640 baij.c 2140 baij2.c 1889 baijfact11.c 1787 baijfact.c 787 baijfact5.c 677 baijfact7.c 608 baijfact9.c 498 baijfact3.c 462 baijfact13.c 441 dgefa2.c 394 dgefa4.c 155 dgefa7.c 151 dgefa6.c 149 dgefa5.c 146 dgefa3.c 104 aijbaij.c 93 baijfact4.c 88 dgefa.c 87 dgedi.c I see the following routines. I can split them as follows. [Or should I be spliting them up differently?] thanks, Satish **************** baijfact2.c ******************* #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ" #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ_ilu0" #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ_inplace" #define __FUNCT__ "MatLUFactorNumeric_SeqBAIJ_15_NaturalOrdering" #define __FUNCT__ "MatLUFactorNumeric_SeqBAIJ_N" #define __FUNCT__ "MatSetUnfactored_SeqBAIJ_4_NaturalOrdering_SSE" #define __FUNCT__ "MatSetUnfactored_SeqBAIJ_4_NaturalOrdering_SSE_usj" **************** baijsolv.c ******************* #define __FUNCT__ "MatSolve_SeqBAIJ_1" #define __FUNCT__ "MatSolve_SeqBAIJ_1_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_2" #define __FUNCT__ "MatSolve_SeqBAIJ_2_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_3" #define __FUNCT__ "MatSolve_SeqBAIJ_3_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_4" #define __FUNCT__ "MatSolve_SeqBAIJ_4_Demotion" #define __FUNCT__ "MatSolve_SeqBAIJ_4_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_4_SSE_Demotion" #define __FUNCT__ "MatSolve_SeqBAIJ_5" #define __FUNCT__ "MatSolve_SeqBAIJ_5_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_6" #define __FUNCT__ "MatSolve_SeqBAIJ_6_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_7" #define __FUNCT__ "MatSolve_SeqBAIJ_7_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_N_inplace" **************** baijsolvnat.c ******************* #define __FUNCT__ "MatSolve_SeqBAIJ_15_NaturalOrdering_ver1" #define __FUNCT__ "MatSolve_SeqBAIJ_15_NaturalOrdering_ver2" #define __FUNCT__ "MatSolve_SeqBAIJ_1_NaturalOrdering" #define __FUNCT__ "MatSolve_SeqBAIJ_1_NaturalOrdering_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_2_NaturalOrdering" #define __FUNCT__ "MatSolve_SeqBAIJ_2_NaturalOrdering_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_3_NaturalOrdering" #define __FUNCT__ "MatSolve_SeqBAIJ_3_NaturalOrdering_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering" #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_Demotion" #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_SSE_Demotion" #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_SSE_Demotion_usj" #define __FUNCT__ "MatSolve_SeqBAIJ_5_NaturalOrdering" #define __FUNCT__ "MatSolve_SeqBAIJ_5_NaturalOrdering_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_6_NaturalOrdering" #define __FUNCT__ "MatSolve_SeqBAIJ_6_NaturalOrdering_inplace" #define __FUNCT__ "MatSolve_SeqBAIJ_7_NaturalOrdering" #define __FUNCT__ "MatSolve_SeqBAIJ_7_NaturalOrdering_inplace" **************** baijsolvtran.c 
******************* #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_N" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_N_inplace" **************** baijsolvtrannat.c ******************* #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_NaturalOrdering" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_NaturalOrdering_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_NaturalOrdering" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_NaturalOrdering_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_NaturalOrdering" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_NaturalOrdering_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_NaturalOrdering" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_NaturalOrdering_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_NaturalOrdering" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_NaturalOrdering_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_NaturalOrdering" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_NaturalOrdering_inplace" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7_NaturalOrdering" #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7_NaturalOrdering_inplace" On Wed, 29 May 2013, Barry Smith wrote: > > Satish, > > There appear to be about 70 functions in 6000 lines of code in that file. Please branch off of maint and split it into at least 7 files then test and eventually get it back into maint and master. > > Thanks > > Barry > > On May 29, 2013, at 3:38 PM, Jed Brown wrote: > > > Satish Balay writes: > > > >> This piece of code does stress compilers in complex/optimized > >> mode. When I tried previously - I couldn't reproduce the problem [with > >> VS2012] > > > > We should shorten the file. > > From bsmith at mcs.anl.gov Thu May 30 13:58:54 2013 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 30 May 2013 13:58:54 -0500 Subject: [petsc-users] Petsc configuration failure in windows 7 x64 In-Reply-To: References: <874ndly10j.fsf@mcs.anl.gov> <6E78EA6A-0550-481C-B76B-654BE3ED8E9E@mcs.anl.gov> Message-ID: <079C25A8-C33D-4D8E-9029-6235CCF5C407@mcs.anl.gov> On May 30, 2013, at 1:43 PM, Satish Balay wrote: > > I see the following routines. I can split them as follows. [Or should > I be spliting them up differently?] 
That is a fine way to start > > thanks, > Satish > > **************** baijfact2.c ******************* > #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ" > #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ_ilu0" > #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ_inplace" > #define __FUNCT__ "MatLUFactorNumeric_SeqBAIJ_15_NaturalOrdering" > #define __FUNCT__ "MatLUFactorNumeric_SeqBAIJ_N" > #define __FUNCT__ "MatSetUnfactored_SeqBAIJ_4_NaturalOrdering_SSE" > #define __FUNCT__ "MatSetUnfactored_SeqBAIJ_4_NaturalOrdering_SSE_usj" > > **************** baijsolv.c ******************* > > #define __FUNCT__ "MatSolve_SeqBAIJ_1" > #define __FUNCT__ "MatSolve_SeqBAIJ_1_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_2" > #define __FUNCT__ "MatSolve_SeqBAIJ_2_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_3" > #define __FUNCT__ "MatSolve_SeqBAIJ_3_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_4" > #define __FUNCT__ "MatSolve_SeqBAIJ_4_Demotion" > #define __FUNCT__ "MatSolve_SeqBAIJ_4_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_4_SSE_Demotion" > #define __FUNCT__ "MatSolve_SeqBAIJ_5" > #define __FUNCT__ "MatSolve_SeqBAIJ_5_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_6" > #define __FUNCT__ "MatSolve_SeqBAIJ_6_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_7" > #define __FUNCT__ "MatSolve_SeqBAIJ_7_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_N_inplace" > > **************** baijsolvnat.c ******************* > #define __FUNCT__ "MatSolve_SeqBAIJ_15_NaturalOrdering_ver1" > #define __FUNCT__ "MatSolve_SeqBAIJ_15_NaturalOrdering_ver2" > #define __FUNCT__ "MatSolve_SeqBAIJ_1_NaturalOrdering" > #define __FUNCT__ "MatSolve_SeqBAIJ_1_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_2_NaturalOrdering" > #define __FUNCT__ "MatSolve_SeqBAIJ_2_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_3_NaturalOrdering" > #define __FUNCT__ "MatSolve_SeqBAIJ_3_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering" > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_Demotion" > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_SSE_Demotion" > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_SSE_Demotion_usj" > #define __FUNCT__ "MatSolve_SeqBAIJ_5_NaturalOrdering" > #define __FUNCT__ "MatSolve_SeqBAIJ_5_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_6_NaturalOrdering" > #define __FUNCT__ "MatSolve_SeqBAIJ_6_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolve_SeqBAIJ_7_NaturalOrdering" > #define __FUNCT__ "MatSolve_SeqBAIJ_7_NaturalOrdering_inplace" > > **************** baijsolvtran.c ******************* > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_N" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_N_inplace" > > **************** 
baijsolvtrannat.c ******************* > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_NaturalOrdering" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_NaturalOrdering" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_NaturalOrdering" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_NaturalOrdering" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_NaturalOrdering" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_NaturalOrdering" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_NaturalOrdering_inplace" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7_NaturalOrdering" > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7_NaturalOrdering_inplace" > > > > On Wed, 29 May 2013, Barry Smith wrote: > >> >> Satish, >> >> There appear to be about 70 functions in 6000 lines of code in that file. Please branch off of maint and split it into at least 7 files then test and eventually get it back into maint and master. >> >> Thanks >> >> Barry >> >> On May 29, 2013, at 3:38 PM, Jed Brown wrote: >> >>> Satish Balay writes: >>> >>>> This piece of code does stress compilers in complex/optimized >>>> mode. When I tried previously - I couldn't reproduce the problem [with >>>> VS2012] >>> >>> We should shorten the file. >> >> > From balay at mcs.anl.gov Thu May 30 15:56:58 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 30 May 2013 15:56:58 -0500 (CDT) Subject: [petsc-users] Petsc configuration failure in windows 7 x64 In-Reply-To: <079C25A8-C33D-4D8E-9029-6235CCF5C407@mcs.anl.gov> References: <874ndly10j.fsf@mcs.anl.gov> <6E78EA6A-0550-481C-B76B-654BE3ED8E9E@mcs.anl.gov> <079C25A8-C33D-4D8E-9029-6235CCF5C407@mcs.anl.gov> Message-ID: pushed 'balay/bajfact-split' as a start. $ wc -l baijfact2.c baijsolv.c baijsolvnat.c baijsolvtran.c baijsolvtrannat.c 790 baijfact2.c 1500 baijsolv.c 1773 baijsolvnat.c 1383 baijsolvtran.c 870 baijsolvtrannat.c 6316 total $ wc -l *.c | sort -nr 20612 total 3640 baij.c 2140 baij2.c 1889 baijfact11.c 1787 baijfact.c 1773 baijsolvnat.c 1500 baijsolv.c 1383 baijsolvtran.c 870 baijsolvtrannat.c 790 baijfact2.c 787 baijfact5.c 677 baijfact7.c 608 baijfact9.c 498 baijfact3.c 462 baijfact13.c 441 dgefa2.c 394 dgefa4.c 155 dgefa7.c 151 dgefa6.c 149 dgefa5.c 146 dgefa3.c 104 aijbaij.c 93 baijfact4.c 88 dgefa.c 87 dgedi.c Satish On Thu, 30 May 2013, Barry Smith wrote: > > On May 30, 2013, at 1:43 PM, Satish Balay wrote: > > > > > I see the following routines. I can split them as follows. [Or should > > I be spliting them up differently?] 
> > That is a fine way to start > > > > > thanks, > > Satish > > > > **************** baijfact2.c ******************* > > #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ" > > #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ_ilu0" > > #define __FUNCT__ "MatILUFactorSymbolic_SeqBAIJ_inplace" > > #define __FUNCT__ "MatLUFactorNumeric_SeqBAIJ_15_NaturalOrdering" > > #define __FUNCT__ "MatLUFactorNumeric_SeqBAIJ_N" > > #define __FUNCT__ "MatSetUnfactored_SeqBAIJ_4_NaturalOrdering_SSE" > > #define __FUNCT__ "MatSetUnfactored_SeqBAIJ_4_NaturalOrdering_SSE_usj" > > > > **************** baijsolv.c ******************* > > > > #define __FUNCT__ "MatSolve_SeqBAIJ_1" > > #define __FUNCT__ "MatSolve_SeqBAIJ_1_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_2" > > #define __FUNCT__ "MatSolve_SeqBAIJ_2_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_3" > > #define __FUNCT__ "MatSolve_SeqBAIJ_3_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4_Demotion" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4_SSE_Demotion" > > #define __FUNCT__ "MatSolve_SeqBAIJ_5" > > #define __FUNCT__ "MatSolve_SeqBAIJ_5_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_6" > > #define __FUNCT__ "MatSolve_SeqBAIJ_6_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_7" > > #define __FUNCT__ "MatSolve_SeqBAIJ_7_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_N_inplace" > > > > **************** baijsolvnat.c ******************* > > #define __FUNCT__ "MatSolve_SeqBAIJ_15_NaturalOrdering_ver1" > > #define __FUNCT__ "MatSolve_SeqBAIJ_15_NaturalOrdering_ver2" > > #define __FUNCT__ "MatSolve_SeqBAIJ_1_NaturalOrdering" > > #define __FUNCT__ "MatSolve_SeqBAIJ_1_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_2_NaturalOrdering" > > #define __FUNCT__ "MatSolve_SeqBAIJ_2_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_3_NaturalOrdering" > > #define __FUNCT__ "MatSolve_SeqBAIJ_3_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_Demotion" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_SSE_Demotion" > > #define __FUNCT__ "MatSolve_SeqBAIJ_4_NaturalOrdering_SSE_Demotion_usj" > > #define __FUNCT__ "MatSolve_SeqBAIJ_5_NaturalOrdering" > > #define __FUNCT__ "MatSolve_SeqBAIJ_5_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_6_NaturalOrdering" > > #define __FUNCT__ "MatSolve_SeqBAIJ_6_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolve_SeqBAIJ_7_NaturalOrdering" > > #define __FUNCT__ "MatSolve_SeqBAIJ_7_NaturalOrdering_inplace" > > > > **************** baijsolvtran.c ******************* > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7" > > #define __FUNCT__ 
"MatSolveTranspose_SeqBAIJ_7_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_N" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_N_inplace" > > > > **************** baijsolvtrannat.c ******************* > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_NaturalOrdering" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_1_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_NaturalOrdering" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_2_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_NaturalOrdering" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_3_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_NaturalOrdering" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_4_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_NaturalOrdering" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_5_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_NaturalOrdering" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_6_NaturalOrdering_inplace" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7_NaturalOrdering" > > #define __FUNCT__ "MatSolveTranspose_SeqBAIJ_7_NaturalOrdering_inplace" > > > > > > > > On Wed, 29 May 2013, Barry Smith wrote: > > > >> > >> Satish, > >> > >> There appear to be about 70 functions in 6000 lines of code in that file. Please branch off of maint and split it into at least 7 files then test and eventually get it back into maint and master. > >> > >> Thanks > >> > >> Barry > >> > >> On May 29, 2013, at 3:38 PM, Jed Brown wrote: > >> > >>> Satish Balay writes: > >>> > >>>> This piece of code does stress compilers in complex/optimized > >>>> mode. When I tried previously - I couldn't reproduce the problem [with > >>>> VS2012] > >>> > >>> We should shorten the file. > >> > >> > > > > From ucemckl at ucl.ac.uk Thu May 30 21:41:34 2013 From: ucemckl at ucl.ac.uk (Christian Klettner) Date: Fri, 31 May 2013 03:41:34 +0100 Subject: [petsc-users] Compile flags for a Blue Gene\Q In-Reply-To: <87mwrdwklg.fsf@mcs.anl.gov> References: <87mwrdwklg.fsf@mcs.anl.gov> Message-ID: Thanks Jed, for this information. We'll try it out. Best regards, Christian > Christian Klettner writes: > >> Dear PETSc group, >> We have been given access to a Blue Gene\Q system to run code we >> normally >> run on a traditional cluster architecture. We have ported the code >> successfully and run some jobs using 16 cores per node however the >> performance is roughly four times slower to that of a Xeon processor. We >> expect less performance (due to the slower chips) however this seems a >> bit >> excessive. One problem we think is that we are not using all four >> hardware >> threads per core. > > The cache architecture is different, so the factor of 4x is not > unreasonable. Sparse linear algebra is bandwidth limited and you don't > need to use threads for bandwidth. > > http://www.alcf.anl.gov/sites/www.alcf.anl.gov/files/miracon_AppPerform_BobWalkup_0.pdf > > You can configure petsc-dev to use pthreads and/or OpenMP (it was > disabled in the release), but that code is experimental and still in > flux. You can also experiment with this fork to compare the performance > of an all-OpenMP approach to threading. > > https://bitbucket.org/ggorman/petsc-3.3-omp > >> To achieve this do we need to use a threaded version of PETSc? Could >> someone suggest the additional arguments required to make use of the >> pthreads when launching an MPI job? 
We are using the Blue Gene MPI
>> libraries but are currently unable to use the ESSL blas libraries. We
>> found the example compile flags for a Blue Gene\P in the PETSc
>> package but were wondering if anyone had compile flags which they
>> would recommend for a Blue Gene\Q? Best regards, Christian
>
> You could start with something like this (fixing paths as appropriate;
> choose your optimization flags).
>
> ./configure
> --with-blas-lapack-lib=/soft/libraries/essl/5.1.1-0.beta/lib64/libesslbg.a
> --with-mpi-dir=/bgsys/drivers/ppcfloor/comm/xl/
> --with-mpi-dir=/bgsys/drivers/ppcfloor/comm/xl.legacy.ndebug
> --with-debugging=0 COPTFLAGS="-O3 -qnohot -qsimd=noauto -qsmp=omp:noauto"
> FOPTFLAGS="-O3 -qnohot -qsimd=noauto -qsmp=omp:noauto" PETSC_ARCH=xl-opt
>

From ztdepyahoo at 163.com Fri May 31 10:43:11 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Fri, 31 May 2013 23:43:11 +0800 (CST) Subject: [petsc-users] confusion about MatSetValues(A, ,ADD_VALUES); Message-ID: <21c91be4.130d5.13efb41c4dc.Coremail.ztdepyahoo@163.com>

I write a simple C procedure to test MatSetValues. The main body of the procedure is like this:

int row=1;
int col=10;
double v=1.0;

MatSetValues(A,1,&row,1,&col,&v,INSERT_VALUES);
MatSetValues(A,1,&row,1,&row,&v,ADD_VALUES);
MatSetValues(A,1,&col,1,&col,&v,ADD_VALUES);
MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);

I run with 4 CPUs; when I view the matrix, the diagonal value is wrong: it is 1, not 4.0. But if I delete the line "MatSetValues(A,1,&row,1,&col,&v,INSERT_VALUES)", it gives me the correct answer. Could you please tell me the reason? Regards. Thank you very much.
-------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 31 11:30:45 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 31 May 2013 11:30:45 -0500 Subject: [petsc-users] confusion about MatSetValues(A, ,ADD_VALUES); In-Reply-To: <21c91be4.130d5.13efb41c4dc.Coremail.ztdepyahoo@163.com> References: <21c91be4.130d5.13efb41c4dc.Coremail.ztdepyahoo@163.com> Message-ID: <87li6vqfgq.fsf@mcs.anl.gov>

丁老师 writes:

> I write a simple C procedure to test MatSetValues. The main body of the procedure is like this:
>
> int row=1;
> int col=10;
> double v=1.0;
>
> MatSetValues(A,1,&row,1,&col,&v,INSERT_VALUES);
> MatSetValues(A,1,&row,1,&row,&v,ADD_VALUES);
> MatSetValues(A,1,&col,1,&col,&v,ADD_VALUES);

You can't mix INSERT_VALUES and ADD_VALUES. Please use debugging PETSc
for development. It warns about this.

From mohsin139 at gmail.com Fri May 31 11:43:30 2013 From: mohsin139 at gmail.com (Mohsin Iqbal) Date: Fri, 31 May 2013 18:43:30 +0200 Subject: [petsc-users] Problem in header file petscerror.h Message-ID:

Hello,

I am compiling an application that is using petsc. *petsc-3.0.0-p12* is installed in /opt/petsc. During the compilation I am getting this stream of errors.

In file included from /opt/petsc/include/petsc.h:1353,
                 from /opt/petsc/include/petscis.h:7,
                 from /opt/petsc/include/petscvec.h:9,
                 from /opt/petsc/include/petscmat.h:6,
                 from /opt/petsc/include/petscpc.h:6,
                 from /opt/petsc/include/petscksp.h:6,
                 from /root/resmgmt/applications/Malleability/Quadflow/quadflow/Library/flow_solver/Source/Include/petsc_c.h:17,
                 from Main/misc_dir/alloc_mem.c:4:
/opt/petsc/include/petscerror.h:320: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘PetscTruth’
/opt/petsc/include/petscerror.h:355: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘PetscTruth’
/opt/petsc/include/petscerror.h:398: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘PetscErrorCode’
In file included from /opt/petsc/include/petsc.h:1435,
                 from /opt/petsc/include/petscis.h:7,
                 from /opt/petsc/include/petscvec.h:9,
                 from /opt/petsc/include/petscmat.h:6,
                 from /opt/petsc/include/petscpc.h:6,
                 from /opt/petsc/include/petscksp.h:6,
                 from /root/resmgmt/applications/Malleability/Quadflow/quadflow/Library/flow_solver/Source/Include/petsc_c.h:17,
                 from Main/misc_dir/alloc_mem.c:4:
/opt/petsc/include/petsclog.h:309: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘PetscErrorCode’
In file included from /opt/petsc/include/petscvec.h:9,
                 from /opt/petsc/include/petscmat.h:6,
                 from /opt/petsc/include/petscpc.h:6,
                 from /opt/petsc/include/petscksp.h:6,
                 from /root/resmgmt/applications/Malleability/Quadflow/quadflow/Library/flow_solver/Source/Include/petsc_c.h:17,
                 from Main/misc_dir/alloc_mem.c:4:
/opt/petsc/include/petscis.h:127: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘PetscErrorCode’

Can someone give any clue to rectify this problem.

Thanks.
-------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Fri May 31 11:49:02 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 31 May 2013 11:49:02 -0500 (CDT) Subject: [petsc-users] Problem in header file petscerror.h In-Reply-To: References: Message-ID:

On Fri, 31 May 2013, Mohsin Iqbal wrote:

> Hello,
>
> I am compiling an application that is using petsc. *petsc-3.0.0-p12* is

This version is very old. We suggest using the latest petsc release [currently 3.4] - and upgrading your code to work with it.

> installed in /opt/petsc. During the compilation I am getting this stream of
> errors.

Can you verify that a PETSc example [from this version of PETSc] compiles and runs fine with this install of PETSc?

> In file included from /opt/petsc/include/petsc.h:1353,
> from /opt/petsc/include/petscis.h:7,
> from /opt/petsc/include/petscvec.h:9,
> from /opt/petsc/include/petscmat.h:6,
> from /opt/petsc/include/petscpc.h:6,
> from /opt/petsc/include/petscksp.h:6,
> from /root/resmgmt/applications/Malleability/Quadflow/quadflow/Library/flow_solver/Source/Include/petsc_c.h:17,
> from Main/misc_dir/alloc_mem.c:4:
> /opt/petsc/include/petscerror.h:320: error: expected ‘=’, ‘,’, ‘;’, ‘asm’
> or ‘__attribute__’ before ‘PetscTruth’

The line of code here is:

>>>> PETSC_STATIC_INLINE PetscTruth PetscExceptionCaught(PetscErrorCode xierr,PetscErrorCode zierr) <<<

So perhaps the compiler is not able to resolve the macro 'PETSC_STATIC_INLINE'.

Are you using PETSc makefiles for your application? If not - compare your compile command with the one you get when you build a PETSc example using a PETSc makefile [say src/ksp/ksp/examples/tutorials/ex2.c]

Satish

> /opt/petsc/include/petscerror.h:355: error: expected ‘=’, ‘,’, ‘;’, ‘asm’
> or ‘__attribute__’ before ‘PetscTruth’
> /opt/petsc/include/petscerror.h:398: error: expected ‘=’, ‘,’, ‘;’, ‘asm’
> or ‘__attribute__’ before ‘PetscErrorCode’
> In file included from /opt/petsc/include/petsc.h:1435,
> from /opt/petsc/include/petscis.h:7,
> from /opt/petsc/include/petscvec.h:9,
> from /opt/petsc/include/petscmat.h:6,
> from /opt/petsc/include/petscpc.h:6,
> from /opt/petsc/include/petscksp.h:6,
> from /root/resmgmt/applications/Malleability/Quadflow/quadflow/Library/flow_solver/Source/Include/petsc_c.h:17,
> from Main/misc_dir/alloc_mem.c:4:
> /opt/petsc/include/petsclog.h:309: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or
> ‘__attribute__’ before ‘PetscErrorCode’
> In file included from /opt/petsc/include/petscvec.h:9,
> from /opt/petsc/include/petscmat.h:6,
> from /opt/petsc/include/petscpc.h:6,
> from /opt/petsc/include/petscksp.h:6,
> from /root/resmgmt/applications/Malleability/Quadflow/quadflow/Library/flow_solver/Source/Include/petsc_c.h:17,
> from Main/misc_dir/alloc_mem.c:4:
> /opt/petsc/include/petscis.h:127: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or
> ‘__attribute__’ before ‘PetscErrorCode’
>
> Can someone give any clue to rectify this problem.
>
> Thanks.
>

From jedbrown at mcs.anl.gov Fri May 31 11:51:25 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 31 May 2013 11:51:25 -0500 Subject: [petsc-users] Problem in header file petscerror.h In-Reply-To: References: Message-ID: <87fvx3qeia.fsf@mcs.anl.gov>

Mohsin Iqbal writes:

> Hello,
>
> I am compiling an application that is using petsc. *petsc-3.0.0-p12* is
> installed in /opt/petsc. During the compilation I am getting this stream of
> errors.
>
> In file included from /opt/petsc/include/petsc.h:1353,
> from /opt/petsc/include/petscis.h:7,
> from /opt/petsc/include/petscvec.h:9,
> from /opt/petsc/include/petscmat.h:6,
> from /opt/petsc/include/petscpc.h:6,
> from /opt/petsc/include/petscksp.h:6,
> from /root/resmgmt/applications/Malleability/Quadflow/quadflow/Library/flow_solver/Source/Include/petsc_c.h:17,
> from Main/misc_dir/alloc_mem.c:4:
> /opt/petsc/include/petscerror.h:320: error: expected ‘=’, ‘,’, ‘;’, ‘asm’
> or ‘__attribute__’ before ‘PetscTruth’
> /opt/petsc/include/petscerror.h:355: error: expected ‘=’, ‘,’, ‘;’, ‘asm’
> or ‘__attribute__’ before ‘PetscTruth’
> /opt/petsc/include/petscerror.h:398: error: expected ‘=’, ‘,’, ‘;’, ‘asm’
> or ‘__attribute__’ before ‘PetscErrorCode’

That line is PETSC_STATIC_INLINE, which should have been defined in petscconf.h. I'm guessing you have a mixed-up environment. You can try using the -M compiler option to find out which petscconf.h is being included.

And please upgrade to petsc-3.4.

From balay at mcs.anl.gov Fri May 31 11:59:01 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 31 May 2013 11:59:01 -0500 (CDT) Subject: [petsc-users] confusion about MatSetValues(A, ,ADD_VALUES); In-Reply-To: <87li6vqfgq.fsf@mcs.anl.gov> References: <21c91be4.130d5.13efb41c4dc.Coremail.ztdepyahoo@163.com> <87li6vqfgq.fsf@mcs.anl.gov> Message-ID:

On Fri, 31 May 2013, Jed Brown wrote:

> 丁老师 writes:
>
> > I write a simple C procedure to test MatSetValues. The main body of the procedure is like this:
> >
> > int row=1;
> > int col=10;
> > double v=1.0;
> >
> > MatSetValues(A,1,&row,1,&col,&v,INSERT_VALUES);
> > MatSetValues(A,1,&row,1,&row,&v,ADD_VALUES);
> > MatSetValues(A,1,&col,1,&col,&v,ADD_VALUES);
>
> You can't mix INSERT_VALUES and ADD_VALUES. Please use debugging PETSc
> for development. It warns about this.

i.e. use:

MatSetValues(A,1,&row,1,&col,&v,INSERT_VALUES);
MatAssemblyBegin(A,MAT_FLUSH_ASSEMBLY);
MatAssemblyEnd(A,MAT_FLUSH_ASSEMBLY);
MatSetValues(A,1,&row,1,&row,&v,ADD_VALUES);
MatSetValues(A,1,&col,1,&col,&v,ADD_VALUES);
MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);

Satish

From gokhalen at gmail.com Fri May 31 14:47:28 2013 From: gokhalen at gmail.com (Nachiket Gokhale) Date: Fri, 31 May 2013 15:47:28 -0400 Subject: [petsc-users] Error in Lapack xTREXC 1 Message-ID:

I am trying various options to solve a quadratic eigenvalue problem using SlepC.
I get the following error message Calling QEPSolve [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Error in external library! [0]PETSC ERROR: Error in Lapack xTREXC 1! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Development GIT revision: ff331c249a8bbf694711b310f25fec0e839b33db GIT Date: 2013-05-19 22:14:01 -0500 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: /home/gokhale/hyser.clone/src/forward on a linux-gcc-gpp-mumps named asd1.wai.com by gokhale Fri May 31 15:09:49 2013 [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-git/linux-gcc-gpp-mumps/lib [0]PETSC ERROR: Configure run at Mon May 20 17:02:54 2013 [0]PETSC ERROR: Configure options --download-blacs=1 --download-f-blas-lapack=1 --download-metis --download-mpich=yes --download-mumps=yes --download-parmetis=1 --download-scalapack=1 --with-cc=/usr/local/bin/gcc --with-clanguage=C++ --with-cmake=/usr/bin/cmake28 --with-cxx=/usr/local/bin/g++ --with-debugging=0 --with-fc=/usr/local/bin/gfortran --with-mpi=1 --with-shared-libraries=1 --with-x11=0 --with-x=0 PETSC_ARCH=linux-gcc-gpp-mumps [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: DSSort_NHEP_Total() line 459 in /opt/SlepC/slepc-git/src/ds/impls/nhep/dsnhep.c [0]PETSC ERROR: DSSort_NHEP() line 493 in /opt/SlepC/slepc-git/src/ds/impls/nhep/dsnhep.c [0]PETSC ERROR: DSSort() line 510 in /opt/SlepC/slepc-git/src/ds/interface/dsops.c [0]PETSC ERROR: QEPSolve_QArnoldi() line 259 in /opt/SlepC/slepc-git/src/qep/impls/qarnoldi/qarnoldi.c [0]PETSC ERROR: QEPSolve() line 107 in /opt/SlepC/slepc-git/src/qep/interface/qepsolve.c [0]PETSC ERROR: solvecomplex() line 161 in "unknowndirectory/"/home/gokhale/hyser.clone/lib/eqnlib/slepc_quad_system.cpp I am not sure what it means; Google/mailing list search didn't seem to help. My command line is ~/hyser.clone/src/forward -qep_nev 10 -qep_ncv 32 -qep_monitor_conv -qep_max_it 10000 -qep_smallest_imaginary -qep_type qarnoldi -qep_general -qep_linear_explicitmatrix -qep_eps_type krylovschur -qep_eps_shift -0.1 If it helps, the matrices are small - 60x60. -Nachiket -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Fri May 31 14:59:42 2013 From: jroman at dsic.upv.es (Jose E. Roman) Date: Fri, 31 May 2013 21:59:42 +0200 Subject: [petsc-users] Error in Lapack xTREXC 1 In-Reply-To: References: Message-ID: <35AF611B-B4D7-4E50-84C0-FDFA84FEE4BA@dsic.upv.es> El 31/05/2013, a las 21:47, Nachiket Gokhale escribi?: > > > I am trying various options to solve a quadratic eigenvalue problem using SlepC. I get the following error message > > Calling QEPSolve > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Error in external library! > [0]PETSC ERROR: Error in Lapack xTREXC 1! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Development GIT revision: ff331c249a8bbf694711b310f25fec0e839b33db GIT Date: 2013-05-19 22:14:01 -0500 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. 
> [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: /home/gokhale/hyser.clone/src/forward on a linux-gcc-gpp-mumps named asd1.wai.com by gokhale Fri May 31 15:09:49 2013 > [0]PETSC ERROR: Libraries linked from /opt/petsc/petsc-git/linux-gcc-gpp-mumps/lib > [0]PETSC ERROR: Configure run at Mon May 20 17:02:54 2013 > [0]PETSC ERROR: Configure options --download-blacs=1 --download-f-blas-lapack=1 --download-metis --download-mpich=yes --download-mumps=yes --download-parmetis=1 --download-scalapack=1 --with-cc=/usr/local/bin/gcc --with-clanguage=C++ --with-cmake=/usr/bin/cmake28 --with-cxx=/usr/local/bin/g++ --with-debugging=0 --with-fc=/usr/local/bin/gfortran --with-mpi=1 --with-shared-libraries=1 --with-x11=0 --with-x=0 PETSC_ARCH=linux-gcc-gpp-mumps > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: DSSort_NHEP_Total() line 459 in /opt/SlepC/slepc-git/src/ds/impls/nhep/dsnhep.c > [0]PETSC ERROR: DSSort_NHEP() line 493 in /opt/SlepC/slepc-git/src/ds/impls/nhep/dsnhep.c > [0]PETSC ERROR: DSSort() line 510 in /opt/SlepC/slepc-git/src/ds/interface/dsops.c > [0]PETSC ERROR: QEPSolve_QArnoldi() line 259 in /opt/SlepC/slepc-git/src/qep/impls/qarnoldi/qarnoldi.c > [0]PETSC ERROR: QEPSolve() line 107 in /opt/SlepC/slepc-git/src/qep/interface/qepsolve.c > [0]PETSC ERROR: solvecomplex() line 161 in "unknowndirectory/"/home/gokhale/hyser.clone/lib/eqnlib/slepc_quad_system.cpp > > > I am not sure what it means; Google/mailing list search didn't seem to help. My command line is > > ~/hyser.clone/src/forward -qep_nev 10 -qep_ncv 32 -qep_monitor_conv -qep_max_it 10000 -qep_smallest_imaginary -qep_type qarnoldi -qep_general -qep_linear_explicitmatrix -qep_eps_type krylovschur -qep_eps_shift -0.1 > > If it helps, the matrices are small - 60x60. > > -Nachiket > > This is quite strange. Can you try with --download-f2cblaslapack instead of --download-f-blas-lapack? If it does not help, send the matrices to slepc-maint. By the way, the -qep_eps_shift option does not exist. Jose From mrosso at uci.edu Fri May 31 16:48:10 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 31 May 2013 14:48:10 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <8761y1w1si.fsf@mcs.anl.gov> References: <519687DD.4050209@uci.edu> <8738tlnrpe.fsf@mcs.anl.gov> <5196B806.5020605@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> <8761y1w1si.fsf@mcs.anl.gov> Message-ID: <51A91A9A.4050102@uci.edu> Hi, I confirm that -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 produces the correct shift in PETSc 3.4. So my problem is solved. I will upgrade to 3.4 in my productive machine installation. It is a Cray machine (Blue Waters). Assuming I want to use the Cray compiler, which options should I use for ./configure for the installation. Thank you On 05/29/2013 09:05 PM, Jed Brown wrote: > Michele Rosso writes: > >> Hi, >> >> I proceeded as Matt suggested. 
I am running without nullspace >> (Dirichlet's BCs) with the following options: >> >> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 >> -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view -options_left >> >> The results are fine and the run finishes. The output of -ksp_view does >> not say anything about the coarse solver being shifted. > With petsc-3.4, it has these lines: > > tolerance for zero pivot 2.22045e-14 > using diagonal shift on blocks to prevent zero pivot > > You can try a proposed fix in the branch 'jed/fix-gamg-coarse'. > > > > Hong, I think this diagnostic output is unintuitive for a user that > doesn't know this relation. Shall we add the enum names to the output > lines below? > > if (factor->info.shifttype==(PetscReal)MAT_SHIFT_POSITIVE_DEFINITE) { > ierr = PetscViewerASCIIPrintf(viewer," using Manteuffel shift\n");CHKERRQ(ierr); > } > if (factor->info.shifttype==(PetscReal)MAT_SHIFT_NONZERO) { > ierr = PetscViewerASCIIPrintf(viewer," using diagonal shift to prevent zero pivot\n");CHKERRQ(ierr); > } > if (factor->info.shifttype==(PetscReal)MAT_SHIFT_INBLOCKS) { > ierr = PetscViewerASCIIPrintf(viewer," using diagonal shift on blocks to prevent zero pivot\n");CHKERRQ(ierr); > } > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 31 16:57:07 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 31 May 2013 16:57:07 -0500 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <51A91A9A.4050102@uci.edu> References: <519687DD.4050209@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> <8761y1w1si.fsf@mcs.anl.gov> <51A91A9A.4050102@uci.edu> Message-ID: <8761xyrex8.fsf@mcs.anl.gov> Michele Rosso writes: > Hi, > > I confirm that > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1 > > produces the correct shift in PETSc 3.4. > So my problem is solved. I will upgrade to 3.4 in my productive > machine installation. > It is a Cray machine (Blue Waters). Assuming I want to use the Cray > compiler, which options should I use for ./configure > for the installation. Or the Intel compiler or GCC. PETSc should run at similar speed with any. IIRC, Cray recommends the Intel compiler on their machines with Intel CPUs. There are examples in config/examples/. 
From mrosso at uci.edu Fri May 31 17:44:35 2013 From: mrosso at uci.edu (Michele Rosso) Date: Fri, 31 May 2013 15:44:35 -0700 Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <8761xyrex8.fsf@mcs.anl.gov> References: <519687DD.4050209@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> <8761y1w1si.fsf@mcs.anl.gov> <51A91A9A.4050102@uci.edu> <8761xyrex8.fsf@mcs.anl.gov> Message-ID: <51A927D3.10500@uci.edu>

Thanks, but I'd rather use Cray since only a small part of my code relies on PETSc and, as suggested by the BW staff, Cray-compiled code generally performs better on Cray systems.

Michele

On 05/31/2013 02:57 PM, Jed Brown wrote:
> Michele Rosso writes:
>
>> Hi,
>>
>> I confirm that
>>
>> -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
>>
>> produces the correct shift in PETSc 3.4.
>> So my problem is solved. I will upgrade to 3.4 in my productive
>> machine installation.
>> It is a Cray machine (Blue Waters). Assuming I want to use the Cray
>> compiler, which options should I use for ./configure
>> for the installation.
> Or the Intel compiler or GCC. PETSc should run at similar speed with
> any. IIRC, Cray recommends the Intel compiler on their machines with
> Intel CPUs. There are examples in config/examples/.
>
-------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Fri May 31 19:08:44 2013 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 31 May 2013 19:08:44 -0500 (CDT) Subject: [petsc-users] Solving Poisson equation with multigrid In-Reply-To: <51A927D3.10500@uci.edu> References: <519687DD.4050209@uci.edu> <87ppwpmbne.fsf@mcs.anl.gov> <5196C171.7060400@uci.edu> <5196C5BE.7060601@uci.edu> <87mwrtm9l0.fsf@mcs.anl.gov> <5196CA3D.3070001@uci.edu> <87k3mxm8is.fsf@mcs.anl.gov> <5196CF3A.3030000@uci.edu> <87ehd5m4gn.fsf@mcs.anl.gov> <5196E8DC.1010602@uci.edu> <519FC4B0.9080702@uci.edu> <8738tcp2ya.fsf@mcs.anl.gov> <519FDD10.3060900@uci.edu> <87r4gwnj1a.fsf@mcs.anl.gov> <519FE730.9000309@uci.edu> <51A67267.4010507@uci.edu> <8761y1w1si.fsf@mcs.anl.gov> <51A91A9A.4050102@uci.edu> <8761xyrex8.fsf@mcs.anl.gov> <51A927D3.10500@uci.edu> Message-ID:

The following is my configure command to build PETSc on a Cray with the Cray compilers:

>>>>>>>
$ cat reconfigure-arch-test-cray.py
#!/usr/bin/python
if __name__ == '__main__':
  import sys
  import os
  sys.path.insert(0, os.path.abspath('config'))
  import configure
  configure_options = [
    '--with-cc=cc',
    '--with-clanguage=C++',
    '--with-clib-autodetect=0',
    '--with-cxx=CC',
    '--with-cxxlib-autodetect=0',
    '--with-fc=ftn',
    '--with-fortranlib-autodetect=0',
    '--with-x=0',
    'FFLAGS=-F -em',
    'LIBS=-L/opt/cray/cce/8.1.4/CC/x86-64/lib/x86-64/ -lcray-c++-rts -lcraystdc++ -lsupc++ -lgcc_eh',
    'PETSC_ARCH=arch-test-cray',
  ]
  configure.petsc_configure(configure_options)
<<<<<<<<<

And you might have to look for the recommended optimization flags - and use them with: --with-debugging=0 COPTFLAGS= FOPTFLAGS= CXXOPTFLAGS= etc.
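For example, appended to the configure_options list above (illustrative only - the actual optimization level should come from the recommended set for the machine; '-O3' here is just a placeholder):

    '--with-debugging=0',
    'COPTFLAGS=-O3',
    'CXXOPTFLAGS=-O3',
    'FOPTFLAGS=-O3',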
And on BlueWaters - you would have to remove the following lines from PETSC_ARCH/include/petscconf.h [before running 'make all' to build the libraries]

>>>>>>
#ifndef PETSC_HAVE_GETPWUID
#define PETSC_HAVE_GETPWUID 1
#endif
<<<<<

Satish

On Fri, 31 May 2013, Michele Rosso wrote:

> Thanks, but I'd rather use Cray since only a small part of my code relies on
> PETSc and, as suggested by the BW staff, Cray-compiled code generally performs
> better on Cray systems.
>
> Michele
>
> On 05/31/2013 02:57 PM, Jed Brown wrote:
> > Michele Rosso writes:
> >
> > > Hi,
> > >
> > > I confirm that
> > >
> > > -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
> > >
> > > produces the correct shift in PETSc 3.4.
> > > So my problem is solved. I will upgrade to 3.4 in my productive
> > > machine installation.
> > > It is a Cray machine (Blue Waters). Assuming I want to use the Cray
> > > compiler, which options should I use for ./configure
> > > for the installation.
> > Or the Intel compiler or GCC. PETSc should run at similar speed with
> > any. IIRC, Cray recommends the Intel compiler on their machines with
> > Intel CPUs. There are examples in config/examples/.
> >
>

From ztdepyahoo at 163.com Fri May 31 20:25:19 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Sat, 1 Jun 2013 09:25:19 +0800 (CST) Subject: [petsc-users] confusion about MatSetValues(A, ,ADD_VALUES); In-Reply-To: <87li6vqfgq.fsf@mcs.anl.gov> References: <21c91be4.130d5.13efb41c4dc.Coremail.ztdepyahoo@163.com> <87li6vqfgq.fsf@mcs.anl.gov> Message-ID: <21378e15.184a.13efd56b8a9.Coremail.ztdepyahoo@163.com>

In the user manual, it talks about MAT_FLUSH_ASSEMBLY and MAT_FINAL_ASSEMBLY. How do I use MAT_FLUSH_ASSEMBLY if I want to mix adding values and inserting values?

At 2013-06-01 00:30:45, "Jed Brown" wrote:
>丁老师 writes:
>
>> I write a simple C procedure to test MatSetValues. The main body of the procedure is like this:
>>
>> int row=1;
>> int col=10;
>> double v=1.0;
>>
>> MatSetValues(A,1,&row,1,&col,&v,INSERT_VALUES);
>> MatSetValues(A,1,&row,1,&row,&v,ADD_VALUES);
>> MatSetValues(A,1,&col,1,&col,&v,ADD_VALUES);
>
>You can't mix INSERT_VALUES and ADD_VALUES. Please use debugging PETSc
>for development. It warns about this.
-------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 31 20:31:31 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 31 May 2013 20:31:31 -0500 Subject: [petsc-users] confusion about MatSetValues(A, ,ADD_VALUES); In-Reply-To: <21378e15.184a.13efd56b8a9.Coremail.ztdepyahoo@163.com> References: <21c91be4.130d5.13efb41c4dc.Coremail.ztdepyahoo@163.com> <87li6vqfgq.fsf@mcs.anl.gov> <21378e15.184a.13efd56b8a9.Coremail.ztdepyahoo@163.com> Message-ID: <87ehcmpqfg.fsf@mcs.anl.gov>

丁老师 writes:

> In the user manual, it talks about MAT_FLUSH_ASSEMBLY and MAT_FINAL_ASSEMBLY.
> How do I use MAT_FLUSH_ASSEMBLY if I want to mix adding values and inserting values?

Was something unclear in Satish's mail from 8 hours ago?

http://lists.mcs.anl.gov/pipermail/petsc-users/2013-May/017617.html

From ztdepyahoo at 163.com Fri May 31 23:06:33 2013 From: ztdepyahoo at 163.com (=?GBK?B?tqHAz8qm?=) Date: Sat, 1 Jun 2013 12:06:33 +0800 (CST) Subject: [petsc-users] PCILU does not work Message-ID: <423aecf2.3926.13efdea5743.Coremail.ztdepyahoo@163.com>

My problem can be solved successfully with the default KSP setting, but if I change the PC with PCSetType(pc,PCILU), it gives me the following error.

Error mesage

no supoort for the operation ofr this object type!
matrix format mpiaij does not shave a build in PETSc ILU!

Regards.
-------------- next part -------------- An HTML attachment was scrubbed... URL: From jedbrown at mcs.anl.gov Fri May 31 23:13:16 2013 From: jedbrown at mcs.anl.gov (Jed Brown) Date: Fri, 31 May 2013 23:13:16 -0500 Subject: [petsc-users] PCILU does not work In-Reply-To: <423aecf2.3926.13efdea5743.Coremail.ztdepyahoo@163.com> References: <423aecf2.3926.13efdea5743.Coremail.ztdepyahoo@163.com> Message-ID: <877giepixv.fsf@mcs.anl.gov>

丁老师 writes:

> My problem can be solved successfully with the default KSP setting, but if I change the PC with
> PCSetType(pc,PCILU), it gives me the following error.
>
>
> Error mesage
>
>
> no supoort for the operation ofr this object type!
> matrix format mpiaij does not shave a build in PETSc ILU!

It perplexes me why people insist on re-typing error messages, incompletely, introducing spelling errors in the process, instead of pasting the whole thing.

Anyway, it's true that PETSc does not have native parallel ILU. ILU is used as a local solver in the default domain decomposition method. You can also try a parallel ILU from another package using run-time options.

http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html

Note that ILU is not a great parallel algorithm.
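For illustration, these standard run-time options select the default parallel behavior explicitly and one external alternative (the second line assumes PETSc was configured with --download-hypre):

  # default in parallel: block Jacobi with ILU(0) on each local block
  mpiexec -n 4 ./app -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu

  # a true parallel ILU, hypre's Euclid
  mpiexec -n 4 ./app -ksp_type gmres -pc_type hypre -pc_hypre_type euclid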