[petsc-users] TS example tutorial example 4 adapted - not parallelizing
Matthew Knepley
knepley at gmail.com
Thu Oct 16 07:53:59 CDT 2014
On Thu, Oct 16, 2014 at 4:08 AM, Analabha Roy <hariseldon99 at gmail.com>
wrote:
> Hi all,
>
> Update: Turns out I set my initial condition wrong by incorrectly using
> local indices while setting the values, and running it on multiple
> processors caused overstriding, setting all vector elements to zero.
>
I am glad it's working. One way to find out-of-bounds accesses is to use
valgrind, which works great.
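For example (assuming an MPICH-style mpiexec; the executable name and
options here are just placeholders):

  mpiexec -n 2 valgrind --tool=memcheck -q ./ex4 -m 100

Each rank then reports any invalid reads or writes with a stack trace
pointing at the offending line.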
Thanks,
Matt
> I have fixed it. Basically, line 407 changes from the local index
> (i - mybase) to the global index (just i) in the comparison:
>
> for (i = mybase; i < myend; i++)
>   if (i == floor ((appctx->m) / 2))
>     u_localptr[i - mybase] = one;
>   else
>     u_localptr[i - mybase] = zero;
>
> A trivially silly thing, I'm afraid.
>
> Thanks for pointing out my other unrelated error. I think I will compute
> the variance using MatMult() instead so as to keep everything collective.
>
> Basically, set up diagonal matrices with X_{ii} = i and Xsq_{ii} = i^2 and
> compute the expectation values using MatMult() and VecDot(), along the
> lines of the sketch below.
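>
> A minimal sketch of what I have in mind (the helper name ComputeVariance
> and the work vector w are my own, not from the code below):
>
> #include <petscmat.h>
>
> PetscErrorCode
> ComputeVariance (Vec u, PetscInt m, PetscReal *var)
> {
>   Mat X, Xsq;
>   Vec w;
>   PetscScalar xbar, x2bar;
>   PetscInt i, rstart, rend;
>   PetscErrorCode ierr;
>
>   /* Diagonal operators X_{ii} = i and Xsq_{ii} = i^2 */
>   ierr = MatCreateAIJ (PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>                        m, m, 1, NULL, 0, NULL, &X);
>   CHKERRQ (ierr);
>   ierr = MatCreateAIJ (PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
>                        m, m, 1, NULL, 0, NULL, &Xsq);
>   CHKERRQ (ierr);
>   ierr = MatGetOwnershipRange (X, &rstart, &rend);
>   CHKERRQ (ierr);
>   for (i = rstart; i < rend; i++)
>     {
>       ierr = MatSetValue (X, i, i, (PetscScalar) i, INSERT_VALUES);
>       CHKERRQ (ierr);
>       ierr = MatSetValue (Xsq, i, i, (PetscScalar) i * i, INSERT_VALUES);
>       CHKERRQ (ierr);
>     }
>   ierr = MatAssemblyBegin (X, MAT_FINAL_ASSEMBLY); CHKERRQ (ierr);
>   ierr = MatAssemblyEnd (X, MAT_FINAL_ASSEMBLY); CHKERRQ (ierr);
>   ierr = MatAssemblyBegin (Xsq, MAT_FINAL_ASSEMBLY); CHKERRQ (ierr);
>   ierr = MatAssemblyEnd (Xsq, MAT_FINAL_ASSEMBLY); CHKERRQ (ierr);
>
>   /* <x> = u^H X u and <x^2> = u^H Xsq u. Note VecDot(a,b) returns
>      b^H a, i.e. the second argument is the one conjugated. */
>   ierr = VecDuplicate (u, &w); CHKERRQ (ierr);
>   ierr = MatMult (X, u, w); CHKERRQ (ierr);
>   ierr = VecDot (w, u, &xbar); CHKERRQ (ierr);
>   ierr = MatMult (Xsq, u, w); CHKERRQ (ierr);
>   ierr = VecDot (w, u, &x2bar); CHKERRQ (ierr);
>
>   /* var(x) = <x^2> - <x>^2; both are real for these operators */
>   *var = PetscRealPart (x2bar) - PetscRealPart (xbar) * PetscRealPart (xbar);
>
>   ierr = VecDestroy (&w); CHKERRQ (ierr);
>   ierr = MatDestroy (&X); CHKERRQ (ierr);
>   ierr = MatDestroy (&Xsq); CHKERRQ (ierr);
>   return 0;
> }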
>
> Regards,
> Analabha
>
>
>
>
> On Thu, Oct 16, 2014 at 1:18 PM, Analabha Roy <hariseldon99 at gmail.com>
> wrote:
>
>> Hi all,
>>
>> Thanks for the response.
>>
>> On Wed, Oct 15, 2014 at 4:15 PM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>>>
>>> I am not sure what you are doing at line 483. You seem to have all
>>> processes indexing a Vec
>>> at all entries, which cannot be correct.
>>>
>>>
>> Yes, you're right, of course. I have removed that bit (I'll do it in
>> post-processing after I have dumped the full state data to file).
>>
>> In the meantime, the graphical output is still messed up when the code is
>> run with the debug flag in parallel. However, parallel graphical output
>> works fine in the original ex4.c. I can't figure out why...
>>
>> The updated diff (link to a pretty version
>> <http://www.diffnow.com/?report=49thu>) is attached and posted below.
>>
>>
>> Thanks and Regards,
>> Analabha
>>
>> 0a1,2
>> > //PROBLEMS:
>> > //FIX PARALLELIZATION PROBLEM
>> 3,8c5,11
>> < "Solves a simple time-dependent linear PDE (the heat equation).\n\
>> < Input parameters include:\n\
>> < -m <points>, where <points> = number of grid points\n\
>> < -time_dependent_rhs : Treat the problem as having a time-dependent right-hand side\n\
>> < -debug : Activate debugging printouts\n\
>> < -nox : Deactivate x-window graphics\n\n";
>> ---
>> > "Solves a simple time-dependent Schroedinger equation for a TB model
>> with periodic drive.\n\
>> > Apart from PetSc's routine options, input parameters include:\n\
>> > -m <pnts>, where <pnts> = lattice size\n\
>> > -a <ampl>, where <ampl> = drive amplitude\n\
>> > -d <strg>, where <strg> = disorder strength\n\
>> > -w <freq>, where <freq> = drive frequency\n\
>> > -debug : Activate debugging printouts\n\n";
>> 11,14c14
>> < Concepts: TS^time-dependent linear problems
>> < Concepts: TS^heat equation
>> < Concepts: TS^diffusion equation
>> < Processors: n
>> ---
>> > Default problem parameters. These are overwritten by argument vectors
>> 15a16,29
>> > #define DEFAULT_SIZE 100
>> > #define DEFAULT_ALPHA 0.3
>> > #define DEFAULT_AMPL 3.0
>> > #define DEFAULT_W 3.0
>> > #define TIME_TOTAL_MAX 100.0 /* default max total time */
>> > #define TIME_STEPS_MAX 1E5 /* default max timesteps */
>> > /*
>> > Default problem parameters. These are always hard coded and never
>> > overwritten
>> > */
>> > #define INIT_STEP 1E-4
>> > #define INIT_TIME 0.0
>> > #define ABSERROR 1E-9
>> > #define RELERROR 1E-9
>> 19,31c33,39
>> < This program solves the one-dimensional heat equation (also called the
>> < diffusion equation),
>> < u_t = u_xx,
>> < on the domain 0 <= x <= 1, with the boundary conditions
>> < u(t,0) = 0, u(t,1) = 0,
>> < and the initial condition
>> < u(0,x) = sin(6*pi*x) + 3*sin(2*pi*x).
>> < This is a linear, second-order, parabolic equation.
>> <
>> < We discretize the right-hand side using finite differences with
>> < uniform grid spacing h:
>> < u_xx = (u_{i+1} - 2u_{i} + u_{i-1})/(h^2)
>> < We then demonstrate time evolution using the various TS methods by
>> ---
>> > This program solves the one-dimensional Schroedinger equation in a
>> > tightly bound lattice with a periodic time dependence
>> > \partial_t u_m(t) = \sum_m J_{lm}(t) u_m(t),
>> > on the domain 0 <= m < L, with the boundary conditions
>> > u_m(0) = 0 for all m except m=[L/2], where u_m(0) = 1
>> > This is a set of linear, first-order ODEs
>> > The time evolution using the various TS methods can be run by
>> 34,46d41
>> <
>> < We compare the approximate solution with the exact solution, given by
>> < u_exact(x,t) = exp(-36*pi*pi*t) * sin(6*pi*x) +
>> < 3*exp(-4*pi*pi*t) * sin(2*pi*x)
>> <
>> < Notes:
>> < This code demonstrates the TS solver interface to two variants of
>> < linear problems, u_t = f(u,t), namely
>> < - time-dependent f: f(u,t) is a function of t
>> < - time-independent f: f(u,t) is simply f(u)
>> <
>> < The uniprocessor version of this code is ts/examples/tutorials/ex3.c
>> <
>> 59a55
>> >
>> 64d59
>> <
>> 75d69
>> < Vec solution; /* global exact solution vector */
>> 77d70
>> < PetscReal h; /* mesh width h = 1/(m-1) */
>> 79,80c72,77
>> < PetscViewer viewer1, viewer2; /* viewers for the solution and error */
>> < PetscReal norm_2, norm_max; /* error norms */
>> ---
>> > PetscViewer viewer1; /* viewer for the solution */
>> > PetscReal norm_2; /* wavefunction norm */
>> > Vec randoms; /* random numbers between -1 and 1 */
>> > PetscReal alpha; /* disorder strength */
>> > PetscReal h0; /* drive amplitude */
>> > PetscReal w; /* drive frequency */
>> 85a83
>> > extern PetscInt kdel (PetscInt, PetscInt);
>> 87,88c85,86
>> < extern PetscErrorCode RHSMatrixHeat (TS, PetscReal, Vec, Mat, Mat, void *);
>> < extern PetscErrorCode RHSFunctionHeat (TS, PetscReal, Vec, Vec, void *);
>> ---
>> > extern PetscErrorCode RHSMatrixSchrodinger (TS, PetscReal, Vec, Mat, Mat, void *);
>> 90d87
>> < extern PetscErrorCode ExactSolution (PetscReal, Vec, AppCtx *);
>> 101,102c98,99
>> < PetscReal time_total_max = 1.0; /* default max total time */
>> < PetscInt time_steps_max = 100; /* default max timesteps */
>> ---
>> > PetscReal time_total_max = TIME_TOTAL_MAX; /* default max total time */
>> > PetscInt time_steps_max = TIME_STEPS_MAX; /* default max timesteps */
>> 108,109c105,107
>> < PetscBool flg;
>> < TSProblemType tsproblem = TS_LINEAR;
>> ---
>> > PetscReal alpha; /* Disorder strength */
>> > PetscReal h0; /* Drive amplitude */
>> > PetscReal w; /* drive frequency */
>> 119c117
>> < m = 60;
>> ---
>> > m = DEFAULT_SIZE;
>> 121a120,132
>> >
>> > h0 = DEFAULT_AMPL;
>> > ierr = PetscOptionsGetReal (NULL, "-a", &h0, NULL);
>> > CHKERRQ (ierr);
>> >
>> > alpha = DEFAULT_ALPHA;
>> > ierr = PetscOptionsGetReal (NULL, "-d", &alpha, NULL);
>> > CHKERRQ (ierr);
>> >
>> > w = DEFAULT_W;
>> > ierr = PetscOptionsGetReal (NULL, "-w", &w, NULL);
>> > CHKERRQ (ierr);
>> >
>> 123a135
>> >
>> 125,127c137,141
>> < appctx.h = 1.0 / (m - 1.0);
>> < appctx.norm_2 = 0.0;
>> < appctx.norm_max = 0.0;
>> ---
>> > appctx.alpha = alpha;
>> > appctx.h0 = h0;
>> > appctx.w = w;
>> >
>> > appctx.norm_2 = 1.0;
>> 163d176
>> < create global work vector for storing exact solution.
>> 167,168d179
>> < ierr = VecDuplicate (u, &appctx.solution);
>> < CHKERRQ (ierr);
>> 171c182
>> < Set up displays to show graphs of the solution and error
>> ---
>> > Set up displays to show graphs of the solution
>> 173,189c184,195
>> <
>> < ierr =
>> < PetscViewerDrawOpen (PETSC_COMM_WORLD, 0, "", 80, 380, 400, 160,
>> < &appctx.viewer1);
>> < CHKERRQ (ierr);
>> < ierr = PetscViewerDrawGetDraw (appctx.viewer1, 0, &draw);
>> < CHKERRQ (ierr);
>> < ierr = PetscDrawSetDoubleBuffer (draw);
>> < CHKERRQ (ierr);
>> < ierr =
>> < PetscViewerDrawOpen (PETSC_COMM_WORLD, 0, "", 80, 0, 400, 160,
>> < &appctx.viewer2);
>> < CHKERRQ (ierr);
>> < ierr = PetscViewerDrawGetDraw (appctx.viewer2, 0, &draw);
>> < CHKERRQ (ierr);
>> < ierr = PetscDrawSetDoubleBuffer (draw);
>> < CHKERRQ (ierr);
>> ---
>> > if (appctx.debug)
>> > {
>> > ierr =
>> > PetscViewerDrawOpen (PETSC_COMM_WORLD, 0, "Probability Distribution",
>> > PETSC_DECIDE, PETSC_DECIDE, PETSC_DRAW_HALF_SIZE,
>> > PETSC_DRAW_HALF_SIZE, &appctx.viewer1);
>> > CHKERRQ (ierr);
>> > ierr = PetscViewerDrawGetDraw (appctx.viewer1, 0, &draw);
>> > CHKERRQ (ierr);
>> > ierr = PetscDrawSetDoubleBuffer (draw);
>> > CHKERRQ (ierr);
>> > }
>> 197,202c203
>> <
>> < flg = PETSC_FALSE;
>> < ierr = PetscOptionsGetBool (NULL, "-nonlinear", &flg, NULL);
>> < CHKERRQ (ierr);
>> < ierr = TSSetProblemType (ts, flg ? TS_NONLINEAR : TS_LINEAR);
>> < CHKERRQ (ierr);
>> ---
>> > TSSetProblemType (ts, TS_LINEAR); /* In this case, the dynamics is always linear */
>> 205c206
>> < Set optional user-defined monitoring routine
>> ---
>> > Set user-defined monitoring routine
>> 211d211
>> <
>> 224,225c224,257
>> < flg = PETSC_FALSE;
>> < ierr = PetscOptionsGetBool (NULL, "-time_dependent_rhs", &flg, NULL);
>> ---
>> > /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>> > Create random number context and generate random numbers
>> > into application context
>> > - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>> > PetscRandom rctx; /* random number generator context */
>> > PetscScalar randomscale = 2.0 * PETSC_i * alpha;
>> >
>> > ierr = PetscRandomCreate (PETSC_COMM_SELF, &rctx);
>> > CHKERRQ (ierr);
>> > ierr = PetscRandomSetFromOptions (rctx);
>> > CHKERRQ (ierr);
>> > ierr = PetscRandomSetInterval (rctx, -1.0, 1.0);
>> > CHKERRQ (ierr);
>> > ierr = VecCreate (PETSC_COMM_SELF, &appctx.randoms);
>> > CHKERRQ (ierr);
>> > ierr = VecSetSizes (appctx.randoms, PETSC_DECIDE, m);
>> > CHKERRQ (ierr);
>> > ierr = VecSetFromOptions (appctx.randoms);
>> > CHKERRQ (ierr);
>> > ierr = VecSetRandom (appctx.randoms, rctx);
>> > CHKERRQ (ierr);
>> > ierr = VecScale (appctx.randoms, randomscale);
>> > CHKERRQ (ierr);
>> > ierr = PetscRandomDestroy (&rctx);
>> > CHKERRQ (ierr);
>> >
>> > /*
>> > For linear problems with a time-dependent f(u,t) in the equation
>> > u_t = f(u,t), the user provides the discretized right-hand-side
>> > as a time-dependent matrix.
>> > */
>> > ierr = TSSetRHSFunction (ts, NULL, TSComputeRHSFunctionLinear, &appctx);
>> > CHKERRQ (ierr);
>> > ierr = TSSetRHSJacobian (ts, A, A, RHSMatrixSchrodinger, &appctx);
>> 227,254d258
>> < if (flg)
>> < {
>> < /*
>> < For linear problems with a time-dependent f(u,t) in the equation
>> < u_t = f(u,t), the user provides the discretized right-hand-side
>> < as a time-dependent matrix.
>> < */
>> < ierr = TSSetRHSFunction (ts, NULL, TSComputeRHSFunctionLinear, &appctx);
>> < CHKERRQ (ierr);
>> < ierr = TSSetRHSJacobian (ts, A, A, RHSMatrixHeat, &appctx);
>> < CHKERRQ (ierr);
>> < }
>> < else
>> < {
>> < /*
>> < For linear problems with a time-independent f(u) in the equation
>> < u_t = f(u), the user provides the discretized right-hand-side
>> < as a matrix only once, and then sets a null matrix evaluation
>> < routine.
>> < */
>> < ierr = RHSMatrixHeat (ts, 0.0, u, A, A, &appctx);
>> < CHKERRQ (ierr);
>> < ierr = TSSetRHSFunction (ts, NULL, TSComputeRHSFunctionLinear, &appctx);
>> < CHKERRQ (ierr);
>> < ierr =
>> < TSSetRHSJacobian (ts, A, A, TSComputeRHSJacobianConstant, &appctx);
>> < CHKERRQ (ierr);
>> < }
>> 256,266d259
>> < if (tsproblem == TS_NONLINEAR)
>> < {
>> < SNES snes;
>> < ierr = TSSetRHSFunction (ts, NULL, RHSFunctionHeat, &appctx);
>> < CHKERRQ (ierr);
>> < ierr = TSGetSNES (ts, &snes);
>> < CHKERRQ (ierr);
>> < ierr =
>> < SNESSetJacobian (snes, NULL, NULL, SNESComputeJacobianDefault, NULL);
>> < CHKERRQ (ierr);
>> < }
>> 272,273c265,266
>> < dt = appctx.h * appctx.h / 2.0;
>> < ierr = TSSetInitialTimeStep (ts, 0.0, dt);
>> ---
>> > dt = INIT_STEP;
>> > ierr = TSSetInitialTimeStep (ts, INIT_TIME, dt);
>> 281a275
>> > - Set default tolerances
>> 287a282,285
>> > ierr = TSSetType (ts, TSRK);
>> > CHKERRQ (ierr);
>> > ierr = TSRKSetType (ts, TSRK5DP); //Default solver is Runge Kutta (5) Prince-Dormand (4)
>> > CHKERRQ (ierr);
>> 289a288,291
>> >
>> > ierr = TSSetTolerances (ts, ABSERROR, NULL, RELERROR, NULL);
>> > CHKERRQ (ierr);
>> >
>> 321,324c323,324
>> < PetscPrintf (PETSC_COMM_WORLD,
>> < "Avg. error (2 norm) = %g Avg. error (max norm) = %g\n",
>> < (double) (appctx.norm_2 / steps),
>> < (double) (appctx.norm_max / steps));
>> ---
>> > PetscPrintf (PETSC_COMM_WORLD, "Final. (2 norm) = %g \n",
>> > (double) (appctx.norm_2));
>> 338,341c338,343
>> < ierr = PetscViewerDestroy (&appctx.viewer1);
>> < CHKERRQ (ierr);
>> < ierr = PetscViewerDestroy (&appctx.viewer2);
>> < CHKERRQ (ierr);
>> ---
>> >
>> > if (appctx.debug)
>> > {
>> > ierr = PetscViewerDestroy (&appctx.viewer1);
>> > CHKERRQ (ierr);
>> > }
>> 344,345d345
>> < ierr = VecDestroy (&appctx.solution);
>> < CHKERRQ (ierr);
>> 347a348,349
>> > ierr = VecDestroy (&appctx.randoms);
>> > CHKERRQ (ierr);
>> 377c379
>> < PetscScalar *u_localptr, h = appctx->h;
>> ---
>> > PetscScalar *u_localptr;
>> 379a382
>> > PetscScalar zero = 0.0, one = 1.0;
>> 394,395d396
>> < - Note that the Fortran interface to VecGetArray() differs from the
>> < C version. See the users manual for details.
>> 402,403c403,404
>> < directly into the array locations. Alternatively, we could use
>> < VecSetValues() or VecSetValuesLocal().
>> ---
>> > directly into the array locations.
>> > This will set the midpoint to unity
>> 406,408c407,410
>> < u_localptr[i - mybase] =
>> < PetscSinScalar (PETSC_PI * i * 6. * h) +
>> < 3. * PetscSinScalar (PETSC_PI * i * 2. * h);
>> ---
>> > if (i - mybase == floor ((appctx->m) / 2))
>> > u_localptr[i - mybase] = one;
>> > else
>> > u_localptr[i - mybase] = zero;
>> 421c423
>> < ierr = PetscPrintf (appctx->comm, "initial guess vector\n");
>> ---
>> > ierr = PetscPrintf (appctx->comm, "initial vector\n");
>> 432,486d433
>> < #define __FUNCT__ "ExactSolution"
>> < /*
>> < ExactSolution - Computes the exact solution at a given time.
>> <
>> < Input Parameters:
>> < t - current time
>> < solution - vector in which exact solution will be computed
>> < appctx - user-defined application context
>> <
>> < Output Parameter:
>> < solution - vector with the newly computed exact solution
>> < */
>> < PetscErrorCode
>> < ExactSolution (PetscReal t, Vec solution, AppCtx * appctx)
>> < {
>> < PetscScalar *s_localptr, h = appctx->h, ex1, ex2, sc1, sc2;
>> < PetscInt i, mybase, myend;
>> < PetscErrorCode ierr;
>> <
>> < /*
>> < Determine starting and ending points of each processor's
>> < range of grid values
>> < */
>> < ierr = VecGetOwnershipRange (solution, &mybase, &myend);
>> < CHKERRQ (ierr);
>> <
>> < /*
>> < Get a pointer to vector data.
>> < */
>> < ierr = VecGetArray (solution, &s_localptr);
>> < CHKERRQ (ierr);
>> <
>> < /*
>> < Simply write the solution directly into the array locations.
>> < Alternatively, we culd use VecSetValues() or VecSetValuesLocal().
>> < */
>> < ex1 = PetscExpReal (-36. * PETSC_PI * PETSC_PI * t);
>> < ex2 = PetscExpReal (-4. * PETSC_PI * PETSC_PI * t);
>> < sc1 = PETSC_PI * 6. * h;
>> < sc2 = PETSC_PI * 2. * h;
>> < for (i = mybase; i < myend; i++)
>> < s_localptr[i - mybase] =
>> < PetscSinScalar (sc1 * (PetscReal) i) * ex1 +
>> < 3. * PetscSinScalar (sc2 * (PetscReal) i) * ex2;
>> <
>> < /*
>> < Restore vector
>> < */
>> < ierr = VecRestoreArray (solution, &s_localptr);
>> < CHKERRQ (ierr);
>> < return 0;
>> < }
>> <
>> < /* --------------------------------------------------------------------- */
>> < #undef __FUNCT__
>> 490,491c437
>> < each timestep. This example plots the solution and computes the
>> < error in two different norms.
>> ---
>> > each timestep. This example plots the solution and returns the norm
>> 509,515c455
>> < PetscReal norm_2, norm_max;
>> <
>> < /*
>> < View a graph of the current iterate
>> < */
>> < ierr = VecView (u, appctx->viewer2);
>> < CHKERRQ (ierr);
>> ---
>> > PetscReal norm_2;
>> 518c458
>> < Compute the exact solution
>> ---
>> > Compute and store norm of wavefunction in norm_2
>> 520c460
>> < ierr = ExactSolution (time, appctx->solution, appctx);
>> ---
>> > ierr = VecNorm (u, NORM_2, &norm_2);
>> 521a462
>> > appctx->norm_2 = norm_2;
>> 523,536d463
>> < /*
>> < Print debugging information if desired
>> < */
>> < if (appctx->debug)
>> < {
>> < ierr = PetscPrintf (appctx->comm, "Computed solution vector\n");
>> < CHKERRQ (ierr);
>> < ierr = VecView (u, PETSC_VIEWER_STDOUT_WORLD);
>> < CHKERRQ (ierr);
>> < ierr = PetscPrintf (appctx->comm, "Exact solution vector\n");
>> < CHKERRQ (ierr);
>> < ierr = VecView (appctx->solution, PETSC_VIEWER_STDOUT_WORLD);
>> < CHKERRQ (ierr);
>> < }
>> 538,541c465,467
>> < /*
>> < Compute the 2-norm and max-norm of the error
>> < */
>> < ierr = VecAXPY (appctx->solution, -1.0, u);
>> ---
>> > /* Build the lattice sites */
>> > Vec uabs;
>> > ierr = VecDuplicate (u, &uabs);
>> 543c469,470
>> < ierr = VecNorm (appctx->solution, NORM_2, &norm_2);
>> ---
>> >
>> > ierr = VecCopy (u, uabs);
>> 545,546c472
>> < norm_2 = PetscSqrtReal (appctx->h) * norm_2;
>> < ierr = VecNorm (appctx->solution, NORM_MAX, &norm_max);
>> ---
>> > ierr = VecAbs (uabs);
>> 549,556c475
>> < /*
>> < PetscPrintf() causes only the first processor in this
>> < communicator to print the timestep information.
>> < */
>> < ierr =
>> < PetscPrintf (appctx->comm,
>> < "Timestep %D: time = %g 2-norm error = %g max norm error = %g\n",
>> < step, (double) time, (double) norm_2, (double) norm_max);
>> ---
>> > ierr = VecPointwiseMult (uabs, uabs, uabs);
>> 558,559d476
>> < appctx->norm_2 += norm_2;
>> < appctx->norm_max += norm_max;
>> 561,565c478,483
>> < /*
>> < View a graph of the error
>> < */
>> < ierr = VecView (appctx->solution, appctx->viewer1);
>> < CHKERRQ (ierr);
>> ---
>> > //if (!appctx->debug)
>> > //{
>> > /* Create another viewer in application context and dump full output to it */
>> > //ierr = VecView (u, appctx->viewer2);
>> > //CHKERRQ (ierr);
>> > //}
>> 572,574c490,497
>> < ierr = PetscPrintf (appctx->comm, "Error vector\n");
>> < CHKERRQ (ierr);
>> < ierr = VecView (appctx->solution, PETSC_VIEWER_STDOUT_WORLD);
>> ---
>> > /*
>> > View a graph of the current iterate
>> > */
>> > ierr =
>> > PetscPrintf (PETSC_COMM_WORLD, "time = %lf\t norm = %lf\n", time,
>> > norm_2);
>> >
>> > ierr = VecView (uabs, appctx->viewer1);
>> 576a500,501
>> > ierr = VecDestroy (&uabs);
>> > CHKERRQ (ierr);
>> 583c508
>> < #define __FUNCT__ "RHSMatrixHeat"
>> ---
>> > #define __FUNCT__ "RHSMatrixSchrodinger"
>> 585c510
>> < RHSMatrixHeat - User-provided routine to compute the right-hand-side
>> ---
>> > RHSMatrixSchrodinger - User-provided routine to compute the right-hand-side
>> 599,600c524
>> < Notes:
>> < RHSMatrixHeat computes entries for the locally owned part of the system.
>> ---
>> > RHSMatrixSchrodinger computes entries for the locally owned part of the system.
>> 613c537
>> < RHSMatrixHeat (TS ts, PetscReal t, Vec X, Mat AA, Mat BB, void *ctx)
>> ---
>> > RHSMatrixSchrodinger (TS ts, PetscReal t, Vec X, Mat AA, Mat BB, void *ctx)
>> 618,619c542,554
>> < PetscInt i, mstart, mend, idx[3];
>> < PetscScalar v[3], stwo = -2. / (appctx->h * appctx->h), sone = -.5 * stwo;
>> ---
>> > PetscInt m = appctx->m;
>> > PetscScalar drive;
>> > PetscReal h0 = appctx->h0;
>> > PetscReal w = appctx->w;
>> > PetscScalar *randoms_loc;
>> >
>> > ierr = VecGetArray (appctx->randoms, &randoms_loc);
>> > CHKERRQ (ierr);
>> > /*Calculate drive amplitude at this time */
>> >
>> > drive = w * t;
>> > drive = PetscSinReal (drive);
>> > drive = 2.0 * PETSC_i * h0 * drive;
>> 623,625c558,566
>> < - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>> <
>> < ierr = MatGetOwnershipRange (A, &mstart, &mend);
>> ---
>> > The Jacobian Matrix in our case is blocked
>> > - The values are A_{lm} = i\left(\delta_{lm+1}+\delta_{lm-1}\right) + 2i \alpha h_m \delta_{lm} + 2i h(t) \delta_{lm}
>> > - Thus the superdiagonal and subdiagonal elements are just i, and diagonal elements are random + drive
>> > - The elements are set rowwise.
>> > - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
>> > PetscInt rowstart, rowend, rowcount;
>> > PetscScalar edge[2], middle[3];
>> > PetscInt idxedge[2], idxmiddle[3];
>> > ierr = MatGetOwnershipRange (A, &rowstart, &rowend);
>> 627,655c568
>> <
>> < /*
>> < Set matrix rows corresponding to boundary data
>> < */
>> <
>> < if (mstart == 0)
>> < { /* first processor only */
>> < v[0] = 1.0;
>> < ierr = MatSetValues (A, 1, &mstart, 1, &mstart, v, INSERT_VALUES);
>> < CHKERRQ (ierr);
>> < mstart++;
>> < }
>> <
>> < if (mend == appctx->m)
>> < { /* last processor only */
>> < mend--;
>> < v[0] = 1.0;
>> < ierr = MatSetValues (A, 1, &mend, 1, &mend, v, INSERT_VALUES);
>> < CHKERRQ (ierr);
>> < }
>> <
>> < /*
>> < Set matrix rows corresponding to interior data. We construct the
>> < matrix one row at a time.
>> < */
>> < v[0] = sone;
>> < v[1] = stwo;
>> < v[2] = sone;
>> < for (i = mstart; i < mend; i++)
>> ---
>> > for (rowcount = rowstart; rowcount < rowend; rowcount++)
>> 657,661c570,602
>> < idx[0] = i - 1;
>> < idx[1] = i;
>> < idx[2] = i + 1;
>> < ierr = MatSetValues (A, 1, &i, 3, idx, v, INSERT_VALUES);
>> < CHKERRQ (ierr);
>> ---
>> > if (rowcount == 0) //If it is the first row of the whole matrix
>> > {
>> > edge[0] = drive + randoms_loc[0];
>> > edge[1] = PETSC_i;
>> > idxedge[0] = 0;
>> > idxedge[1] = 1;
>> > ierr =
>> > MatSetValues (A, 1, &rowcount, 2, idxedge, edge, INSERT_VALUES);
>> > CHKERRQ (ierr);
>> > }
>> > else if (rowcount == m - 1) //If it is the last row of the whole matrix
>> > {
>> > edge[0] = PETSC_i;
>> > edge[1] = drive + randoms_loc[m - 1];
>> > idxedge[0] = m - 2;
>> > idxedge[1] = m - 1;
>> > ierr =
>> > MatSetValues (A, 1, &rowcount, 2, idxedge, edge, INSERT_VALUES);
>> > CHKERRQ (ierr);
>> > }
>> > else
>> > {
>> > middle[0] = PETSC_i;
>> > middle[1] = drive + randoms_loc[rowcount];
>> > middle[2] = PETSC_i;
>> > idxmiddle[0] = rowcount - 1;
>> > idxmiddle[1] = rowcount;
>> > idxmiddle[2] = rowcount + 1;
>> > ierr =
>> > MatSetValues (A, 1, &rowcount, 3, idxmiddle, middle,
>> > INSERT_VALUES);
>> > CHKERRQ (ierr);
>> > }
>> 683a625,626
>> > ierr = VecRestoreArray (appctx->randoms, &randoms_loc);
>> > CHKERRQ (ierr);
>> 689,691c632,634
>> < #define __FUNCT__ "RHSFunctionHeat"
>> < PetscErrorCode
>> < RHSFunctionHeat (TS ts, PetscReal t, Vec globalin, Vec globalout, void *ctx)
>> ---
>> > #define __FUNCT__ "kdel"
>> > PetscInt
>> > kdel (PetscInt i, PetscInt j)
>> 693,704c636,639
>> < PetscErrorCode ierr;
>> < Mat A;
>> <
>> < PetscFunctionBeginUser;
>> < ierr = TSGetRHSJacobian (ts, &A, NULL, NULL, &ctx);
>> < CHKERRQ (ierr);
>> < ierr = RHSMatrixHeat (ts, t, globalin, A, NULL, ctx);
>> < CHKERRQ (ierr);
>> < /* ierr = MatView(A,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); */
>> < ierr = MatMult (A, globalin, globalout);
>> < CHKERRQ (ierr);
>> < PetscFunctionReturn (0);
>> ---
>> > PetscInt value = 0;
>> > if (i == j)
>> > value = 1;
>> > return value;
>>
>>
>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>
>>
>>
>> --
>> ---
>> *Analabha Roy*
>> C.S.I.R <http://www.csir.res.in> Senior Research Associate
>> <http://csirhrdg.res.in/poolsra.htm>
>> Saha Institute of Nuclear Physics <http://www.saha.ac.in>
>> Section 1, Block AF
>> Bidhannagar, Calcutta 700064
>> India
>> *Emails*: daneel at physics.utexas.edu, hariseldon99 at gmail.com
>> *Webpage*: http://www.ph.utexas.edu/~daneel/
>>
>
>
>
> --
> ---
> *Analabha Roy*
> C.S.I.R <http://www.csir.res.in> Senior Research Associate
> <http://csirhrdg.res.in/poolsra.htm>
> Saha Institute of Nuclear Physics <http://www.saha.ac.in>
> Section 1, Block AF
> Bidhannagar, Calcutta 700064
> India
> *Emails*: daneel at physics.utexas.edu, hariseldon99 at gmail.com
> *Webpage*: http://www.ph.utexas.edu/~daneel/
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener