[petsc-users] a question about DAE Function

Jin, Shuangshuang Shuangshuang.Jin at pnnl.gov
Wed Jun 5 13:45:37 CDT 2013


Hi, thanks for your message, but I still have a few questions here.

1. How do I "reuse the VecScatter"?

2. Is the O(N) data movement for scattering the X vector to each processor, and the O(N^3) computation for solving the DAE equations in parallel? You indicated that "the parallelism will be useful if N is large enough". Does that mean my current implementation, which scatters X and defines f[] on different processors to solve the DAE in parallel, is the right way to do it? If so, what is the benefit of using MatElemental?

3. When I looked into the MatElemental examples, they all seem to be matrix operations. I cannot see their connection to my code, especially the IFunction part. I only have several function definitions f[] that use the data of a distributed array X, and I don't know where or how to use MatElemental here.

Please help me clarify these questions, which I'm really confused about.

Thanks,
Shuangshuang


-----Original Message-----
From: Jed Brown [mailto:five9a2 at gmail.com] On Behalf Of Jed Brown
Sent: Tuesday, June 04, 2013 4:28 PM
To: Jin, Shuangshuang
Cc: petsc-users at mcs.anl.gov
Subject: RE: [petsc-users] a question about DAE Function

"Jin, Shuangshuang" <Shuangshuang.Jin at pnnl.gov> writes:

> Sorry about the confusion. Please allow me to illustrate the problem in a better way below:
>
> I have a set of DAE equations, which can be defined as 4*n f[] equations:
>   

[...]

> As you may see, all the f functions are dependent on some x values. 
> However, the x is a distributed array of size 4*n.

Thanks, this problem is dense.

> In order to access x[] from different processors, I have to scatter x 
> before I use it to define f[]; the scatterMyVec function is shown 
> below.
>
> // Scatter the contents of v to each processor to get a global array (vArray) of values
> static PetscErrorCode scatterMyVec(Vec vin, PetscScalar **voutArray) {
>   PetscErrorCode ierr;
>   VecScatter ctx;
>   Vec vout;
>
>   ierr = VecScatterCreateToAll(vin, &ctx, &vout); CHKERRQ(ierr);
>   ierr = VecScatterBegin(ctx, vin, vout, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
>   ierr = VecScatterEnd(ctx, vin, vout, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
>   ierr = VecGetArray(vout, voutArray); CHKERRQ(ierr);
>
>   ierr = VecScatterDestroy(&ctx); CHKERRQ(ierr);

Note that you can reuse the VecScatter.
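
For example, something along these lines (a rough sketch; "AppCtx", "user", and the field names are placeholders, not from your code):

  /* In an application context, set up once before TSSolve(): */
  typedef struct {
    VecScatter scatter;  /* scatters the distributed X to every process */
    Vec        xall;     /* sequential vector holding all 4*n entries   */
  } AppCtx;

  ierr = VecScatterCreateToAll(X, &user->scatter, &user->xall); CHKERRQ(ierr);

  /* Inside the IFunction, reuse the same scatter at every evaluation: */
  ierr = VecScatterBegin(user->scatter, X, user->xall, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
  ierr = VecScatterEnd(user->scatter, X, user->xall, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);

  /* After TSSolve(), clean up once: */
  ierr = VecScatterDestroy(&user->scatter); CHKERRQ(ierr);
  ierr = VecDestroy(&user->xall); CHKERRQ(ierr);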

>   ierr = VecRestoreArray(vout, PETSC_NULL);

Please don't abuse VecRestoreArray to lie to PETSc.  Yes, it will "work"
with normal PETSc Vecs, but not in general (e.g. GPU-backed Vecs).
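
Get the array, use it, and restore the same pointer when you are done, e.g. (a sketch using the read-only variant, since the IFunction only reads x):

  const PetscScalar *xarr;
  ierr = VecGetArrayRead(vout, &xarr); CHKERRQ(ierr);
  /* ... read xarr[0 .. 4*n-1] to evaluate f[] ... */
  ierr = VecRestoreArrayRead(vout, &xarr); CHKERRQ(ierr);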

>
>   PetscFunctionReturn(0);
> }
>
> As I mentioned in the previous email, I think this is quite 
> inefficient because the scattering of x during each iteration 
> "requires a good chunk of communication". And as Barry pointed out, "
> the Jacobian will be dense and the linear solves with the dense 
> Jacobian will dominate the time anyways. "
>
> So my question now is: 1. Am I right to use the PETSc TS parallel 
> paradigm in this way to solve my DAE equations? My concern is that 
> although I can see the solution being computed across multiple 
> processors, the overhead of scattering x at each iteration and the 
> linear solves with the dense Jacobian will eat up the performance 
> gains of parallel computation.

The data movement is O(N) while the computation is O(N^3), so parallelism will be useful if N is large enough.

> 2. If the answer of the above question is "NO", what's the correct 
> parallel tool I should look into in PETSc to solve my DAE equations?
> MatElemental as most of you suggested?

TS with MatElemental is right.  You should use MatGetOwnershipIS() to determine which matrix entries you are responsible for computing (a sketch follows the list of example files below).  You can use your current code for the IFunction evaluation.  Here are some tests that use MatElemental:

src/ksp/ksp/examples/tests/ex40.c
src/mat/examples/tests/ex104.c
src/mat/examples/tests/ex145.c
src/mat/examples/tests/ex38.c
src/mat/examples/tests/ex39.c
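
For the IJacobian, the ownership index sets tell you which (row, column) pairs your process fills. A rough sketch (names are illustrative, and the entry value is whatever your analytic dF/dx is; this only sets locally owned entries):

  IS             rows, cols;
  const PetscInt *ridx, *cidx;
  PetscInt       nr, nc, i, j;

  ierr = MatGetOwnershipIS(J, &rows, &cols); CHKERRQ(ierr);
  ierr = ISGetLocalSize(rows, &nr); CHKERRQ(ierr);
  ierr = ISGetLocalSize(cols, &nc); CHKERRQ(ierr);
  ierr = ISGetIndices(rows, &ridx); CHKERRQ(ierr);
  ierr = ISGetIndices(cols, &cidx); CHKERRQ(ierr);
  for (i = 0; i < nr; i++) {
    for (j = 0; j < nc; j++) {
      PetscScalar v = 0.0; /* replace with dF_{ridx[i]}/dx_{cidx[j]} from your equations */
      ierr = MatSetValues(J, 1, &ridx[i], 1, &cidx[j], &v, INSERT_VALUES); CHKERRQ(ierr);
    }
  }
  ierr = ISRestoreIndices(rows, &ridx); CHKERRQ(ierr);
  ierr = ISRestoreIndices(cols, &cidx); CHKERRQ(ierr);
  ierr = ISDestroy(&rows); CHKERRQ(ierr);
  ierr = ISDestroy(&cols); CHKERRQ(ierr);
  ierr = MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);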

