[petsc-dev] Possible bugs when using TS with ViennaCL

Mani Chandra mc0710 at gmail.com
Sun Jan 26 12:52:36 CST 2014


Hi Karl,

Thanks! It worked. However the following cases still don't work:

1) Using ComputeResidual (not ComputeResidualViennaCL) but with
DMSetVecType(da, VECVIENNACL) and DMSetMatType(da, MATAIJVIENNACL). The
SNES solver converges but gives nonsensical results. I just get a flashing
blob.

2) Using ComputeResidualViennaCL (not ComputeResidual) and *without*
DMSetVecType(da, VECVIENNACL) and DMSetMatType(da, MATAIJVIENNACL). The
SNES solver converges but gives nonsensical results. I just get a flashing
blob as in 1) above. This case probably should not work anyway, because I'm
not sure VecViennaCLGetArray works if the Vecs have not been set to ViennaCL.
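
For orientation, a minimal sketch of how the two setups above differ. Only
the DMSetVecType/DMSetMatType and TSSetIFunction lines mirror the cases
described; the surrounding DMDA/TS skeleton (grid sizes, option handling,
initial condition) is illustrative rather than taken from the attached code,
and enum/argument names follow a recent PETSc, so they may differ slightly
from the 2014 dev tree:

  #include <petscdmda.h>
  #include <petscts.h>

  /* Residual callbacks as in the attached test code (bodies omitted). */
  extern PetscErrorCode ComputeResidual(TS, PetscReal, Vec, Vec, Vec, void*);
  extern PetscErrorCode ComputeResidualViennaCL(TS, PetscReal, Vec, Vec, Vec, void*);

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;
    DM             da;
    TS             ts;
    Vec            soln;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
    ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                        DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                        1, 1, NULL, NULL, &da);CHKERRQ(ierr);

    /* Case 1): ViennaCL Vec/Mat types on the DM; case 2) skips these two calls. */
    ierr = DMSetVecType(da, VECVIENNACL);CHKERRQ(ierr);
    ierr = DMSetMatType(da, MATAIJVIENNACL);CHKERRQ(ierr);
    ierr = DMSetFromOptions(da);CHKERRQ(ierr);
    ierr = DMSetUp(da);CHKERRQ(ierr);

    ierr = DMCreateGlobalVector(da, &soln);CHKERRQ(ierr);
    /* ... fill soln with the initial blob here ... */

    ierr = TSCreate(PETSC_COMM_WORLD, &ts);CHKERRQ(ierr);
    ierr = TSSetDM(ts, da);CHKERRQ(ierr);
    /* Case 1) registers ComputeResidual; case 2) registers ComputeResidualViennaCL. */
    ierr = TSSetIFunction(ts, NULL, ComputeResidual, NULL);CHKERRQ(ierr);
    ierr = TSSetSolution(ts, soln);CHKERRQ(ierr);
    ierr = TSSetFromOptions(ts);CHKERRQ(ierr);   /* -ts_type theta, -ts_dt, ... */
    ierr = TSSolve(ts, soln);CHKERRQ(ierr);

    ierr = VecDestroy(&soln);CHKERRQ(ierr);
    ierr = TSDestroy(&ts);CHKERRQ(ierr);
    ierr = DMDestroy(&da);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }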

Cheers,
Mani


On Sun, Jan 26, 2014 at 4:25 AM, Karl Rupp <rupp at mcs.anl.gov> wrote:

> Hi Mani,
>
> I tested your code with branch `karlrupp/viennacl-getarray`, which is also
> in `next` and just got merged to `master`.
>
> Initially I had to fix up a missing double precision pragma for OpenCL,
> two GCC warnings and a missing local work group size specification in your
> code to get it to run at all, but then I immediately got the correct
> results. The fixed code is attached, changes are tagged with '//FIXED'.
>
> Except for possibly machine-specific issues, the problem was the missing
> local work size specification in your code. Always specify both the local
> and the global work sizes and make sure that the global sizes are divisible
> by the respective local work sizes. Good local work sizes are usually in
> the range 64-256 workers total, hence I set the two local work size
> dimensions to 8. Also, I recommend some additional checks for 'REAL' to be
> identical to 'PetscScalar', otherwise you will get garbage because of
> obvious float<->double incompatibilities at the binary level.
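
For illustration, a minimal sketch of the launch convention described above
(hypothetical helper, queue and kernel names, not taken from the attached
code): the local size is fixed at 8x8 = 64 workers, the global size is padded
up to a multiple of it, and a compile-time check guards the kernel's REAL
against PetscScalar:

  #include <CL/cl.h>
  #include <petscsys.h>

  /* Assumed to match the 'REAL' typedef in the OpenCL kernel source, which
     also needs: #pragma OPENCL EXTENSION cl_khr_fp64 : enable */
  typedef double REAL;
  /* Compile-time size check: the build fails if REAL and PetscScalar differ. */
  typedef char check_real_matches_petscscalar[sizeof(REAL) == sizeof(PetscScalar) ? 1 : -1];

  /* Launch a 2D kernel over an nx-by-ny grid with an explicit 8x8 local size.
     The global size is rounded up to a multiple of the local size in each
     dimension, so the kernel itself must discard out-of-range work items
     (if (i >= nx || j >= ny) return;). */
  static cl_int launch_kernel_2d(cl_command_queue queue, cl_kernel kernel,
                                 size_t nx, size_t ny)
  {
    const size_t local[2]  = {8, 8};
    const size_t global[2] = {((nx + local[0] - 1) / local[0]) * local[0],
                              ((ny + local[1] - 1) / local[1]) * local[1]};
    /* Pass both the global and the local sizes instead of leaving local NULL. */
    return clEnqueueNDRangeKernel(queue, kernel, 2, NULL, global, local,
                                  0, NULL, NULL);
  }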
>
> Best regards,
> Karli
>
>
>
>
> On 01/25/2014 10:22 PM, Mani Chandra wrote:
>
>> Hi Karl,
>>
>> Thanks for looking into it. Do let me know if there is anything I can do
>> to help you debug this. Attached is the code.
>>
>> Cheers,
>> Mani
>>
>>
>> On Sat, Jan 25, 2014 at 3:14 PM, Karl Rupp <rupp at mcs.anl.gov> wrote:
>>
>>     Hi Mani,
>>
>>     could you please send me the code including the error checks?
>>
>>     Thanks and best regards,
>>     Karli
>>
>>
>>
>>     On 01/25/2014 10:07 PM, Mani Chandra wrote:
>>
>>         Hi Karl,
>>
>>         I have now checked the error flags of all PETSc calls in all my
>>         functions. I also recompiled PETSc with debugging. Here's the
>>         report:
>>
>>         Case 1) Still works. All good.
>>
>>         Case 2) This is the case with ComputeResidual using VecGetArray
>>         but with DMSetVecType(da, VECVIENNACL) and DMSetMatType(da,
>>         MATAIJVIENNACL). The flags don't show any errors. *The program
>>         proceeds smoothly but the solution is wrong even though SNES is
>>         converging.*
>>
>>
>>         Case 3) This is the case with ComputeResidualViennaCL using
>>         VecViennaCLGetArrayRead/Write, with and without DMSetVecType(da,
>>         VECVIENNACL) and DMSetMatType(da, MATAIJVIENNACL). In both cases
>>         I get the following error:
>>
>>         [0]PETSC ERROR: TSComputeIFunction() line 676 in
>>         /home/mc/Downloads/petsc/src/ts/interface/ts.c
>>
>>         [0]PETSC ERROR: SNESTSFormFunction_Theta() line 284 in
>>         /home/mc/Downloads/petsc/src/ts/impls/implicit/theta/theta.c
>>
>>         [0]PETSC ERROR: SNESTSFormFunction() line 3499 in
>>         /home/mc/Downloads/petsc/src/ts/interface/ts.c
>>
>>         [0]PETSC ERROR: SNESComputeFunction() line 2089 in
>>         /home/mc/Downloads/petsc/src/snes/interface/snes.c
>>
>>         [0]PETSC ERROR: SNESSolve_NEWTONLS() line 175 in
>>         /home/mc/Downloads/petsc/src/snes/impls/ls/ls.c
>>
>>         [0]PETSC ERROR: SNESSolve() line 3812 in
>>         /home/mc/Downloads/petsc/src/snes/interface/snes.c
>>
>>         [0]PETSC ERROR: TSStep_Theta() line 183 in
>>         /home/mc/Downloads/petsc/src/ts/impls/implicit/theta/theta.c
>>
>>         [0]PETSC ERROR: TSStep() line 2625 in
>>         /home/mc/Downloads/petsc/src/ts/interface/ts.c
>>
>>         [0]PETSC ERROR: TSSolve() line 2741 in
>>         /home/mc/Downloads/petsc/src/ts/interface/ts.c
>>
>>         [0]PETSC ERROR: main() line 83 in
>>         /home/mc/PhD/opencl_tests/petsc_opencl/petsc_opencl.cpp
>>         --------------------------------------------------------------------------
>>
>>         MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>>         with errorcode -473550369.
>>
>>         NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>>         You may or may not see output from other processes, depending on
>>         exactly when Open MPI kills them.
>>         --------------------------------------------------------------------------
>>
>>
>>         The error occurred at VecViennaCLGetArrayRead/Write in my
>>         ComputeResidualViennaCL function. Note that with the PETSc debug
>>         build, the code crashes with the above error even if I don't catch
>>         the error codes. This might be because of some VecViennaCLGetArray
>>         calls inside PETSc itself.
>>
>>         Cheers,
>>         Mani
>>
>>
>>         On Sat, Jan 25, 2014 at 2:48 AM, Karl Rupp <rupp at mcs.anl.gov> wrote:
>>
>>              Hi Mani,
>>
>>              please check the return value of *all* function calls from
>>              PETSc, e.g.
>>                ierr = DMCreateGlobalVector(da, &soln);CHKERRQ(ierr);
>>              instead of just
>>                DMCreateGlobalVector(da, &soln);
>>              Most likely one of the routines threw an error, but your code
>>              just kept going, producing wrong results.
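
A minimal sketch of that pattern in a complete (hypothetical) helper, mainly
to show that CHKERRQ propagates the error out of the enclosing function,
which therefore has to return PetscErrorCode:

  #include <petscdmda.h>

  /* Every PETSc call goes through ierr/CHKERRQ, so a failure stops the
     function immediately instead of being silently ignored. */
  static PetscErrorCode CreateSolutionVector(DM da, Vec *soln)
  {
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = DMCreateGlobalVector(da, soln);CHKERRQ(ierr);
    ierr = VecSet(*soln, 0.0);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }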
>>
>>              Best regards,
>>              Karli
>>
>>
>>
>>              On 01/25/2014 05:35 AM, Mani Chandra wrote:
>>
>>                  Hi Everyone,
>>
>>                  I'm trying to use TS with ViennaCL vecs/mats and residual
>>                  evaluation on device and have encountered some problems.
>>                  I have attached a small test code that illustrates the
>>                  issue.
>>
>>                  The code simply advects a blob diagonally using TS. I have
>>                  written the residual evaluation function using 1) the
>>                  usual Petsc vectors (VecGetArray) and 2) using ViennaCL
>>                  vectors (VecViennaCLGetArrayRead/Write).
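
For reference, a minimal sketch of the device-side residual path built on the
get/restore calls from the karlrupp/viennacl-getarray branch. The header name
and the viennacl::vector<PetscScalar> handle types are assumptions about that
branch, and the kernel launch itself is omitted:

  #include <petscts.h>
  #include <petscviennacl.h>

  /* Wrap the PETSc Vecs as ViennaCL vector handles and hand their OpenCL
     buffers to the residual kernel (launch not shown). */
  PetscErrorCode ComputeResidualViennaCL(TS ts, PetscReal t, Vec X, Vec Xdot,
                                         Vec F, void *ctx)
  {
    PetscErrorCode                       ierr;
    const viennacl::vector<PetscScalar> *x, *xdot;
    viennacl::vector<PetscScalar>       *f;

    PetscFunctionBegin;
    ierr = VecViennaCLGetArrayRead(X, &x);CHKERRQ(ierr);
    ierr = VecViennaCLGetArrayRead(Xdot, &xdot);CHKERRQ(ierr);
    ierr = VecViennaCLGetArrayWrite(F, &f);CHKERRQ(ierr);

    /* ... enqueue the OpenCL residual kernel on x, xdot and f here ... */

    ierr = VecViennaCLRestoreArrayRead(X, &x);CHKERRQ(ierr);
    ierr = VecViennaCLRestoreArrayRead(Xdot, &xdot);CHKERRQ(ierr);
    ierr = VecViennaCLRestoreArrayWrite(F, &f);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }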
>>
>>
>>
>>                  Run the code using the following:
>>                  ./petsc_opencl -ts_monitor -snes_monitor -ts_max_steps 1000
>>                  -ts_type theta -ts_dt 10 -snes_rtol 1e-4 -ts_final_time
>>                  1000 -ts_monitor_draw_solution
>>
>>                  Case 1) No ViennaCL anywhere. I simply use the usual Petsc
>>                  vectors and set the residual evaluation function as
>>                  ComputeResidual (line no. 55). This case works and the
>>                  blob is indeed advected, as can be seen. (I haven't
>>                  bothered with the boundaries. The simulation just stops
>>                  before the blob hits the boundaries.)
>>
>>                  Case 2) We again use ComputeResidual but now enable
>>                  ViennaCL vecs and mats (line nos. 48, 49). This case does
>>                  NOT work. The SNES monitor shows convergence but the
>>                  solution makes no sense.
>>
>>                  Case 3) We now use ComputeResidualViennaCL (line no. 56).
>>                  This does NOT work, either with or without enabling the
>>                  ViennaCL vecs (line nos. 48, 49).
>>
>>                  Cheers,
>>                  Mani
>>
>>
>>
>>
>>
>>
>