[petsc-users] Cuda libraries and DMDA
Matthew Knepley
knepley at gmail.com
Mon Apr 9 19:57:52 CDT 2018
On Mon, Apr 9, 2018 at 7:55 PM, Manuel Valera <mvalera-w at sdsu.edu> wrote:
> On Mon, Apr 9, 2018 at 4:53 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Mon, Apr 9, 2018 at 7:52 PM, Manuel Valera <mvalera-w at sdsu.edu> wrote:
>>
>>> OK, thanks, I'm learning more every day.
>>>
>>> I still get the same error. I am running with -dm_vec_type viennacl
>>> -dm_mat_type aijviennacl -pc_type saviennacl
>>>
>>
>> 1) Are you calling DMSetFromOptions()?
>>
>
> Yes, I get:
>
Let me be more precise. For the DMDA you are using, you must call
DMSetFromOptions() before you call DMCreateMatrix() (or before it is called
automatically) in order to get the Mat type you want from the command line.
If you do not get the right type, you know this has not happened.
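
For example, a minimal Fortran sketch of that ordering (daDummy, A, and ierr
are placeholder names from this thread; the DMDA is assumed to have been
created already, e.g. with DMDACreate3d()):

  call DMSetFromOptions(daDummy, ierr)   ! picks up -dm_vec_type / -dm_mat_type
  call DMSetUp(daDummy, ierr)            ! finish setting up the DMDA
  call DMCreateMatrix(daDummy, A, ierr)  ! A now has the requested type, e.g. aijviennacl
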
Matt
> [valera@node50 Src]$ grep DMSetFromOptions *.f90
> DMDAmetrics.f90:call DMSetFromOptions(daDerivs,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daGrid,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daSingle,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daConstants,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daSgs,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daLaplacian,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daMetrics,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daCenters,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daPressureCoeffs,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daDivSgs,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daDummy,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daVelocities,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daScalars,ierr)
> dmdaobjs.f90: call DMSetFromOptions(daDensity,ierr)
> fileIO.f90: call DMSetFromOptions(daWriteCenters,ierr)
> fileIO.f90: call DMSetFromOptions(daWriteUgrid,ierr)
> fileIO.f90: call DMSetFromOptions(daWriteVgrid,ierr)
> fileIO.f90: call DMSetFromOptions(daWriteWgrid,ierr)
> romsmodule.f90: call DMSetFromOptions(daSeqEastWest_u,ierrp)
> romsmodule.f90: call DMSetFromOptions(daSeqEastWest_v,ierrp)
> romsmodule.f90: call DMSetFromOptions(daSeqEastWest_w,ierrp)
> seamountbeamroms.f90: call DMSetFromOptions(daMinPlaneZ, ierr)
>
>
>>
>> 2) Does your DM have a prefix?
>>
>
> What does this mean? The one used in the KSPSolve is daDummy.
>
> Thanks,
>
>
>
>
>>
>> Matt
>>
>>
>>> The error:
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #1 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #2 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #3 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #4 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #5 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> Finished setting up matrix objects
>>> Exiting PrepareNetCDF
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #6 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #7 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #8 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #9 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.7096996307373047E-002
>>> PETSC_Solve_SSG: 1.7321825027465820E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #10 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #11 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #12 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #13 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.2702941894531250E-002
>>> PETSC_Solve_SSG: 1.2929439544677734E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #14 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #15 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #16 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #17 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.2219429016113281E-002
>>> PETSC_Solve_SSG: 1.2480735778808594E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #18 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #19 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #20 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #21 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.2894392013549805E-002
>>> PETSC_Solve_SSG: 1.3124227523803711E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #22 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #23 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #24 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #25 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.2183427810668945E-002
>>> PETSC_Solve_SSG: 1.2410402297973633E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #26 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #27 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #28 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #29 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.2546300888061523E-002
>>> PETSC_Solve_SSG: 1.2778043746948242E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #30 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #31 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #32 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #33 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.2255668640136719E-002
>>> PETSC_Solve_SSG: 1.2504816055297852E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #34 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #35 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #36 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #37 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.3138055801391602E-002
>>> PETSC_Solve_SSG: 1.3362646102905273E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #38 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #39 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #40 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #41 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.2072801589965820E-002
>>> PETSC_Solve_SSG: 1.2319087982177734E-002
>>> -.-.-.-.-
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: No support for this operation for this object type
>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.9-13-g05d412b GIT
>>> Date: 2018-04-09 08:39:52 -0500
>>> [0]PETSC ERROR: ./gcmBEAM on a cuda-debug named node50 by valera Mon
>>> Apr 9 16:50:21 2018
>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda-debug --download-mpich
>>> --with-blaslapack-dir=/usr/lib64 COPTFLAGS=-O2 CXXOPTFLAGS=-O2
>>> FOPTFLAGS=-O2 --with-shared-libraries=1 --download-hypre --with-debugging=1
>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>> --download-cusp --download-cusp-commit=116b090
>>> [0]PETSC ERROR: #42 PCSetUp_SAVIENNACL() line 47 in
>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>> [0]PETSC ERROR: #43 PCSetUp() line 923 in /home/valera/petsc/src/ksp/pc/
>>> interface/precon.c
>>> [0]PETSC ERROR: #44 KSPSetUp() line 381 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> [0]PETSC ERROR: #45 KSPSolve() line 612 in /home/valera/petsc/src/ksp/ksp
>>> /interface/itfunc.c
>>> PETSC_Solve_Sol: 1.2498617172241211E-002
>>> PETSC_Solve_SSG: 1.2726306915283203E-002
>>> -.-.-.-.-
>>> -.-.-.-.-.-
>>> TMain_Loop: 1.1020996570587158
>>> -.-.-.-.-.-
>>> WARNING! There are options you set that were not used!
>>> WARNING! could be spelling mistake, etc!
>>>
>>>
>>> On Mon, Apr 9, 2018 at 4:45 PM, Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>>
>>>> On Mon, Apr 9, 2018 at 7:27 PM, Manuel Valera <mvalera-w at sdsu.edu>
>>>> wrote:
>>>>
>>>>> On Mon, Apr 9, 2018 at 4:09 PM, Matthew Knepley <knepley at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> On Mon, Apr 9, 2018 at 6:12 PM, Manuel Valera <mvalera-w at sdsu.edu>
>>>>>> wrote:
>>>>>>
>>>>>>> Hello guys,
>>>>>>>
>>>>>>> I've made advances in my CUDA acceleration project; as you remember,
>>>>>>> I have a CFD model in need of better execution times.
>>>>>>>
>>>>>>> So far I have been able to solve the pressure system on the GPU and
>>>>>>> the rest in serial, using PETSc only for this pressure solve; the
>>>>>>> library I got to work was ViennaCL. First question: do I still have to
>>>>>>> switch installations to use either CUDA library? This was a suggestion
>>>>>>> before, so in order to use CUSP instead of ViennaCL, for example, I
>>>>>>> currently would have to change installations. Is this still the case?
>>>>>>>
>>>>>>
>>>>>> I am not sure what that means exactly. However, you can build a PETSc
>>>>>> with CUDA and ViennaCL support. The type of Vec/Mat is selected at runtime.
>>>>>>
>>>>>
>>>>> Karl Rupp wrote in a previous email:
>>>>>
>>>>> "Right now only one of {native CUDA, CUSP, ViennaCL} can be
>>>>> activated at configure time. This will be fixed later this month."
>>>>>
>>>>> I was asking if this was already solved in 3.9.
>>>>>
>>>>
>>>> Karl knows better than I do. I thought that was fixed, but maybe not in
>>>> this release.
>>>>
>>>>
>>>>>
>>>>>
>>>>>>
>>>>>>
>>>>>>> Now I have started working on a fully parallelized version of the
>>>>>>> model, which uses the DMs and DMDAs to distribute the arrays. If I try
>>>>>>> the same flags as before, I get an error saying "Currently only handles
>>>>>>> ViennaCL matrices" when trying to solve for pressure. I take it this is
>>>>>>> a feature still not implemented? What options do I have to solve for
>>>>>>> pressure, or to assign a DMDA array update to be done specifically on a
>>>>>>> GPU device?
>>>>>>>
>>>>>>
>>>>>> If we can't see the error, we are just guessing. Please send the
>>>>>> entire error message.
>>>>>>
>>>>>
>>>>> Got it, I will paste the error at the end of this email
>>>>>
>>>>
>>>> It is asking for a ViennaCL matrix. You must tell the DM to create one:
>>>>
>>>> -dm_mat_type aijviennacl
>>>>
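
(A hedged Fortran sketch of fixing the types in code instead of on the command
line; it assumes the Fortran bindings for DMSetMatType()/DMSetVecType() are
available in this build, and uses daDummy as a placeholder:

  call DMSetMatType(daDummy, 'aijviennacl', ierr)  ! same effect as -dm_mat_type aijviennacl
  call DMSetVecType(daDummy, 'viennacl', ierr)     ! same effect as -dm_vec_type viennacl
  call DMSetUp(daDummy, ierr)                      ! set the types before DMCreateMatrix() is called

If -dm_mat_type is also passed on the command line and DMSetFromOptions() is
called, the command-line option should take precedence.)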
>>>>
>>>>>
>>>>>
>>>>>>
>>>>>> Note, we only do linear algebra on the GPU, so none of the
>>>>>> FormFunction/FormJacobian stuff for DMDA would be on the GPU.
>>>>>>
>>>>>
>>>>> Yes, we only use it for linear algebra, e.g. solving a linear system
>>>>> and updating an array with a problematic algorithm.
>>>>>
>>>>>
>>>>>
>>>>>>
>>>>>>
>>>>>>> I was thinking of using VecScatterCreateToZero for a regular
>>>>>>> vector,
>>>>>>>
>>>>>>
>>>>>> Why do you want a serial vector?
>>>>>>
>>>>>
>>>>> Because it looks like ViennaCL doesn't handle arrays created with the
>>>>> DMDAVec routines; it was just an idea.
>>>>>
>>>>
>>>> No, it just needs the right type.
>>>>
>>>>
>>>>>
>>>>>
>>>>>> but then I would have to create a vector and copy the DMDAVec into it,
>>>>>>>
>>>>>>
>>>>>> I do not understand what it means to copy the DM into the Vec.
>>>>>>
>>>>>
>>>>> I meant copying a DMDAVec into a Vec object; the first is created with
>>>>> a DMDA object for its mapping across processors.
>>>>>
>>>>
>>>> There is no such thing as a DMDAVec. Everything is just a Vec.
>>>>
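
For instance, a minimal sketch of that point (daDummy, ksp, x, b, and ierr are
placeholders; the vectors come straight from the DMDA and go to the solver
with no copy):

  call DMCreateGlobalVector(daDummy, x, ierr)
  call DMCreateGlobalVector(daDummy, b, ierr)
  ! ... fill b, e.g. via DMDAVecGetArrayF90()/DMDAVecRestoreArrayF90() ...
  call KSPSolve(ksp, b, x, ierr)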
>>>> Thanks,
>>>>
>>>> Matt
>>>>
>>>>
>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Matt
>>>>>>
>>>>>>
>>>>>>> Is this accomplished with DMDAVecGetArrayReadF90 and then just copying?
>>>>>>> Do you think this will generate too much overhead?
>>>>>>>
>>>>>>> Thanks so much for your input,
>>>>>>>
>>>>>>> Manuel
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> What most experimenters take for granted before they begin their
>>>>>> experiments is infinitely more interesting than any results to which their
>>>>>> experiments lead.
>>>>>> -- Norbert Wiener
>>>>>>
>>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>>> <http://www.caam.rice.edu/~mk51/>
>>>>>>
>>>>>
>>>>> The error happens when trying to use KSPSolve() on a vector made with
>>>>> the DMDAVec routines; the matrix is created without any DMDA routines.
>>>>>
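
(As a hedged sketch of one way to make a matrix that was not created through
the DM acceptable to PCSAVIENNACL; A and ierr are placeholders, and the type
has to be set before preallocation and assembly:

  call MatSetType(A, 'aijviennacl', ierr)  ! or call MatSetFromOptions(A, ierr) and pass -mat_type aijviennacl

Creating the matrix with DMCreateMatrix() after DMSetFromOptions(), as
discussed above, achieves the same thing.)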
>>>>> Error:
>>>>>
>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>> --------------------------------------------------------------
>>>>> [0]PETSC ERROR: No support for this operation for this object type
>>>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.8.4-2418-gd9c423b
>>>>> GIT Date: 2018-04-02 11:59:41 +0200
>>>>> [0]PETSC ERROR: ./gcmBEAM on a cuda named node50 by valera Mon Apr 9
>>>>> 16:24:26 2018
>>>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda --download-mpich
>>>>> --download-fblaslapack COPTFLAGS=-O2 CXXOPTFLAGS=-O2 FOPTFLAGS=-O2
>>>>> --with-shared-libraries=1 --download-hypre --with-debugging=no
>>>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>>>> --download-cusp
>>>>> [0]PETSC ERROR: #1 PCSetUp_SAVIENNACL() line 47 in
>>>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>>>> [0]PETSC ERROR: #2 PCSetUp() line 924 in /home/valera/petsc/src/ksp/pc/
>>>>> interface/precon.c
>>>>> [0]PETSC ERROR: #3 KSPSetUp() line 381 in
>>>>> /home/valera/petsc/src/ksp/ksp/interface/itfunc.c
>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>> --------------------------------------------------------------
>>>>> [0]PETSC ERROR: No support for this operation for this object type
>>>>> [0]PETSC ERROR: Currently only handles ViennaCL matrices
>>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.8.4-2418-gd9c423b
>>>>> GIT Date: 2018-04-02 11:59:41 +0200
>>>>> [0]PETSC ERROR: ./gcmBEAM on a cuda named node50 by valera Mon Apr 9
>>>>> 16:24:26 2018
>>>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda --download-mpich
>>>>> --download-fblaslapack COPTFLAGS=-O2 CXXOPTFLAGS=-O2 FOPTFLAGS=-O2
>>>>> --with-shared-libraries=1 --download-hypre --with-debugging=no
>>>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>>>> --download-cusp
>>>>> [0]PETSC ERROR: #4 PCSetUp_SAVIENNACL() line 47 in
>>>>> /home/valera/petsc/src/ksp/pc/impls/saviennaclcuda/saviennacl.cu
>>>>> [0]PETSC ERROR: #5 PCSetUp() line 924 in /home/valera/petsc/src/ksp/pc/
>>>>> interface/precon.c
>>>>> Finished setting up matrix objects
>>>>> Exiting PrepareNetCDF
>>>>> [0]PETSC ERROR: ------------------------------
>>>>> ------------------------------------------
>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
>>>>> probably memory access out of range
>>>>> [0]PETSC ERROR: Try option -start_in_debugger or
>>>>> -on_error_attach_debugger
>>>>> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
>>>>> OS X to find memory corruption errors
>>>>> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link,
>>>>> and run
>>>>> [0]PETSC ERROR: to get more information on the crash.
>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>> --------------------------------------------------------------
>>>>> [0]PETSC ERROR: Signal received
>>>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.8.4-2418-gd9c423b
>>>>> GIT Date: 2018-04-02 11:59:41 +0200
>>>>> [0]PETSC ERROR: ./gcmBEAM on a cuda named node50 by valera Mon Apr 9
>>>>> 16:24:26 2018
>>>>> [0]PETSC ERROR: Configure options PETSC_ARCH=cuda --download-mpich
>>>>> --download-fblaslapack COPTFLAGS=-O2 CXXOPTFLAGS=-O2 FOPTFLAGS=-O2
>>>>> --with-shared-libraries=1 --download-hypre --with-debugging=no
>>>>> --with-cuda=1 --CUDAFLAGS=-arch=sm_60 --download-hypre --download-viennacl
>>>>> --download-cusp
>>>>> [0]PETSC ERROR: #6 User provided function() line 0 in unknown file
>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>>>>> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59
>>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>> https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/~mk51/>
>>>>
>>>
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/~mk51/>
>>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/~mk51/>