[petsc-users] Viewing DM Vector stored on Multiple GPUs

Karl Rupp rupp at iue.tuwien.ac.at
Wed Jun 25 04:56:49 CDT 2014


Hi Ashwin,

this stems from a problem with scattering GPU data across the 
network (note VecScatterBegin() in the trace), which we are 
currently working on. Here is the associated pull request:
https://bitbucket.org/petsc/petsc/pull-request/158/pcbjacobi-with-cusp-and-cusparse-solver
It may still take some time to complete, so please remain patient.
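Until the fix is merged, one possible interim workaround is to view a 
plain host-side copy of the vector, which avoids the GPU scatter path 
entirely. A minimal sketch (untested; note that the plain vector has no 
DM attached, so the output appears in PETSc parallel ordering rather 
than the DMDA natural ordering):

    PetscInt    n, N;
    PetscScalar *a;
    Vec         Vhost;

    VecGetLocalSize(V, &n);
    VecGetSize(V, &N);
    VecGetArray(V, &a);   /* for CUSP vectors this syncs the data to the host */
    VecCreateMPIWithArray(PETSC_COMM_WORLD, 1, n, N, a, &Vhost);
    VecView(Vhost, PETSC_VIEWER_STDOUT_WORLD);   /* plain MPI view, no GPU scatter */
    VecDestroy(&Vhost);
    VecRestoreArray(V, &a);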

Best regards,
Karli


On 06/25/2014 04:23 AM, Ashwin Srinath wrote:
> Here's the error message in its entirety:
>
> Vec Object: 2 MPI processes
>    type: mpicusp
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Null argument, when expecting valid pointer
> [0]PETSC ERROR: Trying to copy from a null pointer
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.4-4683-ga6c8f22
> GIT Date: 2014-06-24 11:28:06 -0500
> [0]PETSC ERROR: /newscratch/atrikut/issue/main on a
> arch-linux2-cxx-debug named node1774 by atrikut Tue Jun 24 21:59:13 2014
> [0]PETSC ERROR: Configure options --with-cuda=1 --with-cusp=1
> --with-cusp-dir=/home/atrikut/local/cusplibrary --with-thrust=1
> --with-precision=double --with-cuda-arch=sm_21 --with-clanguage=cxx
> --download-txpetscgpu=1 --with-shared-libraries=1
> --with-cuda-dir=/opt/cuda-toolkit/5.5.22 --with-mpi-dir=/opt/mpich2/1.4
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Null argument, when expecting valid pointer
> [1]PETSC ERROR: Trying to copy from a null pointer
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: Petsc Development GIT revision: v3.4.4-4683-ga6c8f22
> GIT Date: 2014-06-24 11:28:06 -0500
> [1]PETSC ERROR: /newscratch/atrikut/issue/main on a
> arch-linux2-cxx-debug named node1774 by atrikut Tue Jun 24 21:59:13 2014
> [1]PETSC ERROR: Configure options --with-cuda=1 --with-cusp=1
> --with-cusp-dir=/home/atrikut/local/cusplibrary --with-thrust=1
> --with-precision=double --with-cuda-arch=sm_21 --with-clanguage=cxx
> --download-txpetscgpu=1 --with-shared-libraries=1
> --with-cuda-dir=/opt/cuda-toolkit/5.5.22 --with-mpi-dir=/opt/mpich2/1.4
> [0]PETSC ERROR: #1 PetscMemcpy() line 1892 in
> /home/atrikut/local/petsc-dev/include/petscsys.h
> [0]PETSC ERROR: #2 VecScatterBegin_1() line 124 in
> /home/atrikut/local/petsc-dev/include/../src/vec/vec/utils/vpscat.h
> [0]PETSC ERROR: #3 VecScatterBegin() line 1724 in
> /home/atrikut/local/petsc-dev/src/vec/vec/utils/vscat.c
> [0]PETSC ERROR: #4 DMDAGlobalToNaturalBegin() line 171 in
> /home/atrikut/local/petsc-dev/src/dm/impls/da/dagtol.c
> [0]PETSC ERROR: #5 VecView_MPI_DA() line 721 in
> /home/atrikut/local/petsc-dev/src/dm/impls/da/gr2.c
> [0]PETSC ERROR: #6 VecView() line 601 in
> /home/atrikut/local/petsc-dev/src/vec/vec/interface/vector.c
> [1]PETSC ERROR: #1 PetscMemcpy() line 1892 in
> /home/atrikut/local/petsc-dev/include/petscsys.h
> [1]PETSC ERROR: #2 VecScatterBegin_1() line 124 in
> /home/atrikut/local/petsc-dev/include/../src/vec/vec/utils/vpscat.h
> [1]PETSC ERROR: #3 VecScatterBegin() line 1724 in
> /home/atrikut/local/petsc-dev/src/vec/vec/utils/vscat.c
> [1]PETSC ERROR: #4 DMDAGlobalToNaturalBegin() line 171 in
> /home/atrikut/local/petsc-dev/src/dm/impls/da/dagtol.c
> [1]PETSC ERROR: #5 VecView_MPI_DA() line 721 in
> /home/atrikut/local/petsc-dev/src/dm/impls/da/gr2.c
> [1]PETSC ERROR: #6 VecView() line 601 in
> /home/atrikut/local/petsc-dev/src/vec/vec/interface/vector.c
> WARNING! There are options you set that were not used!
> WARNING! could be spelling mistake, etc!
> Option left: name:-vec_type value: cusp
>
>
>
> On Tue, Jun 24, 2014 at 10:15 PM, Ashwin Srinath <ashwinsrnth at gmail.com> wrote:
>
>     Hello, petsc-users
>
>     I'm having trouble /viewing/ an mpicusp vector. Here's the simplest
>     case that reproduces the problem:
>
>     #include <petscdmda.h>
>
>     int main(int argc, char** argv) {
>
>              PetscInitialize(&argc, &argv, NULL, NULL);
>
>              DM da;
>              Vec V;
>
>              /* 5x5 2D grid, 1 DOF per node, stencil width 1 */
>              DMDACreate2d(   PETSC_COMM_WORLD,
>                              DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
>                              DMDA_STENCIL_BOX,
>                              5, 5,
>                              PETSC_DECIDE, PETSC_DECIDE,
>                              1,
>                              1,
>                              NULL, NULL,
>                              &da);
>
>              DMCreateGlobalVector(da, &V);
>
>              VecSet(V, 1);
>              VecView(V, PETSC_VIEWER_STDOUT_WORLD);  /* this is the failing call */
>
>              VecDestroy(&V);
>              DMDestroy(&da);
>              PetscFinalize();
>              return 0;
>     }
>
>     I get the error:
>     [1]PETSC ERROR: Null argument, when expecting valid pointer
>     [0]PETSC ERROR: Trying to copy from a null pointer
>
>     I executed with the following command:
>     mpiexec -n 2 ./main -dm_vec_type cusp -vec_type cusp
>     Each of the two GPUs is attached to a different process.
>
>     This program works fine for plain mpi vectors, i.e., -dm_vec_type mpi
>     and -vec_type mpi. Also, I don't get an error unless I try to /view/
>     the vector. Can someone please point out what I'm doing wrong?
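>
>     As a sanity check (just an illustrative snippet), VecGetType() confirms
>     which type the DM-created vector actually has; the "Option left:
>     -vec_type" warning in the output suggests that DM-created vectors take
>     their type from -dm_vec_type only:
>
>              const char *vtype;
>              VecGetType(V, &vtype);
>              PetscPrintf(PETSC_COMM_WORLD, "vector type: %s\n", vtype);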
>
>     Thanks for your time,
>     Ashwin Srinath


