[petsc-users] petsc4py & GPU

Damian Kaliszan damian at man.poznan.pl
Fri Apr 21 09:27:57 CDT 2017


Hi Francesco, Matthew, Lisandro,

Thanks a lot, it works :)

However, the next problem I'm facing is:
- I'm trying to solve a problem (Ax=b), with A of size 3375x3375, using PBS
on 10 nodes; each node has 2 GPU cards with 2 GB of memory each.
- 'qstat' shows these nodes are allocated and the job is running
- I use ksp/gmres
- the error I get is as follows:
=>> PBS: job killed: mem job total 5994876 kb exceeded limit 2048000 kb
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] VecNorm_MPICUDA line 34 /home/users/damiank/petsc_bitbucket/src/vec/vec/impls/mpi/mpicuda/mpicuda.cu
[0]PETSC ERROR: [0] VecNorm line 217 /home/users/damiank/petsc_bitbucket/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: [0] VecNormalize line 317 /home/users/damiank/petsc_bitbucket/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: [0] KSPInitialResidual line 42 /home/users/damiank/petsc_bitbucket/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h
[... the KSP_PCApplyBAorAB frame above appears 18 times in the original output ...]
[0]PETSC ERROR: [0] KSPGMRESCycle line 122 /home/users/damiank/petsc_bitbucket/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: [0] KSPInitialResidual line 42 /home/users/damiank/petsc_bitbucket/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h
[... this frame appears 4 times in the original output ...]

- the 'A' matrix is loaded in the following way (and the 'b' vector likewise; see the sketch after this list):
# here 'A' initially holds the path to the PETSc binary file with the matrix
viewer = PETSc.Viewer().createBinary(A, 'r')
A = PETSc.Mat().create(comm=MPI.COMM_WORLD)   # matrix on the world communicator
A.setType(PETSc.Mat.Type.MPIAIJCUSPARSE)      # GPU (cuSPARSE) matrix type
A.load(viewer)
- Do you know what is going wrong?
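
For completeness, the right-hand side is loaded through its own binary viewer in
the same way, and the solve itself is set up roughly like this (just a sketch of
my script; names like 'ksp' and 'x' are illustrative):

ksp = PETSc.KSP().create(comm=MPI.COMM_WORLD)
ksp.setOperators(A)                  # use the loaded matrix as the operator
ksp.setType(PETSc.KSP.Type.GMRES)    # ksp/gmres, as mentioned above
ksp.setFromOptions()                 # pick up any command line options
x = b.duplicate()                    # solution vector with the same layout as b
ksp.solve(b, x)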

Best,
Damian
 
In a message dated 20 April 2017 (19:15:33), the following was written:
 
> Hi Damian,

> You can use the "parse_known_args" method of the ArgumentParser
> class; it will create a Namespace with the args you defined and
> return the unknown ones as a list of strings you can pass to petsc4py.init.
> See:
> https://docs.python.org/3.6/library/argparse.html
> section 16.4.5.7.
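
> A minimal sketch of that pattern (the "--size" option below is just an
> illustrative placeholder for your own arguments):

> import sys, argparse
> import petsc4py
> parser = argparse.ArgumentParser()
> parser.add_argument("--size", type=int, default=3375)   # your own options
> args, unknown = parser.parse_known_args()
> # hand the leftover (unrecognized) options to PETSc before PETSc is imported
> petsc4py.init([sys.argv[0]] + unknown)
> from petsc4py import PETSc

> Anything argparse does not recognize (e.g. -ksp_view) ends up in "unknown"
> and reaches the PETSc options database this way.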

> I used it successfully in the past. Hope this helps.

> Best,
> --
> Francesco Caimmi

> Laboratorio di Ingegneria dei Polimeri
> http://www.chem.polimi.it/polyenglab/

> Politecnico di Milano - Dipartimento di Chimica,
> Materiali e Ingegneria Chimica “Giulio Natta”

> P.zza Leonardo da Vinci, 32
> I-20133 Milano
> Tel. +39.02.2399.4711
> Fax +39.02.7063.8173

> francesco.caimmi at polimi.it
> Skype: fmglcaimmi

> ________________________________________
> From: petsc-users-bounces at mcs.anl.gov
> <petsc-users-bounces at mcs.anl.gov> on behalf of Damian Kaliszan <damian at man.poznan.pl>
> Sent: Thursday, April 20, 2017 6:26 PM
> To: Lisandro Dalcin
> Cc: PETSc
> Subject: Re: [petsc-users] petsc4py & GPU

> Hi,
> This might be the problem, because I'm using the ArgumentParser class
> to handle complex command line arguments. In this case, is there any
> way to make the two cooperate, or is the only solution to pass
> everything through the argument to petsc4py's init method?
> Best,
> Damian
> On 20 April 2017 at 17:37, Lisandro Dalcin
> <dalcinl at gmail.com> wrote:

> On 20 April 2017 at 17:09, Damian Kaliszan
> <damian at man.poznan.pl> wrote:
> Thank you for the reply :) Sorry for a maybe stupid question about setting petsc(4py) options.
> Should the following calls (made somewhere before creating the matrix & vectors) work:

> PETSc.Options().setValue("ksp_view", "")
> PETSc.Options().setValue("log_view", "")

> Unfortunately, no. There are a few options (-log_view ?) that you
> should set before calling PetscInitialize() (which happens
> automatically at import time), otherwise things do not work as
> expected. To pass things from the command line and set them before
> PetscInitialize(), the usual idiom is:

> import sys, petsc4py
> petsc4py.init(sys.argv)
> from petsc4py import PETSc



> --
> Lisandro Dalcin
> ============
> Research Scientist
> Computer, Electrical and Mathematical Sciences & Engineering (CEMSE)
> Extreme Computing Research Center (ECRC)
> King Abdullah University of Science and Technology (KAUST)
> http://ecrc.kaust.edu.sa/

> 4700 King Abdullah University of Science and Technology
> al-Khawarizmi Bldg (Bldg 1), Office # 0109
> Thuwal 23955-6900, Kingdom of Saudi Arabia
> http://www.kaust.edu.sa

> Office Phone: +966 12 808-0459


 
-------------------------------------------------------
Damian Kaliszan
 
Poznan Supercomputing and Networking Center
HPC and Data Centres Technologies
ul. Jana Pawła II 10
61-139 Poznan
POLAND
 
phone (+48 61) 858 5109
e-mail damian at man.poznan.pl
 www - http://www.man.poznan.pl/
 ------------------------------------------------------- 


