<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Fri, Apr 21, 2017 at 9:27 AM, Damian Kaliszan <span dir="ltr"><<a href="mailto:damian@man.poznan.pl" target="_blank">damian@man.poznan.pl</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi Francesco, Matthew, Lisandro,<br>
<br>
Thank you a lot, it works:)<br>
<br>
However, the next problem I'm facing is:<br>
- I'm trying to solve a problem (Ax=b) where A is 3375x3375, using PBS on 10 nodes;<br>
each node has 2 GPU cards with 2 GB each.<br>
- 'qstat' shows these nodes are allocated and the job is running.<br>
- I use ksp/gmres.<br>
- the error I get is as follows:<br>
=>> PBS: job killed: mem job total 5994876 kb exceeded limit 2048000 kb<br></blockquote><div><br></div><div>You exceeded a memory limit. This should be mailed to your cluster admin who can tell</div><div>you how to schedule it appropriately.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
[0]PETSC ERROR: ------------------------------------------------------------------------<br>
[0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end<br>
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>
[0]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>
[0]PETSC ERROR: or try <a href="http://valgrind.org" rel="noreferrer" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br>
[0]PETSC ERROR: likely location of problem given in stack below<br>
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------<br>
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>
[0]PETSC ERROR: INSTEAD the line number of the start of the function<br>
[0]PETSC ERROR: is given.<br>
[0]PETSC ERROR: [0] VecNorm_MPICUDA line 34 /home/users/damiank/petsc_bitbucket/src/vec/vec/impls/mpi/mpicuda/mpicuda.cu<br>
[0]PETSC ERROR: [0] VecNorm line 217 /home/users/damiank/petsc_bitbucket/src/vec/vec/interface/rvector.c<br>
[0]PETSC ERROR: [0] VecNormalize line 317 /home/users/damiank/petsc_bitbucket/src/vec/vec/interface/rvector.c<br>
[0]PETSC ERROR: [0] KSPInitialResidual line 42 /home/users/damiank/petsc_bitbucket/src/ksp/ksp/interface/itres.c<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSPGMRESCycle line 122 /home/users/damiank/petsc_bitbucket/src/ksp/ksp/impls/gmres/gmres.c<br>
[0]PETSC ERROR: [0] KSPInitialResidual line 42 /home/users/damiank/petsc_bitbucket/src/ksp/ksp/interface/itres.c<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
[0]PETSC ERROR: [0] KSP_PCApplyBAorAB line 277 /home/users/damiank/petsc_bitbucket/include/petsc/private/kspimpl.h<br>
<br>
- the 'A' matrix is loaded in the following way (and so is the 'b' vector):<br>
viewer = petsc4py.PETSc.Viewer().createBinary(A, 'r')<br>
A = PETSc.Mat().create(comm=MPI.COMM_WORLD)<br>
A.setType(PETSc.Mat.Type.MPIAIJCUSPARSE)<br>
A.load(viewer)<br>
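<br>
(For comparison, a minimal sketch of loading the 'b' vector the same way as a GPU vector; the file name 'b.dat' and the plain type string 'mpicuda' are assumptions here, not taken from the original message:)<br>
viewer = petsc4py.PETSc.Viewer().createBinary('b.dat', 'r')  # 'b.dat' is a placeholder file name<br>
b = PETSc.Vec().create(comm=MPI.COMM_WORLD)  # same communicator as the matrix<br>
b.setType('mpicuda')  # GPU vector type, matching the MPIAIJCUSPARSE matrix<br>
b.load(viewer)<br>
<br>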
- Do you know what is going wrong?<br>
<br>
Best,<br>
Damian<br>
<br>
In a message dated 20 April 2017 (19:15:33), the following was written:<br>
<br>
> Hi Damian,<br>
<br>
> You can use the "parse_known_args" method of the ArgumentParser<br>
> class; it will create a Namespace with the args you defined and<br>
> return the unknown ones as a list of strings you can pass to petsc4py.init.<br>
> See:<br>
> <a href="https://docs.python.org/3.6/library/argparse.html" rel="noreferrer" target="_blank">https://docs.python.org/3.6/<wbr>library/argparse.html</a><br>
> section 16.4.5.7.<br>
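<br>
(A minimal sketch of that pattern, for illustration; the '--nsteps' application option is made up:)<br>
import sys<br>
import argparse<br>
import petsc4py<br>
<br>
parser = argparse.ArgumentParser()<br>
parser.add_argument('--nsteps', type=int, default=10)  # application-specific option<br>
args, petsc_args = parser.parse_known_args()  # unknown options come back as a list of strings<br>
<br>
petsc4py.init([sys.argv[0]] + petsc_args)  # hand the leftovers to PETSc before importing PETSc<br>
from petsc4py import PETSc<br>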
<br>
> I used it successfully in the past. Hope this helps.<br>
<br>
> Best,<br>
> --<br>
> Francesco Caimmi<br>
<br>
> Laboratorio di Ingegneria dei Polimeri<br>
> <a href="http://www.chem.polimi.it/polyenglab/" rel="noreferrer" target="_blank">http://www.chem.polimi.it/<wbr>polyenglab/</a><br>
<br>
> Politecnico di Milano - Dipartimento di Chimica,<br>
> Materiali e Ingegneria Chimica “Giulio Natta”<br>
<br>
> P.zza Leonardo da Vinci, 32<br>
> I-20133 Milano<br>
> Tel. <a href="tel:%2B39.02.2399.4711" value="+390223994711">+39.02.2399.4711</a><br>
> Fax <a href="tel:%2B39.02.7063.8173" value="+390270638173">+39.02.7063.8173</a><br>
<br>
> <a href="mailto:francesco.caimmi@polimi.it">francesco.caimmi@polimi.it</a><br>
> Skype: fmglcaimmi<br>
<br>
> ________________________________________<br>
> From: <a href="mailto:petsc-users-bounces@mcs.anl.gov">petsc-users-bounces@mcs.anl.gov</a><br>
> <<a href="mailto:petsc-users-bounces@mcs.anl.gov">petsc-users-bounces@mcs.anl.gov</a>> on behalf of Damian Kaliszan <<a href="mailto:damian@man.poznan.pl">damian@man.poznan.pl</a>><br>
> Sent: Thursday, April 20, 2017 6:26 PM<br>
> To: Lisandro Dalcin<br>
> Cc: PETSc<br>
> Subject: Re: [petsc-users] petsc4py & GPU<br>
<br>
> Hi,<br>
> There might be the problem because I'm using ArgumentParser class<br>
> to catch complex command line arguments. In this case is there any<br>
> chance to make both to cooperate or the only solution is to pass<br>
> everything through argument to init method of petsc4py?<br>
> Best,<br>
> Damian<br>
> On 20 April 2017, at 17:37, Lisandro Dalcin<br>
> <<a href="mailto:dalcinl@gmail.com">dalcinl@gmail.com</a>> wrote:<br>
<br>
> On 20 April 2017 at 17:09, Damian Kaliszan<br>
> <<a href="mailto:damian@man.poznan.pl">damian@man.poznan.pl</a><mailto:<a href="mailto:damian@man.poznan.pl">d<wbr>amian@man.poznan.pl</a>>> wrote:<br>
> Thank you for the reply :) Sorry if this is a stupid question about setting petsc(4py) options.<br>
> Should the following calls (made somewhere before creating the matrix & vectors) be enough:<br>
<br>
> PETSc.Options().setValue("ksp_view", "")<br>
> PETSc.Options().setValue("log_view", "")<br>
<br>
> Unfortunately, no. There are a few options (-log_view ?) that you<br>
> should set before calling PetscInitialize() (which happens<br>
> automatically at import time), otherwise things do not work as<br>
> expected. To pass things from the command line and set them before<br>
> PetscInitialize() the usual idiom is:<br>
<br>
> import sys, petsc4py<br>
> petsc4py.init(sys.argv)<br>
> from petsc4py import PETSc<br>
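<br>
(For illustration, a minimal sketch of that timing distinction, using -log_view as the init-time option mentioned above; the option values shown are assumptions, not from the original message:)<br>
import sys, petsc4py<br>
petsc4py.init(sys.argv + ['-log_view'])  # init-time option: must be in place before PetscInitialize()<br>
from petsc4py import PETSc<br>
<br>
opts = PETSc.Options()<br>
opts.setValue('ksp_view', '')  # fine here: read later, when the solver processes its options<br>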
<br>
<br>
<br>
> --<br>
> Lisandro Dalcin<br>
> ============<br>
> Research Scientist<br>
> Computer, Electrical and Mathematical Sciences & Engineering (CEMSE)<br>
> Extreme Computing Research Center (ECRC)<br>
> King Abdullah University of Science and Technology (KAUST)<br>
> <a href="http://ecrc.kaust.edu.sa/" rel="noreferrer" target="_blank">http://ecrc.kaust.edu.sa/</a><br>
<br>
> 4700 King Abdullah University of Science and Technology<br>
> al-Khawarizmi Bldg (Bldg 1), Office # 0109<br>
> Thuwal 23955-6900, Kingdom of Saudi Arabia<br>
> <a href="http://www.kaust.edu.sa" rel="noreferrer" target="_blank">http://www.kaust.edu.sa</a><br>
<br>
> Office Phone: <a href="tel:%2B966%2012%20808-0459" value="+966128080459">+966 12 808-0459</a><br>
<br>
<br>
<br>
-------------------------------------------------------<br>
Damian Kaliszan<br>
<br>
Poznan Supercomputing and Networking Center<br>
HPC and Data Centres Technologies<br>
ul. Jana Pawła II 10<br>
61-139 Poznan<br>
POLAND<br>
<br>
phone <a href="tel:%28%2B48%2061%29%20858%205109" value="+48618585109">(+48 61) 858 5109</a><br>
e-mail <a href="mailto:damian@man.poznan.pl">damian@man.poznan.pl</a><br>
www - <a href="http://www.man.poznan.pl/" rel="noreferrer" target="_blank">http://www.man.poznan.pl/</a><br>
-------------------------------------------------------<br>
<br>
</blockquote></div><br><br clear="all"><div><br></div>-- <br><div class="gmail_signature" data-smartmail="gmail_signature">What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div>
</div></div>