<div dir="ltr">Gaurish,<div><br></div><div>I would suggest you spend some time reading the PETSc user's manual available here: <a href="http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manual.pdf">http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manual.pdf</a></div>
These two lines are incorrect:

    ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&x);CHKERRQ(ierr);
    ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&y);CHKERRQ(ierr);

The problem, as indicated by Shri, is that you are passing the global size m as the *local* size and letting PETSc determine the global size from it. In a single-process run the two are the same, but in any other run global_size = sum(local_size[]) over all processes, so each process contributes another m entries. You can verify this by running with 3 processes: you will see a vector of length 15 instead of the desired 5.
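A minimal sketch of one possible fix (against the PETSc 3.1 API, assuming U is the 5x5 matrix you just loaded with MatLoad): either pass the global size and let PETSc decide the local sizes, or ask the matrix for vectors whose parallel layout matches its own:

    /* Option 1: m is the global size, PETSc chooses the local sizes */
    ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,m,&x);CHKERRQ(ierr);
    ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,m,&y);CHKERRQ(ierr);

    /* Option 2: let the matrix create compatible vectors; y gets the
       column layout (solution), x the row layout (right-hand side).
       For a square matrix like this one either ordering works. */
    ierr = MatGetVecs(U,&y,&x);CHKERRQ(ierr);

Either way the vectors end up with global length 5 on any number of processes.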
<div class="gmail_quote"><br></div><div class="gmail_quote">A</div><div class="gmail_quote"><br></div><div class="gmail_quote">On Tue, Jan 18, 2011 at 10:23 AM, Gaurish Telang <span dir="ltr"><<a href="mailto:gaurish108@gmail.com">gaurish108@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">Hmm,,,,,<br><br>Then how come the code worked perfectly on 1 processor? <br><br>Any way here is my code. <br><br>I cant really understand where I went wrong. Possibly with VecCreate(). I think when more than one processor is involved I am not using it correctly<br>
static char help[] = "'*x.\n\
   -f <input_file> : file to load \n\n";

/*
  Include "petscmat.h" so that we can use matrices.  It automatically includes:
     petscsys.h    - base PETSc routines
     petscvec.h    - vectors
     petscmat.h    - matrices
     petscis.h     - index sets
     petscviewer.h - viewers
*/
#include "petscmat.h"
#include "petscvec.h"
#include "petscksp.h"   /* For the iterative solvers */

/* extern PetscErrorCode LowRankUpdate(Mat,Mat,Vec,Vec,Vec,Vec,PetscInt); */
#include <stdlib.h>
#include <stdio.h>

#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **args)
{
  Mat                U;                        /* matrix */
  PetscViewer        fd;                       /* viewer */
  char               file[PETSC_MAX_PATH_LEN]; /* input file name */
  PetscErrorCode     ierr;
  PetscTruth         flg;
  Vec                x,y;
  PetscInt           i,n,m;
  PetscScalar        *xx;
  KSP                ksp;
  PC                 pc;
  PetscMPIInt        size;
  PetscInt           num_iters;
  PetscReal          rnorm;
  KSPConvergedReason reason;

  PetscInitialize(&argc,&args,(char *)0,help);

  /* Determine the file from which we read the matrix */
  ierr = PetscOptionsGetString(PETSC_NULL,"-f",file,PETSC_MAX_PATH_LEN-1,&flg);CHKERRQ(ierr);
  if (!flg) SETERRQ(1,"Must indicate binary file with the -f option");

  /* Open the binary file; FILE_MODE_READ indicates reading from this file */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr);
  ierr = MatLoad(fd,MATMPIDENSE,&U);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(fd);CHKERRQ(ierr);

  ierr = MatGetSize(U,&m,&n);CHKERRQ(ierr);

  ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&x);CHKERRQ(ierr);
  ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&y);CHKERRQ(ierr);

  ierr = MatView(U,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  /* ierr = MatView(U,PETSC_VIEWER_DRAW_WORLD);CHKERRQ(ierr); */

  ierr = VecSet(x,1);CHKERRQ(ierr);
  ierr = VecView(x,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPCGNE);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,U,U,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

  ierr = KSPSolve(ksp,x,y);CHKERRQ(ierr);

  ierr = KSPGetIterationNumber(ksp,&num_iters);CHKERRQ(ierr);
  ierr = KSPGetResidualNorm(ksp,&rnorm);CHKERRQ(ierr);
  ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);

  printf("KSPGetIterationNumber %i \n",num_iters);
  printf("KSPGetResidualNorm %f \n",rnorm);
  /* printf("KSPConvergedReason %f \n",reason); */

  ierr = VecView(y,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  /*
    Free work space. All PETSc objects should be destroyed when they
    are no longer needed.
  */
  ierr = MatDestroy(U);CHKERRQ(ierr);
  ierr = VecDestroy(x);CHKERRQ(ierr);
  ierr = VecDestroy(y);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}

On Mon, Jan 17, 2011 at 11:56 PM, Gaurish Telang <gaurish108@gmail.com> wrote:
<blockquote class="gmail_quote" style="margin:0pt 0pt 0pt 0.8ex;border-left:1px solid rgb(204, 204, 204);padding-left:1ex">Hi,<br><br>I am trying to solve a linear system where Ay=x and A is a 5x5 matrix stored in a binary file called 'square' and x=[1;1;1;1;1]<br>
<br>I am trying to display the matrix A, and the vectors x(rhs) and y(soln) in that order to standard output. <br>
<br>On running my code on a single processor the answer returned is accurate. But on using 2 processors I get weird error messages PART of which says <br>
<br>[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
[1]PETSC ERROR: Nonconforming object sizes!<br>
[1]PETSC ERROR: Mat mat,Vec x: global dim 5 10!<br><br>Also somehow the vector x gets displayed TWICE when run on two processes. A however gets displayed ONCE (as it should!!)<br><br><br>I am attaching the output I get when I run on 1 process and the when I run the same code on 2 processes.<br>
Please let me know where I could be going wrong.

(1) This is what I get on running on ONE process (here the system is solved successfully):

gaurish108@gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 1 ./ex4 -f square
1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 7.5774013057833345e-01 7.0604608801960878e-01
9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 7.4313246812491618e-01 3.1832846377420676e-02
9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 3.9222701953416816e-01 2.7692298496088996e-01
4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 6.5547789017755664e-01 4.6171390631153941e-02
8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 1.7118668781156177e-01 9.7131781235847536e-02
Process [0]
1
1
1
1
1
KSPGetIterationNumber 5
KSPGetResidualNorm 0.000000
Process [0]
-0.810214
2.33178
-1.31131
1.09323
1.17322
gaurish108@gaurish108-laptop:~/Desktop$

%--------------------------------------------------------------------

(2) This is what I get on running on TWO processes:
gaurish108@gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex4 -f square
1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 7.5774013057833345e-01 7.0604608801960878e-01
9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 7.4313246812491618e-01 3.1832846377420676e-02
9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 3.9222701953416816e-01 2.7692298496088996e-01
4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 6.5547789017755664e-01 4.6171390631153941e-02
8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 1.7118668781156177e-01 9.7131781235847536e-02
Process [0]
1
1
1
1
1
Process [1]
1
1
1
1
1
[0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Nonconforming object sizes!
[1]PETSC ERROR: Mat mat,Vec x: global dim 5 10!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27 11:51:54 CDT 2010
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Nonconforming object sizes!
[0]PETSC ERROR: Mat mat,Vec x: global dim 5 10!
[0]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: ./ex4 on a linux-gnu named gaurish108-laptop by gaurish108 Mon Jan 17 23:49:18 2011
[1]PETSC ERROR: Libraries linked from /home/gaurish108/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/lib
[1]PETSC ERROR: Configure run at Sat Nov 13 20:34:38 2010
[1]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1 --download-superlu_dist=1 --download-parmetis=1 --with-superlu_dist=1 --with-parmetis=1
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: MatMultTranspose() line 1947 in src/mat/interface/matrix.c
[1]PETSC ERROR: KSPSolve_CGNE() line 103 in src/ksp/ksp/impls/cg/cgne/cgne.c
[1]PETSC ERROR: KSPSolve() line 396 in src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: main() line 78 in src/mat/examples/tutorials/ex4.c
application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1[cli_1]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1
[0]0:Return code = 0, signaled with Interrupt
[0]1:Return code = 60
gaurish108@gaurish108-laptop:~/Desktop$