[petsc-users] Trouble with solving 5x5 system... Ay=x on 2 or more processors
Gaurish Telang
gaurish108 at gmail.com
Tue Jan 18 01:23:06 CST 2011
Hmm...
Then how come the code worked perfectly on 1 processor?
Anyway, my code is below.
I can't really understand where I went wrong. Possibly with VecCreate(): I
think I am not using it correctly when more than one processor is involved.
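Before the full listing, here is a minimal sketch of the part I suspect
(assuming I am reading the VecCreateMPI() man page correctly: the second
argument is the LOCAL size and the third is the GLOBAL size; the comments
are mine):

#include "petscvec.h"
int main(int argc,char **args)
{
  Vec            x;
  PetscErrorCode ierr;
  PetscInt       m = 5;  /* same m as the 5x5 system below */
  PetscInitialize(&argc,&args,(char *)0,0);
  /* What my code does: local size m on EVERY rank, so the global
     length becomes m * nprocs, i.e. 10 when run on two processes. */
  ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&x);CHKERRQ(ierr);
  ierr = VecDestroy(x);CHKERRQ(ierr);
  /* What I think I want: GLOBAL size m, PETSc decides the local split. */
  ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,m,&x);CHKERRQ(ierr);
  ierr = VecDestroy(x);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}

And here is the full code: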
static char help[] = "Loads a matrix from a binary file and solves A*y = x.\n\
  -f <input_file> : file to load \n\n";
/*
  Include "petscmat.h" so that we can use matrices. This automatically
  includes:
     petscsys.h    - base PETSc routines
     petscvec.h    - vectors
     petscmat.h    - matrices
     petscis.h     - index sets
     petscviewer.h - viewers
*/
#include "petscmat.h"
#include "petscvec.h"
#include "petscksp.h" /* For the iterative solvers */
//extern PetscErrorCode LowRankUpdate(Mat,Mat,Vec,Vec,Vec,Vec,PetscInt);
//#include <iostream>
#include <stdlib.h>
#include <stdio.h>
#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **args)
{
  Mat                U;                        /* matrix */
  PetscViewer        fd;                       /* viewer */
  char               file[PETSC_MAX_PATH_LEN]; /* input file name */
  PetscErrorCode     ierr;
  PetscTruth         flg;
  Vec                x,y;
  PetscInt           i,n,m;
  PetscScalar        *xx;
  KSP                ksp;
  PC                 pc;
  PetscMPIInt        size;
  PetscInt           num_iters;
  PetscReal          rnorm;
  KSPConvergedReason reason;

  PetscInitialize(&argc,&args,(char *)0,help);
  /*
     Determine the file from which we read the matrix.
  */
  ierr = PetscOptionsGetString(PETSC_NULL,"-f",file,PETSC_MAX_PATH_LEN-1,&flg);CHKERRQ(ierr);
  if (!flg) SETERRQ(1,"Must indicate binary file with the -f option");

  /*
     Open the binary file. Note that we use FILE_MODE_READ to indicate
     reading from this file.
  */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr);
  ierr = MatLoad(fd,MATMPIDENSE,&U);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(fd);CHKERRQ(ierr);

  ierr = MatGetSize(U,&m,&n);CHKERRQ(ierr);
  /* Is this the right way to set the parallel layout of x and y? */
  ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&x);CHKERRQ(ierr);
  ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&y);CHKERRQ(ierr);

  ierr = MatView(U,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  //ierr = MatView(U,PETSC_VIEWER_DRAW_WORLD);CHKERRQ(ierr);
  /* ------------------------------------------------------------------- */
  ierr = VecSet(x,1);CHKERRQ(ierr);
  ierr = VecView(x,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  /* ------------------------------------------------------------------- */
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPCGNE);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,U,U,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,x,y);CHKERRQ(ierr);

  ierr = KSPGetIterationNumber(ksp,&num_iters);CHKERRQ(ierr);
  ierr = KSPGetResidualNorm(ksp,&rnorm);CHKERRQ(ierr);
  ierr = KSPGetConvergedReason(ksp,&reason);CHKERRQ(ierr);
  printf("KSPGetIterationNumber %d\n",num_iters);
  printf("KSPGetResidualNorm %f\n",rnorm);
  //printf("KSPConvergedReason %d\n",(int)reason);

  ierr = VecView(y,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  /*
     Free work space. All PETSc objects should be destroyed when they
     are no longer needed.
  */
  ierr = MatDestroy(U);CHKERRQ(ierr);
  ierr = VecDestroy(x);CHKERRQ(ierr);
  ierr = VecDestroy(y);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}
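If it helps, a quick check I could add right after the VecCreateMPI() calls
(just a sketch, untested) would print the global and local sizes of x on
each rank, since the error message complains about "global dim 5 10":

  PetscInt N,nlocal;
  ierr = VecGetSize(x,&N);CHKERRQ(ierr);           /* global length */
  ierr = VecGetLocalSize(x,&nlocal);CHKERRQ(ierr); /* this rank's share */
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,"x: global %d, local %d\n",N,nlocal);CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD);CHKERRQ(ierr);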
On Mon, Jan 17, 2011 at 11:56 PM, Gaurish Telang <gaurish108 at gmail.com> wrote:
> Hi,
>
> I am trying to solve a linear system Ay=x, where A is a 5x5 matrix
> stored in a binary file called 'square' and x=[1;1;1;1;1].
>
> I am trying to display the matrix A and the vectors x (rhs) and y (soln),
> in that order, to standard output.
>
> When running my code on a single processor the answer returned is accurate,
> but when using 2 processors I get weird error messages, PART of which says:
>
> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [1]PETSC ERROR: Nonconforming object sizes!
> [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10!
>
> Also, somehow the vector x gets displayed TWICE when run on two processes.
> A, however, gets displayed ONCE (as it should!).
>
>
> I am attaching the output I get when I run on 1 process and when I run
> the same code on 2 processes.
> Please let me know where I could be going wrong.
>
> (1)
>
> This is what I get when running on ONE process (here the system is solved
> successfully):
>
> gaurish108 at gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec
> -n 1 ./ex4 -f square
>
> 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 7.5774013057833345e-01 7.0604608801960878e-01
> 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 7.4313246812491618e-01 3.1832846377420676e-02
> 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 3.9222701953416816e-01 2.7692298496088996e-01
> 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 6.5547789017755664e-01 4.6171390631153941e-02
> 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 1.7118668781156177e-01 9.7131781235847536e-02
> Process [0]
> 1
> 1
> 1
> 1
> 1
> KSPGetIterationNumber 5
> KSPGetResidualNorm 0.000000
> Process [0]
> -0.810214
> 2.33178
> -1.31131
> 1.09323
> 1.17322
> gaurish108 at gaurish108-laptop:~/Desktop$
>
> %--------------------------------------------------------------------
>
> (2)
>
> This is what I get when running on TWO processes:
>
> gaurish108 at gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec
> -n 2 ./ex4 -f square
> 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 7.5774013057833345e-01 7.0604608801960878e-01
> 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 7.4313246812491618e-01 3.1832846377420676e-02
> 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 3.9222701953416816e-01 2.7692298496088996e-01
> 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 6.5547789017755664e-01 4.6171390631153941e-02
> 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 1.7118668781156177e-01 9.7131781235847536e-02
> Process [0]
> 1
> 1
> 1
> 1
> 1
> Process [1]
> 1
> 1
> 1
> 1
> 1
> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [1]PETSC ERROR: Nonconforming object sizes!
> [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10!
> [1]PETSC ERROR:
> ------------------------------------------------------------------------
> [1]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27 11:51:54
> CDT 2010
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [1]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Nonconforming object sizes!
> [0]PETSC ERROR: Mat mat,Vec x: global dim 5 10!
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [1]PETSC ERROR:
> ------------------------------------------------------------------------
> [1]PETSC ERROR: ./ex4 on a linux-gnu named gaurish108-laptop by gaurish108
> Mon Jan 17 23:49:18 2011
> [1]PETSC ERROR: Libraries linked from
> /home/gaurish108/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/lib
> [1]PETSC ERROR: Configure run at Sat Nov 13 20:34:38 2010
> [1]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran
> --download-f-blas-lapack=1 --download-mpich=1 --download-superlu_dist=1
> --download-parmetis=1 --with-superlu_dist=1 --with-parmetis=1
> [1]PETSC ERROR:
> ------------------------------------------------------------------------
> [1]PETSC ERROR: MatMultTranspose() line 1947 in src/mat/interface/matrix.c
> [1]PETSC ERROR: KSPSolve_CGNE() line 103 in
> src/ksp/ksp/impls/cg/cgne/cgne.c
> [1]PETSC ERROR: KSPSolve() line 396 in src/ksp/ksp/interface/itfunc.c
> [1]PETSC ERROR: main() line 78 in src/mat/examples/tutorials/ex4.c
> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1[cli_1]:
> aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1
> [0]0:Return code = 0, signaled with Interrupt
> [0]1:Return code = 60
> gaurish108 at gaurish108-laptop:~/Desktop$