A simple question.
Hong Zhang
hzhang at mcs.anl.gov
Mon Apr 2 20:48:41 CDT 2007
The problem is in your call to VecSetValues().
> Vec x;
> PetscInt n =10;
> PetscErrorCode ierr;
> PetscMPIInt size;
> PetscScalar one=1.0;
> PetscScalar vd[3]={2.,2.,2.};
> PetscInt indices[3]={0,5,7};
> PetscScalar* v=&vd[0];
>
> ierr = PetscInitialize(&argc,&args,(char *)0,help); CHKERRQ(ierr);
> ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size); CHKERRQ(ierr);
> printf("Number of processors : %d\n",size);
> ierr = VecCreate(PETSC_COMM_WORLD,&x); CHKERRQ(ierr);
> ierr = VecSetSizes(x,PETSC_DECIDE,n); CHKERRQ(ierr);
> ierr = VecSetType(x,VECMPI); CHKERRQ(ierr);
> ierr = VecSetFromOptions(x); CHKERRQ(ierr);
> ierr = VecSet(x,one); CHKERRQ(ierr);
> VecSetValues(x,3,indices,v,ADD_VALUES);
^^^^^^^^^^
Here, when np = 2, both processes add v=2.0 into the vector at components
0, 5, and 7, resulting in different output than with np = 1.
You should call VecGetOwnershipRange(Vec x,PetscInt *low,PetscInt *high),
then have each process set values only into the part it owns.
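For example, something like the following (an untested sketch, reusing the
x, indices, and v from your code) adds each value exactly once, from the
process that owns that row:

  PetscInt low,high,i;

  ierr = VecGetOwnershipRange(x,&low,&high); CHKERRQ(ierr);
  for (i=0; i<3; i++) {
    if (indices[i] >= low && indices[i] < high) { /* this process owns entry indices[i] */
      ierr = VecSetValues(x,1,&indices[i],&v[i],ADD_VALUES); CHKERRQ(ierr);
    }
  }
  ierr = VecAssemblyBegin(x); CHKERRQ(ierr);
  ierr = VecAssemblyEnd(x); CHKERRQ(ierr);

(Alternatively, have only one process call VecSetValues() for all three
entries; the assembly routines communicate off-process values to their
owners.) The duplicated "Number of processors : 2" line has the same cause:
printf() executes on every process. Use PetscPrintf(PETSC_COMM_WORLD,...)
instead if you want it printed only once.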
Hong
> ierr = VecAssemblyBegin(x); CHKERRQ(ierr);
> ierr = VecAssemblyEnd(x); CHKERRQ(ierr);
>
> ierr = VecView(x,PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);
> ierr = VecDestroy(x); CHKERRQ(ierr);
> ierr = PetscFinalize(); CHKERRQ(ierr);
> return 0;
> }
> ***************************************************************************
>
> 3. And now I'm running it on a dual-core processor with -n 1 and -n 2. These
> are the results.
> ***************************************************************************
> Number of processors : 1
> Process [0]
> 3
> 1
> 1
> 1
> 1
> 3
> 1
> 3
> 1
> 1
> ***************************************************************************
> Number of processors : 2
> Process [0]
> 5
> 1
> 1
> 1
> 1
> Process [1]
> 5
> 1
> 5
> 1
> 1
> Number of processors : 2
> ***************************************************************************
>
> This is not what I was really looking for. I intended to get the same output
> (concerning the vector entries) and no duplicate printing (like "Number of
> processors : 2").
>
> Do you think this is an MPI setup problem, or have I not really understood
> what the above code does?
>
> Kind regards,
>
> Fotios Karaoulanis.
>
> ps. Congratulations on your excellent work!
>
>
>
> --
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> Fotios E. Karaoulanis
> Dipl. Civil Engineer, MSc TUM
> tel +30 2310 458913
> fax +30 2310 458913
> mob +30 6948 179452
> e-mail fkar at nemesis-project.org
> --------------------------------------------
> Consider visiting www.nemesis-project.org.
> Home of an experimental finite element code.
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
>
>