[petsc-users] VecSetSizes hangs in MPI

Dave May dave.mayhem23 at gmail.com
Wed Jan 4 16:29:46 CST 2017


You need to swap the order of your function calls.
Call VecSetSizes() before VecSetType()
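
For reference, a minimal sketch of the corrected sequence, using the variable names xp, nbdp, and ierr from your quoted code below (this assumes the rest of your setup is unchanged):

```fortran
      ! Sizes must be set before the type is set
      call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr)
      call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr)
      call VecSetType(xp,VECMPI,ierr); CHKERRQ(ierr)
```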

Thanks,
  Dave


On Wed, 4 Jan 2017 at 23:21, Manuel Valera <mvalera at mail.sdsu.edu> wrote:

Hello all, happy new year,

I'm working on parallelizing my code. It worked and produced some results
when I simply ran it on more than one processor, but it created artifacts
because I didn't need a copy of the whole program on each processor; the
copies conflicted with each other.

Since the pressure solver is the main part I need in parallel, I'm using
MPI to run everything on the root processor until it's time to solve for
pressure. At that point I'm trying to create a distributed vector using
either

     call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr)
or

     call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr)

     call VecSetType(xp,VECMPI,ierr)

     call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr)



In both cases the program hangs at this point, something that never
happened with the naive approach I described before. I've made sure the
global size, nbdp, is the same on every processor. What could be wrong?


Thanks for your kind help,


Manuel.
