[petsc-users] VecSetSizes hangs in MPI
Manuel Valera
mvalera at mail.sdsu.edu
Wed Jan 4 17:30:51 CST 2017
Thanks, I had no idea how to debug and read those logs; that solved this
issue at least (I was sending a message from root to everyone else, but
trying to receive it on every rank, including root).
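
For the record, a minimal sketch of the fix (buf and n are hypothetical
placeholders standing in for the real message buffer and count in my code):
root only sends, and only the non-root ranks post the matching receive:

    ! sketch only: buf/n are placeholders, not the actual variables
    if (rank == 0) then
       do i = 1, nprocs-1
          call MPI_Send(buf, n, MPI_INTEGER, i, 0, PETSC_COMM_WORLD, ierr)
       end do
    else
       call MPI_Recv(buf, n, MPI_INTEGER, 0, 0, PETSC_COMM_WORLD, &
                     MPI_STATUS_IGNORE, ierr)
    end if

(A single MPI_Bcast called on every rank would do the same job more simply.)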
Until next time, many thanks,
Manuel
On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera <mvalera at mail.sdsu.edu>
> wrote:
>
>> I put a PetscBarrier just before calling the VecCreate routine, and I'm
>> pretty sure I'm calling it from every processor; the code looks like this:
>>
>
> From the gdb trace.
>
> Proc 0: Is in some MPI routine you call yourself, line 113
>
> Proc 1: Is in VecCreate(), line 130
>
> You need to fix your communication code.
>
> Matt
>
>
>> call PetscBarrier(PETSC_NULL_OBJECT,ierr)
>>
>>
>> print*,'entering POInit from',rank
>>
>> !call exit()
>>
>>
>> call PetscObjsInit()
>>
>>
>>
>> And output gives:
>>
>>
>> entering POInit from 0
>>
>> entering POInit from 1
>>
>> entering POInit from 2
>>
>> entering POInit from 3
>>
>>
>> Still hangs in the same way,
>>
>> Thanks,
>>
>> Manuel
>>
>>
>>
>> On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera <mvalera at mail.sdsu.edu>
>> wrote:
>>
>>> Thanks for the answers!
>>>
>>> Here's the screenshot of what I got from bt in gdb (great hint on how to
>>> debug in PETSc, I didn't know that).
>>>
>>> I don't really know what to look at here,
>>>
>>> Thanks,
>>>
>>> Manuel
>>>
>>> On Wed, Jan 4, 2017 at 2:39 PM, Dave May <dave.mayhem23 at gmail.com>
>>> wrote:
>>>
>>>> Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s)?
>>>> These functions cannot be inside if statements like:
>>>> if (rank == 0){
>>>> VecCreateMPI(...)
>>>> }
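>>>>
>>>> The collective call has to be made unconditionally on every rank, e.g.
>>>> (a sketch using the names from your snippet):
>>>>
>>>> ! executed by every rank in PETSC_COMM_WORLD, not just rank 0
>>>> call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr)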
>>>>
>>>>
>>>> On Wed, 4 Jan 2017 at 23:34, Manuel Valera <mvalera at mail.sdsu.edu>
>>>> wrote:
>>>>
>>>>> Thanks, Dave, for the quick answer, I appreciate it.
>>>>>
>>>>> I just tried that and it didn't make a difference. Any other
>>>>> suggestions?
>>>>>
>>>>> Thanks,
>>>>> Manuel
>>>>>
>>>>> On Wed, Jan 4, 2017 at 2:29 PM, Dave May <dave.mayhem23 at gmail.com>
>>>>> wrote:
>>>>>
>>>>> You need to swap the order of your function calls.
>>>>> Call VecSetSizes() before VecSetType()
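>>>>>
>>>>> i.e., the corrected sequence, using the names from your snippet:
>>>>>
>>>>> call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr)
>>>>> call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr)
>>>>> call VecSetType(xp,VECMPI,ierr)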
>>>>>
>>>>> Thanks,
>>>>> Dave
>>>>>
>>>>>
>>>>> On Wed, 4 Jan 2017 at 23:21, Manuel Valera <mvalera at mail.sdsu.edu>
>>>>> wrote:
>>>>>
>>>>> Hello all, happy new year,
>>>>>
>>>>> I'm working on parallelizing my code. It ran and produced some results
>>>>> when I simply launched it on more than one processor, but it created
>>>>> artifacts because I didn't need one image of the whole program on each
>>>>> processor, and the copies conflicted with each other.
>>>>>
>>>>> Since the pressure solver is the main part I need in parallel, I'm
>>>>> using MPI to run everything on the root processor until it's time to
>>>>> solve for pressure. At that point I'm trying to create a distributed
>>>>> vector using either
>>>>>
>>>>> call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr)
>>>>> or
>>>>>
>>>>> call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr)
>>>>>
>>>>> call VecSetType(xp,VECMPI,ierr)
>>>>>
>>>>> call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr)
>>>>>
>>>>>
>>>>>
>>>>> In both cases the program hangs at this point, something that never
>>>>> happened with the naive approach I described before. I've made sure the
>>>>> global size, nbdp, is the same on every processor. What can be wrong?
>>>>>
>>>>>
>>>>> Thanks for your kind help,
>>>>>
>>>>>
>>>>> Manuel.
>>>>>
>>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>