[petsc-users] VecScatter() causing processes hanging

Qiyue Lu qiyuelu1 at gmail.com
Sun Dec 8 16:57:05 CST 2024


Yes, I got the correct local vector size using the code below, and I am also
sure all processes entered that function, since a printed message appears once
per process. But the hanging problem is still there. I need to spend more time
and do more tests.
PetscInt vec_local_size;                 // note: VecGetLocalSize() returns a PetscInt
// vec_local_size = new int[world_size];   // old (incorrect) array version
VecGetLocalSize(sol_vector_rp1, &vec_local_size);
std::cout << "process " << world_rank << " has " << vec_local_size
          << " entries." << std::endl;
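
For reference, here is a minimal sketch of the pattern I am trying to get
working (nloc and idx_needed are placeholders for the number of entries and
the global indices this rank wants; nloc may be 0 on some ranks, but every
rank in the vector's communicator still has to make the VecScatterCreate()
call, since it is collective):

// nloc: entries this rank needs (may be 0); idx_needed: their global indices (PetscInt[])
IS is_src, is_dst;
ISCreateGeneral(PETSC_COMM_SELF, nloc, idx_needed, PETSC_COPY_VALUES, &is_src);
ISCreateStride(PETSC_COMM_SELF, nloc, 0, 1, &is_dst);   // destination indices 0..nloc-1
Vec local_vec;
VecCreateSeq(PETSC_COMM_SELF, nloc, &local_vec);
VecScatter scat;
VecScatterCreate(sol_vector_rp1, is_src, local_vec, is_dst, &scat); // every rank calls this
VecScatterBegin(scat, sol_vector_rp1, local_vec, INSERT_VALUES, SCATTER_FORWARD);
VecScatterEnd(scat, sol_vector_rp1, local_vec, INSERT_VALUES, SCATTER_FORWARD);
// ... use local_vec ...
VecScatterDestroy(&scat);
ISDestroy(&is_src);
ISDestroy(&is_dst);
VecDestroy(&local_vec);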

On Sun, Dec 8, 2024 at 4:40 PM Qiyue Lu <qiyuelu1 at gmail.com> wrote:

> Yes, it seems global_vec has some problems. It is passed in as a function
> parameter, and the original vector is declared as shown in the code below.
> However, I get the print-out below, while I was expecting evenly
> distributed values.
> process 0 has 5422 entries.
> process 1 has 0 entries.
> process 2 has 0 entries.
> process 3 has 0 entries.
> process 4 has 0 entries.
> process 5 has 0 entries.
> process 6 has 0 entries.
> process 7 has 0 entries.
> process 8 has 0 entries.
> process 9 has 0 entries.
> process 10 has 0 entries.
> process 11 has 0 entries.
> process 12 has 0 entries.
> Vec sol_vector_rp1;
> VecCreate(MPI_COMM_WORLD, &sol_vector_rp1);
> VecSetType(sol_vector_rp1, VECMPI);
> VecSetSizes(sol_vector_rp1, PETSC_DECIDE, numNodesTotal*numDOFs);
> VecSet(sol_vector_rp1, 0.0);
> VecAssemblyBegin(sol_vector_rp1);
> VecAssemblyEnd(sol_vector_rp1);
> int *vec_local_size = nullptr;
> vec_local_size = new int[world_size];
> VecGetLocalSize(sol_vector_rp1, vec_local_size);
> for (int i = 0; i < world_size; i++){
>     std::cout << "process " << i << " has " << vec_local_size[i]
>               << " entries." << std::endl;
> }
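>
> A minimal sketch (not the code above) of how the layout could be checked:
> VecGetLocalSize() only fills in the local size of the calling process, so
> each rank should query its own value, or the full layout can be read from
> the ownership ranges; world_rank/world_size are assumed to come from
> MPI_Comm_rank()/MPI_Comm_size() on MPI_COMM_WORLD.
>
> PetscInt my_size;
> VecGetLocalSize(sol_vector_rp1, &my_size);        // this rank's share only
> const PetscInt *ranges;                           // array of length world_size + 1
> VecGetOwnershipRanges(sol_vector_rp1, &ranges);   // same array on every rank
> if (world_rank == 0){
>     for (int i = 0; i < world_size; i++){
>         std::cout << "process " << i << " has " << ranges[i+1] - ranges[i]
>                   << " entries." << std::endl;
>     }
> }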
>
>
> On Sun, Dec 8, 2024 at 4:00 PM Barry Smith <bsmith at petsc.dev> wrote:
>
>>
>>    Where does global_vec come from, and are you sure that all MPI
>> processes that share global_vec enter this code region?
>>
>>
>>
>> On Dec 8, 2024, at 4:52 PM, Qiyue Lu <qiyuelu1 at gmail.com> wrote:
>>
>> Thank you all for the reply. Here is the code. Since I need to fetch data
>> from local_vec with VecGetValues(), I set its type to VECSEQ.
>> std::vector<std::vector<double>> vc(numNodesSurface, std::vector<double>(numDimsProb)); // 2-D matrix storing velocity u, v
>> PetscScalar *ptr_vc = NULL;                            // memory for the fetched values
>> ptr_vc = new PetscScalar[numNodesSurface*numDimsProb]; // numNodesSurface = 6, numDimsProb = 2 (nodes per element, problem dimension)
>> PetscInt idx_global[numNodesSurface*numDimsProb];      // for creating the global index set
>> PetscInt idx_local[numNodesSurface*numDimsProb];       // for creating the local index set
>> for (int i = 0; i < numNodesSurface; i++){
>>     for (int j = 0; j < numDimsProb; j++){
>>         idx_local[i*numDimsProb + j]  = i*numDimsProb + j;       // local index
>>         idx_global[i*numDimsProb + j] = node_el[i]*numDOFs + j;  // global index
>>     }
>> }
>> // global_vec is distributed and uses MPI_COMM_WORLD, which includes all
>> // processes, as its communicator.
>> // I want to fetch 12 (numNodesSurface*numDimsProb) values from global_vec
>> // and put them in a local vector, so I set up local_vec as VECSEQ on
>> // PETSC_COMM_SELF.
>> IS is_source, is_dest;
>> ISCreateGeneral(PETSC_COMM_SELF, numNodesSurface*numDimsProb, idx_global, PETSC_COPY_VALUES, &is_source);
>> ISCreateGeneral(PETSC_COMM_SELF, numNodesSurface*numDimsProb, idx_local, PETSC_COPY_VALUES, &is_dest);
>> Vec local_vec;
>> VecCreate(PETSC_COMM_SELF, &local_vec);
>> VecSetSizes(local_vec, PETSC_DECIDE, numNodesSurface*numDimsProb);
>> VecSetType(local_vec, VECSEQ);
>> VecScatter scat;
>> VecScatterCreate(global_vec, is_source, local_vec, is_dest, &scat); // Got stuck here
>> VecScatterBegin(scat, global_vec, local_vec, INSERT_VALUES, SCATTER_FORWARD);
>> VecScatterEnd(scat, global_vec, local_vec, INSERT_VALUES, SCATTER_FORWARD);
>> VecGetValues(local_vec, numNodesSurface*numDimsProb, idx_local, ptr_vc);
>> for (int i = 0; i < numNodesSurface; i++){
>>     for (int j = 0; j < numDimsProb; j++){
>>         vc[i][j] = ptr_vc[i*numDimsProb + j]; // from 1-D to 2-D
>>     }
>> }
>> ISDestroy(&is_source);
>> ISDestroy(&is_dest);
>> VecDestroy(&local_vec);
>> VecScatterDestroy(&scat);
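>>
>> As a possible alternative (not what the code above does, and only a rough
>> sketch): VecScatterCreateToAll() builds the scatter context and the
>> destination vector in one collective call, giving every rank a sequential
>> copy of the entire global_vec, at the cost of replicating it on each process.
>>
>> Vec all_vals;
>> VecScatter to_all;
>> VecScatterCreateToAll(global_vec, &to_all, &all_vals); // collective on global_vec's communicator
>> VecScatterBegin(to_all, global_vec, all_vals, INSERT_VALUES, SCATTER_FORWARD);
>> VecScatterEnd(to_all, global_vec, all_vals, INSERT_VALUES, SCATTER_FORWARD);
>> // all_vals is a VECSEQ on each rank holding every entry of global_vec;
>> // read it with VecGetValues() or VecGetArrayRead().
>> VecScatterDestroy(&to_all);
>> VecDestroy(&all_vals);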
>>
>> On Sun, Dec 8, 2024 at 11:12 AM Barry Smith <bsmith at petsc.dev> wrote:
>>
>>>
>>>    You can scatter from a global vector to a local vector; numerous
>>> PETSc examples demonstrate this. So hanging here is surprising. Please
>>> display the entire code so we can see the context of the VecScatterCreate()
>>> usage. Perhaps not all the MPI processes that are in the global_vec
>>> communicator are involved in the call to VecScatterCreate().
>>>
>>>   To determine where the hang occurs, you can run with -start_in_debugger,
>>> type c (continue) in each debugger window, and then, after a long period of
>>> hanging, hit control-C in the hanging windows and type bt to see where
>>> the code is hanging.
>>>
>>>   Barry
>>>
>>>
>>> On Dec 7, 2024, at 9:47 PM, Qiyue Lu <qiyuelu1 at gmail.com> wrote:
>>>
>>> Hello,
>>> I am trying to fetch 12 entries from a distributed vector global_vec and
>>> put them into a local vector on each process.
>>>
>>> IS is_source, is_dest;
>>> ISCreateGeneral(PETSC_COMM_SELF, 12, idx_global, PETSC_COPY_VALUES, &is_source);
>>> ISCreateGeneral(PETSC_COMM_SELF, 12, idx_local, PETSC_COPY_VALUES, &is_dest);
>>> Vec local_vec;
>>> VecCreate(PETSC_COMM_SELF, &local_vec);
>>> VecSetSizes(local_vec, PETSC_DECIDE, 12);
>>> VecSetType(local_vec, VECSEQ);
>>> VecScatter scat;
>>> VecScatterCreate(global_vec, is_source, local_vec, is_dest, &scat);
>>>
>>>
>>> I create the local vector as sequential. However, the last two lines,
>>> which create the scatter object, cause more than half of the processes to
>>> hang, with no error reported.
>>>
>>> Does the scatter have to go from VECMPI to VECMPI, or is VECMPI to VECSEQ
>>> also allowed?
>>>
>>> Thanks,
>>> Qiyue Lu
>>>
>>>
>>>
>>