[petsc-users] scattering and communications
Marco Cisternino
marco.cisternino at polito.it
Fri Jul 8 10:48:15 CDT 2011
Thanks for the reply, Matt.
I tried what you suggested, but I saw no inconsistency in the portions
given to GP or extended_P.
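For completeness, the comparison I made is essentially the one you pointed
to, something along these lines (only a rough sketch, not my exact code,
using the structures named in the snippets quoted below and plain printf to
avoid any version-dependent printing routine):

   PetscInt    gpLow, gpHigh, extLow, extHigh;
   PetscMPIInt rank;
   MPI_Comm_rank(proc->cart_comm, &rank);
   /* locally owned index ranges of the two parallel vectors */
   VecGetOwnershipRange(vec->GP, &gpLow, &gpHigh);
   VecGetOwnershipRange(fsolv->extended_P, &extLow, &extHigh);
   printf("[%d] GP owns [%d,%d)  extended_P owns [%d,%d)\n", rank,
          (int)gpLow, (int)gpHigh, (int)extLow, (int)extHigh);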
Moreover, I tried to see what happens when no interface is given, i.e. a
simple Laplace equation.
In this case GP and extended_P have exactly the same structure and
partitioning, and nothing changes.
All the communication happens in VecScatterCreate: the actual scatter
(begin and end) takes almost no time and does no communication, but
VecScatterCreate does.
Could you explain why?
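If it helps, I can also isolate the creation in its own logging stage, so
that -log_summary reports its messages separately from the rest (just a
sketch, assuming the PetscLogStage calls of current PETSc):

   PetscLogStage stageScatterCreate;
   PetscLogStageRegister("ScatterCreate", &stageScatterCreate);
   PetscLogStagePush(stageScatterCreate);
   ierr = VecScatterCreate(fsolv->extended_P, scatter->from,
                           vec->GP, scatter->to, &scatter->scatt);
   PetscLogStagePop();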
The two ISs I use to create the scatter context are built from sequential
vectors as large as the entire computational domain. Could this be the
problem?
If I used smaller, local vectors to create the ISs and then the scatter
context, would VecScatterCreate perform better?
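To be concrete, by smaller and local vectors/ISs I mean something like the
following (only a sketch, written with the IS creation/destruction
signatures of recent PETSc releases; nLocalGrid and localGridIdx are
hypothetical and would hold the number of grid points this process owns and
their global indices in extended_P, ordered as in GP):

   PetscInt gpLow, gpHigh;
   IS       fromIS, toIS;
   /* locally owned, contiguous slice of GP; nLocalGrid == gpHigh - gpLow */
   VecGetOwnershipRange(vec->GP, &gpLow, &gpHigh);
   /* each process lists only its own entries: global indices into extended_P... */
   ISCreateGeneral(PETSC_COMM_SELF, nLocalGrid, localGridIdx,
                   PETSC_COPY_VALUES, &fromIS);
   /* ...matched one-to-one with its own block of GP */
   ISCreateStride(PETSC_COMM_SELF, nLocalGrid, gpLow, 1, &toIS);
   VecScatterCreate(fsolv->extended_P, fromIS, vec->GP, toIS, &scatter->scatt);
   ISDestroy(&fromIS);
   ISDestroy(&toIS);

If the two index sets really match the ownership like this, I would expect
the resulting scatter to be purely local.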
Thanks a lot.
Marco
> On Thu, Jul 7, 2011 at 4:55 PM, Marco Cisternino
> <marco.cisternino at polito.it> wrote:
>> Hi,
>> I would like to understand better how VecScatterBegin and VecScatterEnd
>> work.
>> I solve an interface elliptic problem on a Cartesian grid, introducing
>> extra unknowns (the intersections between the interface and the grid axes).
>> Therefore, my linear system is augmented with extra conditions on these
>> new unknowns.
>> I use a DA to manage the grid, but I have to use MatCreateMPIAIJ to build
>> the matrix because of the nature of the problem.
>> MatCreateMPIAIJ(proc->cart_comm, proc->intersections[proc->rank],
>>                 proc->intersections[proc->rank], g_rows, g_cols,
>>                 21, PETSC_NULL, 21, PETSC_NULL, &fsolv->AA);
>> VecCreateMPI(proc->cart_comm, proc->intersections[proc->rank],
>>              PETSC_DECIDE, &fsolv->extended_P);
>> VecCreateMPI(proc->cart_comm, proc->intersections[proc->rank],
>>              PETSC_DECIDE, &fsolv->extended_RHS);
>>
>> where
>> proc->intersections[proc->rank] is the total number of unknowns for each
>> processor in its sub-domain (grid points + intersections).
>> g_rows=g_cols is the total number of unknowns in the entire computational
>> domain (grid points + intersections).
>> cart_comm is a Cartesian communicator.
>>
>> The arrangement of the unknowns is such that every processor has the rows
>> of the matrix and of extended_P (the solution) relative to the actual
>> unknowns in its sub-domain.
>> I solve the system and then I call VecScatterBegin and VecScatterEnd:
>>
>> ierr = VecScatterCreate(fsolv->extended_P, scatter->from,
>>                         vec->GP, scatter->to, &scatter->scatt);
>> ierr = VecScatterBegin(scatter->scatt, fsolv->extended_P, vec->GP,
>>                        INSERT_VALUES, SCATTER_FORWARD);
>> ierr = VecScatterEnd(scatter->scatt, fsolv->extended_P, vec->GP,
>>                      INSERT_VALUES, SCATTER_FORWARD);
>>
>> in order to get into GP (created with DACreateGlobalVector(grid->da, &vec->GP);)
>> only the solution at the grid points.
>> It works, I mean I get the right solution in GP, but the scattering
>> doesn't scale at all!
>> Doing what I do, I would expect no communication during the scattering,
>> but -log_summary shows a number of MPI messages growing with the number of
>> processors.
>> Every portion of GP contains only the grid points of a processor, while
>> every portion of extended_P contains the same grid points plus the
>> intersections in the corresponding sub-domain. What is the need for
>> communication in such a scattering?
>>
> There is just a mismatch somewhere in your indices. It should be easy to
> check the locally owned indices in GP using
>
>
> http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-dev/docs/manualpages/Vec/VecGetOwnershipRange.html
>
> and compare to what you have.
>
> Matt
>
>
>> I don't know if I was clear enough. Please ask me whatever you need to
>> understand my problem.
>> Thanks a lot.
>>
>> Best regards,
>>
>> Marco
>>
>> --
>> Marco Cisternino
>> PhD Student
>> Politecnico di Torino
>> Mobile:+393281189696
>> Email: marco.cisternino at polito.it
>>
>>
>
--
Marco Cisternino
PhD Student
Politecnico di Torino
Mobile:+393281189696
Email:marco.cisternino at polito.it