[petsc-users] using OpenMP in PETSc
Xiangdong Liang
xdliang at gmail.com
Wed Mar 14 15:57:35 CDT 2012
On Wed, Mar 14, 2012 at 4:17 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Wed, Mar 14, 2012 at 1:33 PM, Xiangdong Liang <xdliang at gmail.com> wrote:
>>
>> On Tue, Mar 13, 2012 at 5:57 PM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>> > On Tue, Mar 13, 2012 at 4:39 PM, Xiangdong Liang <xdliang at gmail.com>
>> > wrote:
>> >>
>> >> Hello everyone,
>> >>
>> >> Can someone provide me advice on using OpenMP in PETSc? I am solving a
>> >> problem like this:
>> >>
>> >> int main()
>> >> {
>> >>   Vec vsum;
>> >>   int i;
>> >>
>> >>   for (i = 0; i < N; i++)
>> >>   {
>> >>     Vec vi;
>> >>     fcomputev(i, vi);
>> >>     VecAXPY(vsum, 1.0, vi); /* vsum += vi */
>> >>   }
>> >> }
>> >>
>> >> Can I use OpenMP "omp parallel for" to do the loop in parallel? For
>> >> example, suppose I have 8 processes. It would be nice if each PETSc
>> >> subroutine fcomputev uses 2 processes while 4 different i's are
>> >> computed in parallel (since different i's are independent).
>> >
>> >
>> > PETSc is not thread-safe. This is trivial to do in MPI, where the comm
>> > for each Vec is a group of 2 procs.
>>
>> Do you mean the use of MPI_Reduce? If I use MPI_Reduce, should I
>> convert the Vec objects into regular arrays with VecGetArray first,
>> then apply the MPI_Op (MPI_SUM) on these arrays? Is there any way to
>> circumvent this Vec-to-array conversion by defining an MPI_Op and
>> MPI_Datatype on Vec?
>
>
> I think you miss my point. This is the difference between writing SPMD
> programs
> and threaded programs. PETSc is SPMD, and I don't ever expect that to
> change.
I am still not clear about this. Is MPI_Reduce, or converting the Vec
to an array, unnecessary? Can you explain more about that? Thanks.
Xiangdong
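
[For readers of the archive: Matt's suggestion, splitting MPI_COMM_WORLD
into groups of 2 ranks and creating each Vec on one of those
subcommunicators, might look roughly like the sketch below. It is not a
complete program: it assumes PETSc and MPI, a run with 8 ranks, an
illustrative N, and a hypothetical fcomputev signature that takes the
communicator on which vi should live. Combining the four per-group
partial sums afterwards still requires a separate reduction across
groups.]

```c
/* Sketch only: requires PETSc and MPI, run with 8 ranks.
 * fcomputev is the poster's routine; the signature taking an
 * MPI_Comm is an assumption for illustration. */
#include <petscvec.h>

extern PetscErrorCode fcomputev(MPI_Comm comm, PetscInt i, Vec *vi);

#define N 16 /* number of independent i's, for illustration */

int main(int argc, char **argv)
{
  MPI_Comm    subcomm;
  PetscMPIInt rank;
  PetscInt    color, i;
  Vec         vi, vsum = NULL;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* with 8 ranks, color = rank/2 gives 4 subcommunicators of 2 ranks */
  color = rank / 2;
  MPI_Comm_split(PETSC_COMM_WORLD, color, rank, &subcomm);

  /* group `color` handles i = color, color+4, color+8, ... so the
     four groups work on different i's concurrently */
  for (i = color; i < N; i += 4) {
    fcomputev(subcomm, i, &vi);          /* vi lives on subcomm */
    if (!vsum) {                         /* lazily create the accumulator */
      VecDuplicate(vi, &vsum);
      VecSet(vsum, 0.0);
    }
    VecAXPY(vsum, 1.0, vi);              /* partial sum within this group */
    VecDestroy(&vi);
  }

  /* vsum now holds this group's partial sum; summing the four group
     results is a separate cross-group reduction, e.g. MPI_Allreduce
     on the raw arrays obtained with VecGetArray. */

  VecDestroy(&vsum);
  MPI_Comm_free(&subcomm);
  PetscFinalize();
  return 0;
}
```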
> CUDA, for instance, is also SPMD.
>
>>
>> Another quick question: where can I find the implementations of the
>> Vec ops? For example, in petscvec.h, VecAXPY is implemented as
>> (*y->ops->axpy)(y,alpha,x). Can you point me to the implementations
>> of the ops->axpy methods?
>
>
> src/vec/vec/impls
>
> Matt
>
>>
>> Thanks.
>> Xiangdong
>>
>>
>>
>> >
>> > Matt
>> >
>> >>
>> >> Any helps or hints on this would be appreciated.
>> >>
>> >> Best,
>> >> Xiangdong
>> >
>> >
>> >
>> >
>> > --
>> > What most experimenters take for granted before they begin their
>> > experiments
>> > is infinitely more interesting than any results to which their
>> > experiments
>> > lead.
>> > -- Norbert Wiener
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments
> lead.
> -- Norbert Wiener