[petsc-users] Building with MKL 10.3

Natarajan CS csnataraj at gmail.com
Tue Mar 15 10:36:23 CDT 2011


Hello,
    I am neither a regular PETSc user nor a contributor, so apologies in
advance if I am completely off base here.

I am not sure whether the original poster had hyper-threading in mind when he
asked about multi-threading. If that was the idea, I don't think using
PETSc with multi-threaded MKL on hyper-threaded cores is going to give any
benefit: MKL's kernels are not really insensitive to shared core resources,
so the extra hardware threads mostly contend for the same execution units.
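For reference, a rough sketch of the setup being discussed: building PETSc
against MKL and then keeping MKL single-threaded when running one MPI process
per core. Paths and configure option spellings vary between PETSc and MKL
releases (check ./configure --help for your version), and `my_petsc_app` is
just a placeholder name:

```shell
# Sketch: point PETSc's configure at an MKL install (option spelling
# may differ across PETSc releases -- verify with ./configure --help).
./configure --with-blas-lapack-dir=$MKLROOT

# With one MPI process per core, pin MKL to a single thread so the
# BLAS threads do not fight the MPI processes for the same cores:
export MKL_NUM_THREADS=1
mpiexec -n 8 ./my_petsc_app
```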

Also, I wonder what percentage of the code is actually BLAS/LAPACK-intensive
enough for threading to make any significant dent in wall-clock time?
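To put a rough number on that question (a back-of-the-envelope Amdahl's-law
sketch, not a measurement of any particular code): if only a fraction p of the
run time is spent in BLAS/LAPACK, even a perfect s-fold speedup of those
kernels caps the overall gain at 1/((1 - p) + p/s).

```python
# Amdahl's law: overall speedup when only a fraction p of the wall-clock
# time benefits from an s-fold speedup of the BLAS/LAPACK kernels.
def amdahl(p, s):
    """p: fraction of time in BLAS/LAPACK, s: speedup of those kernels."""
    return 1.0 / ((1.0 - p) + p / s)

# Example: 30% of time in BLAS, 4 threads with perfect kernel scaling
# still yields only about a 1.29x overall speedup.
print(amdahl(0.3, 4))
```

So unless the BLAS/LAPACK fraction is large, multi-threaded MKL moves the
needle very little.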

Of course, +1 to everything else posted above.

Cheers,

C.S.N

On Mon, Mar 14, 2011 at 8:23 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> On Mar 14, 2011, at 7:48 PM, Lisandro Dalcin wrote:
>
> > On 14 March 2011 20:48, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >>
> >>  Eric,
> >>
> >>   With the current PETSc design that uses one MPI process per core I am
> not sure that it makes sense to use multi-threaded MKL since the processes
> are already using all the cores. I could be wrong.
> >>
> >>   Barry
> >>
> >
> > Well, I think that PETSc should support sequential codes using
> > multi-threaded blas-lapack
>
>    That is supported, and someone can do it just by using a parallel BLAS.
>
> > and perhaps some OpenMP.
>
>   Again users can use openmp in their sequential PETSc code, we don't stop
> them.
>
>
>  Barry
>
>
> > Moreover, perhaps
> > we could get some little speedup in MatMult for these kinds of
> > users/applications.
>
>    If you aren't willing to pay the entry fee to parallel computing with
> MPI then you shouldn't be allowed any benefits of parallel PETSc :-).
> Actually we will eventually support the "hybrid" model.
>
> >
> > --
> > Lisandro Dalcin
> > ---------------
> > CIMEC (INTEC/CONICET-UNL)
> > Predio CONICET-Santa Fe
> > Colectora RN 168 Km 472, Paraje El Pozo
> > 3000 Santa Fe, Argentina
> > Tel: +54-342-4511594 (ext 1011)
> > Tel/Fax: +54-342-4511169
>
>