[petsc-users] Threaded Petsc

Shri abhyshr at mcs.anl.gov
Tue Oct 16 11:36:55 CDT 2012


On Oct 16, 2012, at 11:30 AM, Subramanya G wrote:

> Hi
> Thanks very much for the info. Does this mean that all the internal
> vector and matrix operations are multi-threaded?

All the vector operations are threaded. Not all the matrix operations are threaded yet (only MatMult and a few others).
We will thread the remaining matrix operations once the vector operations have been tested.
> If I have a sequential code with PETSC_THREADCOMM_ACTIVE set to on, will it
> be faster than a purely sequential code, especially for the KSP routines,
> which I guess depend on AXPY-type operations?
Yes, you should expect some speedup for the vector operations and MatMult.
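
For example, something along the lines of the sketch below (an illustrative 1-D Laplacian, not one of our examples) needs no code changes: you assemble on the main thread as usual, and the VecAXPY/VecDot/VecNorm and MatMult kernels inside KSPSolve run on the thread pool once PETSC_THREADCOMM_ACTIVE is defined and a threaded communicator is selected at runtime.

/* Illustrative sketch only: sequential assembly, KSP solve. With a threaded
   petsc-dev build the vector kernels and MatMult inside KSPSolve use the
   selected thread communicator. */
#include <petscksp.h>

int main(int argc,char **argv)
{
  Mat         A;
  Vec         x,b;
  KSP         ksp;
  PetscInt    i,n = 100,col[3];
  PetscScalar v[3];

  PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL);

  /* Assemble on the main thread, exactly as in a purely sequential code */
  MatCreateSeqAIJ(PETSC_COMM_SELF,n,n,3,PETSC_NULL,&A);
  for (i=0; i<n; i++) {
    v[0] = -1.0; v[1] = 2.0; v[2] = -1.0;
    col[0] = i-1; col[1] = i; col[2] = i+1;
    if (i == 0)        MatSetValues(A,1,&i,2,col+1,v+1,INSERT_VALUES);
    else if (i == n-1) MatSetValues(A,1,&i,2,col,v,INSERT_VALUES);
    else               MatSetValues(A,1,&i,3,col,v,INSERT_VALUES);
  }
  MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);

  VecCreateSeq(PETSC_COMM_SELF,n,&b);
  VecDuplicate(b,&x);
  VecSet(b,1.0);

  /* The solve is where the threaded VecAXPY/VecDot/MatMult kernels run */
  KSPCreate(PETSC_COMM_SELF,&ksp);
  KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
  KSPSetFromOptions(ksp); /* picks up -threadcomm_type and -threadcomm_nthreads */
  KSPSolve(ksp,b,x);

  KSPDestroy(&ksp);
  VecDestroy(&x); VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

You would then run it with, e.g., -threadcomm_type pthread -threadcomm_nthreads 4 (values chosen here just for illustration).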

Shri
> Thanks
> 
> Subramanya G Sadasiva,
> 
> Graduate Research Assistant,
> Hierarchical Design and Characterization Laboratory,
> School of Mechanical Engineering,
> Purdue University.
> 
> "The art of structure is where to put the holes"
> Robert Le Ricolais, 1894-1977
> 
> 
> On Tue, Oct 16, 2012 at 9:10 AM, Shri <abhyshr at mcs.anl.gov> wrote:
>> Subramanya,
>> 
>>   Threads can be used in petsc-dev via the thread communicator object "threadcomm". PETSc currently has three thread communicators available:
>> i) nothread - This is the default and does not use any threads.
>> ii) pthread - PETSc's native lock-free thread pool implementation, built on POSIX threads.
>> iii) OpenMP - OpenMP manages the thread pool.
>> 
>> Configuration:
>> Configure with --with-pthreadclasses=1 and/or --with-openmp=1 to use pthreads or OpenMP.
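>> For example, to enable both thread communicator back ends (add whatever other configure options you normally use):
>> 
>> ./configure --with-pthreadclasses=1 --with-openmp=1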
>> 
>> **Note: If you are planning to use the OpenMP thread communicator on MacOS, then you'll additionally need to configure with --with-pthreadclasses=1. This is because OpenMP threadprivate variables are not available on MacOS, so we use pthread_getspecific()/pthread_setspecific() for thread-private variables instead (and these are only available when configured with --with-pthreadclasses).
>> 
>> Usage:
>> cd src/sys/objects/threadcomm/examples/tutorials
>> make ex1
>> ./ex1 -threadcomm_type <nothread,openmp,pthread> -threadcomm_nthreads <number of threads>
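>> For example, to run ex1 with 4 threads through the OpenMP thread communicator:
>> 
>> ./ex1 -threadcomm_type openmp -threadcomm_nthreads 4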
>> 
>> Use -h to get the list of available options with threadcomm.
>> 
>> Examples:
>> There are several examples in the tutorials directory above. Take a look at them to see how the threadcomm interface is used.
>> 
>> We currently have PETSc's vector (seq and mpi) operations threaded, but they are not enabled by default yet. These operations are guarded by the flag PETSC_THREADCOMM_ACTIVE. To use the threaded vector operations, define PETSC_THREADCOMM_ACTIVE in $PETSC_DIR/$PETSC_ARCH/include/petscconf.h.
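>> For example, adding something like the following line to $PETSC_DIR/$PETSC_ARCH/include/petscconf.h should do it (the exact placement in the file does not matter):
>> 
>> #define PETSC_THREADCOMM_ACTIVE 1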
>> 
>> I'll update the docs on the website.
>> 
>> Shri
>> 
>> On Oct 15, 2012, at 11:59 PM, Barry Smith wrote:
>>> 
>>>  Ok, thanks. Shri will update them with the latest info tomorrow. Sorry for the confusion. (they are easier :-)).
>>> 
>>>   Barry
>>> 
>>> 
>>> On Oct 15, 2012, at 11:57 PM, Subramanya G <subramanya.g at gmail.com> wrote:
>>> 
>>>> Hi Barry,
>>>> These are the instructions that I found.
>>>> Thanks.
>>>> 
>>>> http://www.mcs.anl.gov/petsc/petsc-dev/docs/installation.html#threads
>>>> 
>>>> Subramanya G Sadasiva,
>>>> 
>>>> Graduate Research Assistant,
>>>> Hierarchical Design and Characterization Laboratory,
>>>> School of Mechanical Engineering,
>>>> Purdue University.
>>>> 
>>>> "The art of structure is where to put the holes"
>>>> Robert Le Ricolais, 1894-1977
>>>> 
>>>> 
>>>> On Mon, Oct 15, 2012 at 9:49 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>> 
>>>>> On Oct 15, 2012, at 11:35 PM, Subramanya G <subramanya.g at gmail.com> wrote:
>>>>> 
>>>>>> Hi,
>>>>>> When I try to run the suggested example with -dm_vector_type pthread,
>>>>>> I get an unknown vector type error.
>>>>> 
>>>>> That documentation is out of date. Please let us know where you saw this "suggestion" so we can remove it.
>>>>> 
>>>>> Barry
>>>>> 
>>>>> 
>>>>>> 
>>>>>> Unknown vector type: pthread!
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> Here are my configure lines
>>>>>> Configure options --download-mpich --download-f-blas-lapack
>>>>>> --download-scientificpython --download-fiat --download-generator
>>>>>> --download-chaco --download-triangle --with-shared-libraries
>>>>>> --download-metis --download-parmetis --download-umfpack
>>>>>> --download-hypre --sharedLibraryFlags=-fPIC -shared
>>>>>> --with-pthreadclasses
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> I am using the latest petsc-dev version.
>>>>>> Does anybody have any idea what might be wrong? It built without any
>>>>>> issues. Do I need to do anything else when I configure?
>>>>>> 
>>>>>> Subramanya G Sadasiva,
>>>>>> 
>>>>>> Graduate Research Assistant,
>>>>>> Hierarchical Design and Characterization Laboratory,
>>>>>> School of Mechanical Engineering,
>>>>>> Purdue University.
>>>>>> 
>>>>>> "The art of structure is where to put the holes"
>>>>>> Robert Le Ricolais, 1894-1977
>>>>>> 
>>>>>> 
>>>>>> On Mon, Oct 15, 2012 at 5:44 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>>>> 
>>>>>>> On Oct 15, 2012, at 7:38 PM, Subramanya G <subramanya.g at gmail.com> wrote:
>>>>>>> 
>>>>>>>> Hi,
>>>>>>>> I need iterative solvers for some of my code and I was planning to use
>>>>>>>> petsc for them. Because some other parts of my code aren't
>>>>>>>> parallelizable, I can't use petsc in parallel. However, I see that
>>>>>>>> petsc can do threads now. I have a small question about this: does
>>>>>>>> this allow me to assemble a matrix in the main thread, and then, when
>>>>>>>> solve is called, will petsc take care of using multiple threads?
>>>>>>> 
>>>>>>> Yes
>>>>>>> 
>>>>>>>> Will this give me any speed ups?
>>>>>>> 
>>>>>>> Yes
>>>>>>> 
>>>>>>> You need to use petsc-dev for this functionality and let us know if you have difficulties.
>>>>>>> 
>>>>>>> Barry
>>>>>>> 
>>>>>>>> Thanks.
>>>>>>>> Subramanya G Sadasiva,
>>>>>>>> 
>>>>>>>> Graduate Research Assistant,
>>>>>>>> Hierarchical Design and Characterization Laboratory,
>>>>>>>> School of Mechanical Engineering,
>>>>>>>> Purdue University.
>>>>>>>> 
>>>>>>>> "The art of structure is where to put the holes"
>>>>>>>> Robert Le Ricolais, 1894-1977
>>>>>>> 
>>>>> 
>>> 
>> 


