[petsc-dev] configuring hypre on batch system

Mark Adams mfadams at lbl.gov
Fri Jan 9 04:51:18 CST 2015


Thanks Shri,

And a simple fix for my hypre problem was to get an interactive shell and
configure on a compute node.
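
For anyone hitting the same thing: the exact commands depend on the site's
batch system, so treat this as a rough sketch (the time limit and configure
options below are just placeholders). On a SLURM machine it is roughly

    salloc -N 1 -t 00:30:00            # grab an interactive shell on a compute node
    ./configure --download-hypre ...   # run configure there as usual

and on PBS the interactive shell would come from qsub -I instead.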

Mark

On Fri, Jan 9, 2015 at 12:25 AM, Abhyankar, Shrirang G. <abhyshr at mcs.anl.gov
> wrote:

>  Mark,
>   The inputs to -threadcomm_affinities are the processor numbers.
>
>  So -threadcomm_nthreads 4
>      -threadcomm_affinities 0 1 2 3
>
>  will pin the 4 threads to processors 0,1,2,3. Unfortunately, there is no
> standardization of how processor numbers map onto physical and/or logical
> cores (it is decided by the OS, I think). For example, on one node with two
> quad-core CPUs (8 processors total, no hyperthreading), the 1st CPU may
> have processor numbers 0,2,4,6, while the other has 1,3,5,7. On another node
> with similar hardware, the processor numbers may be 0,1,2,3 on the 1st CPU
> and 4,5,6,7 on the second. Hence, tools like likwid or hwloc are very
> helpful for getting the hardware layout. You may also obtain this info by
> looking at /proc/cpuinfo on Linux.
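>
>  For instance (just a sketch of the sort of commands I mean; output formats
> differ between tools and distributions):
>
>      hwloc-ls    # or lstopo; prints the package/core/PU topology tree
>      grep -E 'processor|physical id|core id' /proc/cpuinfo
>
>  The first shows directly which processor numbers sit on which package and
> core; the second lets you reconstruct the same mapping by hand.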
>
>  Shri
>  From: Mark Adams <mfadams at lbl.gov>
> Date: Thu, 8 Jan 2015 21:43:30 -0500
> To: barry smith <bsmith at mcs.anl.gov>
> Cc: petsc-dev mailing list <petsc-dev at mcs.anl.gov>
> Subject: Re: [petsc-dev] configuring hypre on batch system
>
>
>> > -threadcomm_affinities 0 1 2 3 4 5 6 7  ?????
>>
>> I don't know what the flag is here
>>
>>
>  Neither do I.  The web page
> http://www.mcs.anl.gov/petsc/features/threads.html says:
>
>
>    -threadcomm_affinities <list_of_affinities>: Sets the core affinities
>    of threads
>
>  I'm not sure what to put here ...
>
>
>
>> > -threadcomm_type openmp
>> >
>> > Then would I get threaded MatVec and other CG + MG stuff?  I know this
>> will not be faster but I just need data to corroborate what we all know.
>> And I don't care about setup.
>>
>>   Depends on the smoother; we don't have any threaded SOR. If you are
>> using Jacobi + Chebyshev it will be threaded.
>>
>>
>  Oh right, thanks,
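>
>  For reference, a complete set of options for such a run might look roughly
> like the following (illustrative only: the executable name and the GAMG
> smoother options are my guesses; the threadcomm options are the ones from
> the docs above):
>
>      mpiexec -n 2 ./ex56 -pc_type gamg \
>          -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi \
>          -threadcomm_type openmp -threadcomm_nthreads 4 \
>          -threadcomm_affinities 0 1 2 3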
>
>
>>
>

