[petsc-dev] configuring hypre on batch system

Mark Adams mfadams at lbl.gov
Fri Jan 9 12:26:08 CST 2015


Thanks Pierre,

This page should be linked from the page you get when you google 'petsc threads'.

On Fri, Jan 9, 2015 at 8:15 AM, Pierre Jolivet <
pierre.jolivet at ljll.math.upmc.fr> wrote:

> Maybe you should try to re-configure --with-threadcomm
> --with-pthreadclasses and/or --with-openmp, cf.
> http://www.mcs.anl.gov/petsc/documentation/installation.html#threads.
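>
> With the options from the log below, a full reconfigure would look
> something like this (an untested sketch; check the page above for the
> exact flag spellings):
>
>   ./configure --with-threadcomm --with-pthreadclasses --with-openmp \
>     --with-cc=cc --with-cxx=CC --with-fc=ftn \
>     --download-hypre --download-metis --download-parmetis \
>     --with-debugging=0 PETSC_ARCH=arch-titan-opt-pgi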
>
> Pierre
>
> On 2015-01-09 14:05, Mark Adams wrote:
>
>> Humm, pthread does not work either:
>>
>> [0]PETSC ERROR: --------------------- Error Message
>> --------------------------------------------------------------
>> [0]PETSC ERROR: Unknown type. Check for miss-spelling or missing package:
>> http://www.mcs.anl.gov/petsc/documentation/installation.html#external [1]
>> [0]PETSC ERROR: Unable to find requested PetscThreadComm type pthread
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html [2]
>> for trouble shooting.
>> [0]PETSC ERROR: Petsc Development GIT revision: v3.5.2-1345-g927ffcc
>> GIT Date: 2015-01-08 16:04:39 -0700
>> [0]PETSC ERROR: ./xgc2 on a arch-titan-opt-pgi named nid07141 by adams
>> Fri Jan  9 08:04:32 2015
>> [0]PETSC ERROR: Configure options --COPTFLAGS="-mp -fast"
>> --CXXOPTFLAGS="-mp -fast" --FOPTFLAGS="-mp -fast" --download-hypre
>> --download-metis --download-parmetis --with-cc=cc
>> --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0
>> --with-fc=ftn --with-fortranlib-autodetect=0 --with-shared-libraries=0
>> --known-mpi-shared-libraries=1 --with-x=0 --with-debugging=0
>> PETSC_ARCH=arch-titan-opt-pgi
>> PETSC_DIR=/lustre/atlas2/env003/scratch/adams/petsc2
>> [0]PETSC ERROR: #1 PetscThreadCommSetType() line 512 in
>> /lustre/atlas2/env003/scratch/adams/petsc2/src/sys/threadcomm/interface/threadcomm.c
>> [0]PETSC ERROR: #2 PetscThreadCommWorldInitialize() line 1250 in
>> /lustre/atlas2/env003/scratch/adams/petsc2/src/sys/threadcomm/interface/threadcomm.c
>> [0]PETSC ERROR: #3 PetscGetThreadCommWorld() line 82 in
>> /lustre/atlas2/env003/scratch/adams/petsc2/src/sys/threadcomm/interface/threadcomm.c
>> [0]PETSC ERROR: #4 PetscCommGetThreadComm() line 117 in
>> /lustre/atlas2/env003/scratch/adams/petsc2/src/sys/threadcomm/interface/threadcomm.c
>> [0]PETSC ERROR: #5 PetscCommDuplicate() line 195 in
>> /lustre/atlas2/env003/scratch/adams/petsc2/src/sys/objects/tagm.c
>>
>> On Fri, Jan 9, 2015 at 7:42 AM, Mark Adams <mfadams at lbl.gov> wrote:
>>
>>> And is openmp not implemented? I get this:
>>>
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: Unknown type. Check for miss-spelling or missing package:
>>> http://www.mcs.anl.gov/petsc/documentation/installation.html#external [1]
>>> [0]PETSC ERROR: Unable to find requested PetscThreadComm type openmp
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html [2]
>>> for trouble shooting.
>>> [0]PETSC ERROR: Petsc Development GIT revision: v3.5.2-1345-g927ffcc
>>> GIT Date: 2015-01-08 16:04:39 -0700
>>> [0]PETSC ERROR: ./xgc2 on a arch-titan-opt-pgi named nid09736 by
>>> adams Fri Jan 9 07:39:56 2015
>>> [0]PETSC ERROR: Configure options --COPTFLAGS="-mp -fast"
>>> --CXXOPTFLAGS="-mp -fast" --FOPTFLAGS="-mp -fast" --download-hypre
>>> --download-metis --download-parmetis --with-cc=cc
>>> --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0
>>> --with-fc=ftn --with-fortranlib-autodetect=0
>>> --with-shared-libraries=0 --known-mpi-shared-libraries=1 --with-x=0
>>> --with-debugging=0 PETSC_ARCH=arch-titan-opt-pgi
>>> PETSC_DIR=/lustre/atlas2/env003/scratch/adams/petsc2
>>> [0]PETSC ERROR: #1 PetscThreadCommSetType() line 512 in
>>> /lustre/atlas2/env003/scratch/adams/petsc2/src/sys/threadcomm/interface/threadcomm.c
>>> [0]PETSC ERROR: #2 PetscThreadCommWorldInitialize() line 1250 in
>>> /lustre/atlas2/env003/scratch/adams/petsc2/src/sys/threadcomm/interface/threadcomm.c
>>>
>>> On Fri, Jan 9, 2015 at 12:25 AM, Abhyankar, Shrirang G.
>>> <abhyshr at mcs.anl.gov> wrote:
>>>
>>> Mark,
>>> The inputs to -threadcomm_affinities are processor numbers.
>>>
>>> So -threadcomm_nthreads 4
>>> -threadcomm_affinities 0 1 2 3
>>>
>>> will pin the 4 threads to processors 0, 1, 2, and 3. Unfortunately,
>>> there is no standard mapping of processor numbers onto physical
>>> and/or logical cores (I think it is decided by the OS). For example,
>>> on one node with two quad-core CPUs (8 processors total, no
>>> hyperthreading), the first CPU may have processor numbers 0, 2, 4, 6
>>> while the second has 1, 3, 5, 7. On another node with similar
>>> hardware, the numbering may be 0, 1, 2, 3 on the first CPU and
>>> 4, 5, 6, 7 on the second. Hence, tools like likwid or hwloc are very
>>> helpful for getting the hardware layout. You can also obtain this
>>> info by looking at /proc/cpuinfo on Linux.
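>>>
>>> For example, any of these should show the layout (a sketch; assumes
>>> hwloc/likwid are installed):
>>>
>>>   lstopo --no-io                  # hwloc: draw the machine topology
>>>   likwid-topology                 # likwid: core/thread numbering
>>>   grep ^processor /proc/cpuinfo   # list logical processor IDs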
>>>
>>> Shri
>>> From: Mark Adams <mfadams at lbl.gov>
>>> Date: Thu, 8 Jan 2015 21:43:30 -0500
>>> To: barry smith <bsmith at mcs.anl.gov>
>>> Cc: petsc-dev mailing list <petsc-dev at mcs.anl.gov>
>>> Subject: Re: [petsc-dev] configuring hypre on batch system
>>>
>>>> -threadcomm_affinities 0 1 2 3 4 5 6 7 ?????
>>>>
>>>> I don't know what the flag is here
>>>
>>> Neither do I. The web page
>>> http://www.mcs.anl.gov/petsc/features/threads.html [3] says:
>>>
>>> * -threadcomm_affinities <list_of_affinities>: Sets the core
>>>   affinities of threads
>>>
>>> I'm not sure what to put here ...
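>>>
>>> (Presumably the full run line would then be something like this,
>>> pinning 8 threads to processors 0-7; untested:)
>>>
>>>   ./xgc2 -threadcomm_type openmp -threadcomm_nthreads 8 \
>>>     -threadcomm_affinities 0 1 2 3 4 5 6 7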
>>>
>>>> -threadcomm_type openmp
>>>>
>>>> Then would I get threaded MatVec and other CG + MG stuff? I know
>>>> this will not be faster, but I just need data to corroborate what we
>>>> all know. And I don't care about setup.
>>>
>>> Depends on the smoother: we don't have any threaded SOR, but if you
>>> are using Jacobi + Chebyshev it will be threaded.
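>>>
>>> (So an option set along these lines should exercise the threaded
>>> path; a sketch, assuming GAMG as the MG preconditioner:)
>>>
>>>   -ksp_type cg -pc_type gamg \
>>>     -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi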
>>>
>>> Oh right, thanks,
>>>
>>
>>
>>
>> Links:
>> ------
>> [1] http://www.mcs.anl.gov/petsc/documentation/installation.html#external
>> [2] http://www.mcs.anl.gov/petsc/documentation/faq.html
>> [3] http://www.mcs.anl.gov/petsc/features/threads.html
>>
>
>