[petsc-users] Re-configuring PETSc

Najeeb Ahmad nahmad16 at ku.edu.tr
Thu May 31 05:13:42 CDT 2018


Thank you, Barry, for your useful comments and the paper, and Hong for some
very useful information. I will get back to you if I need any further
assistance.

By the way, PETSc has one of the best user support systems I have ever
experienced for any software/library. The information is timely and
relevant. Three cheers for all the PETSc experts in this group :)

Have a great day,

On Wed, May 30, 2018 at 8:01 AM, Zhang, Hong <hongzhang at anl.gov> wrote:

>
>
> On May 29, 2018, at 3:05 AM, Najeeb Ahmad <nahmad16 at ku.edu.tr> wrote:
>
>
>
> On Mon, May 28, 2018 at 9:32 PM, Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
>
>>
>>
>> > On May 28, 2018, at 10:32 AM, Najeeb Ahmad <nahmad16 at ku.edu.tr> wrote:
>> >
>> > Thanks a lot Satish for your prompt reply.
>> >
>> > I just checked that the SuperLU_dist package works only with matrices of
>> type aij and the lu preconditioner. I am currently working with a baij
>> matrix. What is the best preconditioner choice for baij matrices on
>> parallel machines?
>>
>>     The best preconditioner is always problem-specific. Where does your
>> problem come from? CFD? Structural mechanics? Other applications?
>>
>
>         I am interested in writing a solver for reservoir simulation
> employing FVM and unstructured grids. My main objective is to study the
> performance of the code with different data structures/data layouts and
> architecture-specific optimizations, specifically targeting multicore
> architectures like KNL. Later the study may be extended to include GPUs.
> The options for switching between AIJ and BAIJ, etc., are therefore very
> useful for my study.
>
>         The reason I wanted to change the preconditioner is that the
> default preconditioner gives me a different iteration count for different
> numbers of processors. I would prefer a preconditioner that gives the same
> iteration count for any processor count, so that I can better compare the
> performance results.
>
>         Your suggestions in this regard are highly appreciated,
> specifically with reference to the following points:
>
>          - Is it possible to explicitly use high-bandwidth memory in PETSc
> for selected object placement (e.g. using the memkind library)?
>
>
> Yes, see http://www.mcs.anl.gov/petsc/petsc-3.8/src/sys/memory/mhbw.c.html
>
> This was developed to use memkind for adjoint checkpointing, where I want
> to use HBW memory for computation and DRAM for storing checkpoints, but it
> can be used for your purpose as well.
>
> When configuring PETSc, use "--with-memkind-dir=" to specify the location
> of the memkind library. The runtime option "-malloc_hbw" allocates all
> PETSc objects in HBW memory. If HBW memory runs out, allocation falls back
> to DRAM.
>
> If you want to place selected objects in DRAM, you can do
>
> PetscMallocSetDRAM()    /* subsequent PETSc allocations go to DRAM */
> ...  allocate your objects ...
> PetscMallocResetDRAM()  /* return to HBW allocation */
>
> An example usage can be found at
> http://www.mcs.anl.gov/petsc/petsc-dev/src/ts/trajectory/impls/memory/trajmemory.c
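>
> A minimal runnable sketch of that pattern (assuming PETSc was configured
> with memkind and run with -malloc_hbw; the checkpoint vector and its size
> are illustrative):
>
>   Vec            chkpt;
>   PetscErrorCode ierr;
>   ierr = PetscMallocSetDRAM();CHKERRQ(ierr);   /* route allocations to DRAM */
>   ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1000000,&chkpt);CHKERRQ(ierr);
>   ierr = PetscMallocResetDRAM();CHKERRQ(ierr); /* back to HBW (MCDRAM) */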
>
>
>          - What would it take to take advantage of architecture-specific
> compiler flags to achieve good performance on a given platform (e.g.
> -xMIC-AVX512 for AVX512 on KNL, #pragma simd, etc.)?
>
>
> To build PETSc on KNL with AVX512 enabled, see the example scripts
> config/examples/arch-linux-knl.py
> config/examples/arch-cray-xc40-knl-opt.py
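>
> A typical set of options along the lines of those scripts (a sketch; the
> exact flags vary by compiler, so check the scripts themselves):
>
>   --with-memalign=64 --with-avx512-kernels=1 \
>   COPTFLAGS='-g -O3 -xMIC-AVX512' CXXOPTFLAGS='-g -O3 -xMIC-AVX512'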
>
> Note that the MatMult kernel (for AIJ and SELL) has been manually
> optimized for best performance.
>
> Hong (Mr.)
>
>        Sorry for some very basic questions, as I am a novice PETSc user.
>
>      Thanks for your time :)
>
>>
>>      Anyway, you probably want to make your code able to switch
>> between AIJ and BAIJ at run time, since the different formats support
>> somewhat different solvers. If your code calls MatSetFromOptions, then you
>> can switch via the command-line option -mat_type aij or -mat_type baij, as
>> in the sketch below.
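>>
>>      A minimal sketch of that pattern (the global size N and block size bs
>> are illustrative):
>>
>>        Mat            A;
>>        PetscErrorCode ierr;
>>        PetscInt       N = 1000, bs = 3;            /* illustrative sizes */
>>        ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
>>        ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
>>        ierr = MatSetBlockSize(A,bs);CHKERRQ(ierr); /* used when -mat_type baij */
>>        ierr = MatSetFromOptions(A);CHKERRQ(ierr);  /* reads -mat_type */
>>        ierr = MatSetUp(A);CHKERRQ(ierr);
>>        /* ... MatSetValues()/MatAssemblyBegin()/MatAssemblyEnd() as usual ... */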
>>
>>    Barry
>>
>> >
>> > Thanks
>> >
>> > On Mon, May 28, 2018 at 8:23 PM, Satish Balay <balay at mcs.anl.gov>
>> wrote:
>> > On Mon, 28 May 2018, Najeeb Ahmad wrote:
>> >
>> > > Hi All,
>> > >
>> > > I have Petsc release version 3.9.2 configured with the following
>> options:
>> > >
>> > > Configure options --with-cc=mpiicc --with-cxx=mpiicpc
>> --with-fc=mpiifort
>> > > --download-fblaslapack=1
>> > >
>> > > Now I want to use PCILU in my code and when I set the PC type to
>> PCILU in
>> > > the code, I get the following error:
>> > >
>> > > [0]PETSC ERROR: --------------------- Error Message
>> > > --------------------------------------------------------------
>> > > [0]PETSC ERROR: See
>> > > http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for
>> > > possible LU and Cholesky solvers
>> > > [0]PETSC ERROR: Could not locate a solver package. Perhaps you must
>> > > ./configure with --download-<package>
>> > > [0]PETSC ERROR: See
>> > > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> > > [0]PETSC ERROR: Petsc Release Version 3.9.2, unknown
>> > > [0]PETSC ERROR: ./main on a arch-linux2-c-debug named Karachi by
>> nahmad Mon
>> > > May 28 17:52:41 2018
>> > > [0]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc
>> > > --with-fc=mpiifort --download-fblaslapack=1
>> > > [0]PETSC ERROR: #1 MatGetFactor() line 4318 in
>> > > /home/nahmad/PETSc/petsc/src/mat/interface/matrix.c
>> > > [0]PETSC ERROR: #2 PCSetUp_ILU() line 142 in
>> > > /home/nahmad/PETSc/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
>> > > [0]PETSC ERROR: #3 PCSetUp() line 923 in
>> > > /home/nahmad/PETSc/petsc/src/ksp/pc/interface/precon.c
>> > > [0]PETSC ERROR: #4 KSPSetUp() line 381 in
>> > > /home/nahmad/PETSc/petsc/src/ksp/ksp/interface/itfunc.c
>> > > [0]PETSC ERROR: #5 KSPSolve() line 612 in
>> > > /home/nahmad/PETSc/petsc/src/ksp/ksp/interface/itfunc.c
>> > > [0]PETSC ERROR: #6 SolveSystem() line 60 in
>> > > /home/nahmad/Aramco/petsc/petsc/BlockSolveTest/src/main.c
>> > >
>> > >
>> > > I assume that I am missing an LU package like SuperLU_dist and I
>> > > need to download and configure it with PETSc.
>> >
>> > Yes - PETSc has sequential LU - but you need SuperLU_dist or MUMPS for
>> parallel LU.
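>> >
>> > Once one of those packages is installed, parallel LU can be selected at
>> > run time with, e.g., -pc_type lu -pc_factor_mat_solver_type superlu_dist
>> > (in PETSc 3.9; older releases spell it -pc_factor_mat_solver_package).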
>> >
>> > >
>> > > I am wondering what the best way is to reconfigure PETSc to download
>> > > and use the appropriate package to support PCILU?
>> >
>> > Rerun configure with the additional option --download-superlu_dist=1.
>> >
>> > You can do this with the current PETSC_ARCH you are using [i.e.
>> > reinstall over the current build] - or use a different PETSC_ARCH - so
>> > that both builds exist and are usable; see the example below.
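>> >
>> > For example, rerunning configure with your current options plus the new
>> > package (the PETSC_ARCH name is just an example; SuperLU_dist also needs
>> > METIS/ParMETIS, which configure can download):
>> >
>> >   ./configure PETSC_ARCH=arch-superlu --with-cc=mpiicc \
>> >     --with-cxx=mpiicpc --with-fc=mpiifort --download-fblaslapack=1 \
>> >     --download-metis --download-parmetis --download-superlu_dist=1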
>> >
>> > Satish
>> >
>> > >
>> > > Your advice is highly appreciated.
>> > >
>> > >
>> >
>> >
>> >
>> >
>> > --
>> > Najeeb Ahmad
>> >
>> > Research and Teaching Assistant
>> > PARallel and MultiCORE Computing Laboratory (ParCoreLab)
>> > Computer Science and Engineering
>> > Koç University, Istanbul, Turkey
>> >
>>
>>
>
>
> --
> Najeeb Ahmad
>
> Research and Teaching Assistant
> PARallel and MultiCORE Computing Laboratory (ParCoreLab)
> Computer Science and Engineering
> Koç University, Istanbul, Turkey
>
>
>


-- 
Najeeb Ahmad

Research and Teaching Assistant
PARallel and MultiCORE Computing Laboratory (ParCoreLab)
Computer Science and Engineering
Koç University, Istanbul, Turkey