[petsc-users] Re-configuring PETSc

Smith, Barry F. bsmith at mcs.anl.gov
Tue May 29 14:29:41 CDT 2018



> On May 29, 2018, at 3:05 AM, Najeeb Ahmad <nahmad16 at ku.edu.tr> wrote:
>
>
>
> On Mon, May 28, 2018 at 9:32 PM, Smith, Barry F. <bsmith at mcs.anl.gov> wrote:
>
>
> > On May 28, 2018, at 10:32 AM, Najeeb Ahmad <nahmad16 at ku.edu.tr> wrote:
> >
> > Thanks a lot Satish for your prompt reply.
> >
> > I just checked that the SuperLU_dist package works only for matrices of type aij and uses the lu preconditioner. I am currently working with a baij matrix. What is the best preconditioner choice for baij matrices on parallel machines?
>
>     The best preconditioner is always problem specific. Where does your problem come from? CFD? Structural mechanics? other apps?
>
>         I am interested in writing a solver for reservoir simulation employing FVM and unstructured grids. My main objective is to study the performance of the code with different data structures/data layouts and architecture-specific optimizations, specifically targeting manycore architectures such as KNL. Later the study may be extended to include GPUs. The options for switching between AIJ, BAIJ, etc. are therefore very useful for my study.
>
>         The reason I wanted to change the preconditioner is that the default preconditioner gives me a different iteration count for different numbers of processors. I would prefer a preconditioner that gives the same iteration count for any processor count so that I can better compare the performance results.

    Pretty much all preconditioners will give at least slightly different iteration counts for different numbers of processes. This must be taken into account when evaluating the "performance" of the solver and its implementation.


   You can try -pc_type gamg or -pc_type hypre (the latter also requires running ./configure with the additional option --download-hypre).
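
   For illustration, a minimal sketch of a solve that picks the preconditioner up from the options database, assuming a Mat A and Vecs b, x that are already assembled (the names are placeholders, not taken from your code):

       /* assumes #include <petscksp.h> and PetscInitialize() earlier in the program */
       KSP            ksp;
       PetscErrorCode ierr;

       ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
       ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
       ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* honors -ksp_type, -pc_type, and related options */
       ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
       ierr = KSPDestroy(&ksp);CHKERRQ(ierr);

   Then, e.g., mpiexec -n 4 ./main -pc_type gamg or -pc_type hypre (the latter only after reconfiguring with --download-hypre) switches the preconditioner without recompiling.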

>
>         Your suggestions in this regard are highly appreciated, specifically with reference to the following points:
>
>          - Is it possible to explicitly use high bandwidth memory in PETSc for selected object placement (e.g. using the memkind library)?

    We've found that placing some objects in high-bandwidth memory and some in regular memory is a painful and thankless task. You should just run everything in high-bandwidth memory; otherwise on KNL you will get really crappy performance.
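
    For example, on a KNL node booted in flat mode the MCDRAM typically appears as a separate NUMA node, so one common approach (assuming that setup, which is not stated in this thread) is to bind every rank to it, e.g. mpiexec -n 64 numactl --membind=1 ./main.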

>          - What would it take to take advantage of architecture-specific compiler flags to achieve good performance on a given platform (e.g. -xMIC-AVX512 for AVX-512 on KNL, #pragma simd, etc.)?
>
     We've found that just using these as compile-time flags doesn't help much (that is, the compiler is not smart enough to really take advantage of the vectorization on its own).
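
     For completeness: such flags are normally passed at configure time, e.g. adding COPTFLAGS='-O3 -xMIC-AVX512' (and similarly CXXOPTFLAGS and FOPTFLAGS) to the ./configure line; the exact flags depend on the compiler, and as noted above they are not sufficient by themselves.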

     Please see the attached paper for a bunch of discussions about performance and KNL.

    Barry

>        Sorry for some very basic questions as I am a novice PETSc user.
>
>      Thanks for your time :)
>
>      Anyway, you probably want to make your code able to switch between AIJ and BAIJ at run time, since the different formats support somewhat different solvers. If your code calls MatSetFromOptions then you can switch via the command line option -mat_type aij or -mat_type baij.
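
     For illustration, a minimal sketch of creating the matrix so that the format is a run-time choice (n and bs are placeholders for the global size and block size, not values from your code):

         /* assumes #include <petscmat.h> and PetscInitialize() earlier in the program */
         Mat            A;
         PetscErrorCode ierr;

         ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
         ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
         ierr = MatSetBlockSize(A, bs);CHKERRQ(ierr);   /* bs = unknowns per cell; needed for BAIJ */
         ierr = MatSetFromOptions(A);CHKERRQ(ierr);     /* honors -mat_type aij or -mat_type baij */
         ierr = MatSetUp(A);CHKERRQ(ierr);
         /* ... MatSetValues()/MatSetValuesBlocked() and MatAssemblyBegin()/MatAssemblyEnd() as usual ... */

     so that mpiexec -n 4 ./main -mat_type baij and the same run with -mat_type aij exercise both formats without recompiling.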
>
>    Barry
>
> >
> > Thanks
> >
> > On Mon, May 28, 2018 at 8:23 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> > On Mon, 28 May 2018, Najeeb Ahmad wrote:
> >
> > > Hi All,
> > >
> > > I have PETSc release version 3.9.2 configured with the following options:
> > >
> > > Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort
> > > --download-fblaslapack=1
> > >
> > > Now I want to use PCILU in my code and when I set the PC type to PCILU in
> > > the code, I get the following error:
> > >
> > > [0]PETSC ERROR: --------------------- Error Message
> > > --------------------------------------------------------------
> > > [0]PETSC ERROR: See
> > > http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for
> > > possible LU and Cholesky solvers
> > > [0]PETSC ERROR: Could not locate a solver package. Perhaps you must
> > > ./configure with --download-<package>
> > > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
> > > trouble shooting.
> > > [0]PETSC ERROR: Petsc Release Version 3.9.2, unknown
> > > [0]PETSC ERROR: ./main on a arch-linux2-c-debug named Karachi by nahmad Mon
> > > May 28 17:52:41 2018
> > > [0]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc
> > > --with-fc=mpiifort --download-fblaslapack=1
> > > [0]PETSC ERROR: #1 MatGetFactor() line 4318 in
> > > /home/nahmad/PETSc/petsc/src/mat/interface/matrix.c
> > > [0]PETSC ERROR: #2 PCSetUp_ILU() line 142 in
> > > /home/nahmad/PETSc/petsc/src/ksp/pc/impls/factor/ilu/ilu.c
> > > [0]PETSC ERROR: #3 PCSetUp() line 923 in
> > > /home/nahmad/PETSc/petsc/src/ksp/pc/interface/precon.c
> > > [0]PETSC ERROR: #4 KSPSetUp() line 381 in
> > > /home/nahmad/PETSc/petsc/src/ksp/ksp/interface/itfunc.c
> > > [0]PETSC ERROR: #5 KSPSolve() line 612 in
> > > /home/nahmad/PETSc/petsc/src/ksp/ksp/interface/itfunc.c
> > > [0]PETSC ERROR: #6 SolveSystem() line 60 in
> > > /home/nahmad/Aramco/petsc/petsc/BlockSolveTest/src/main.c
> > >
> > >
> > > I assume that I am missing an LU package like SuperLU_dist and that I need to download and configure it with PETSc.
> >
> > Yes - PETSc has a sequential LU - but you need SuperLU_DIST or MUMPS for parallel LU.
> >
> > >
> > > I am wondering what is the best way to reconfigure PETSc to download and use the appropriate package to support PCILU?
> >
> > Rerun configure with the additional option --download-superlu_dist=1.
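
   Once that build is in place, a sketch of selecting the parallel LU from code (in PETSc 3.9 these are spelled MatSolverType / -pc_factor_mat_solver_type, while older releases used ...Package; A, b, x are placeholders):

       /* assumes #include <petscksp.h> and an assembled Mat A and Vecs b, x */
       KSP            ksp;
       PC             pc;
       PetscErrorCode ierr;

       ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
       ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
       ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
       ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
       ierr = PCFactorSetMatSolverType(pc, MATSOLVERSUPERLU_DIST);CHKERRQ(ierr);
       ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
       ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

   or equivalently from the command line with -pc_type lu -pc_factor_mat_solver_type superlu_dist.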
> >
> > You can do this with the current PETSC_ARCH you are using [i.e. reinstall
> > over the current build] - or use a different PETSC_ARCH - so that both
> > builds exist and are usable.
> >
> > Satish
> >
> > >
> > > Your advice is highly appreciated.
> > >
> > >
> >
> >
> >
> >
> > --
> > Najeeb Ahmad
> >
> > Research and Teaching Assistant
> > PARallel and MultiCORE Computing Laboratory (ParCoreLab)
> > Computer Science and Engineering
> > Koç University, Istanbul, Turkey
> >
>
>
>
>
> --
> Najeeb Ahmad
>
> Research and Teaching Assistant
> PARallel and MultiCORE Computing Laboratory (ParCoreLab)
> Computer Science and Engineering
> Koç University, Istanbul, Turkey
>

-------------- next part --------------
A non-text attachment was scrubbed...
Name: paper.pdf
Type: application/pdf
Size: 2277996 bytes
Desc: paper.pdf
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20180529/e98050bc/attachment-0001.pdf>

