[petsc-users] Error configuring PETSc with SUNDIALS (to solve stiff PDE)

namu patel namu.patel7 at gmail.com
Sun Nov 29 14:19:02 CST 2015


On Sun, Nov 29, 2015 at 2:09 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>
>
> Namu, Can you try using:
>
> --with-mpi-dir=/Users/namupatel/Softwares/OpenMPI/1.10.0
>
> [in place of options:
> "--CC=/Users/namupatel/Softwares/OpenMPI/1.10.0/bin/mpicc
> --CXX=/Users/namupatel/Softwares/OpenMPI/1.10.0/bin/mpicxx
> --FC=/Users/namupatel/Softwares/OpenMPI/1.10.0/bin/mpif90" ]


This works, thanks.
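
For reference, the full configure line is then roughly as below (the
PETSC_ARCH name and --download-sundials are inferred from the paths in my
log, so adjust as needed):

  ./configure PETSC_ARCH=linux-dbg \
      --with-mpi-dir=/Users/namupatel/Softwares/OpenMPI/1.10.0 \
      --download-sundials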

On Sun, Nov 29, 2015 at 2:09 PM, Satish Balay <balay at mcs.anl.gov> wrote:

> Ok - I see the issue now.
>
> We are using the --with-mpi-root option with sundials.
>
> However - configure is not able to determine this when configured
> with CC=/path/to/mpicc
>
> sundials.py is already complicated enough - just to work around this
> mpi/non-mpi compiler issue of sundials.
>
> But perhaps we should always add --with-mpicc - as you suggest
> [and get rid of the --with-mpi-root stuff? - I'm not sure if that would work]
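>
> [For context: with --with-mpi-root, sundials' own configure gets invoked
> along the lines of
>   ./configure --with-mpi-root=/Users/namupatel/Softwares/OpenMPI/1.10.0
> - option name per the sundials help text quoted below; the exact path
> here is just illustrative.]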
>
> Namu, Can you try using:
>
> --with-mpi-dir=/Users/namupatel/Softwares/OpenMPI/1.10.0
>
> [in place of options:
> "--CC=/Users/namupatel/Softwares/OpenMPI/1.10.0/bin/mpicc
> --CXX=/Users/namupatel/Softwares/OpenMPI/1.10.0/bin/mpicxx
> --FC=/Users/namupatel/Softwares/OpenMPI/1.10.0/bin/mpif90" ]
>
> Satish
>
> On Sun, 29 Nov 2015, Matthew Knepley wrote:
>
> > On Sun, Nov 29, 2015 at 11:50 AM, namu patel <namu.patel7 at gmail.com> wrote:
> >
> > > Attached is my configuration log file.
> > >
> >
> > Satish, the idiotic Sundials configure cannot find mpicc even when it's
> > passed in as the C compiler:
> >
> > MPI-C Settings
> > --------------
> > checking if using MPI-C script... yes
> > checking if absolute path to mpicc was given... no
> > checking for mpicc... none
> >    Unable to find a functional MPI-C compiler.
> >    Try using --with-mpicc to specify a MPI-C compiler script,
> >    --with-mpi-incdir, --with-mpi-libdir and --with-mpi-libs
> >    to specify the locations of all relevant MPI files, or
> >    --with-mpi-root to specify the base installation directory
> >    of the MPI implementation to be used.
> >    Disabling the parallel NVECTOR module and all parallel examples...
> >
> > Do we have to give --with-mpicc as an argument?
> >
> >   Thanks,
> >
> >      Matt
> >
> >
> > > Thanks,
> > > Namu
> > >
> > > On Sun, Nov 29, 2015 at 11:48 AM, Matthew Knepley <knepley at gmail.com>
> > > wrote:
> > >
> > >> You must send configure.log
> > >>
> > >>   Thanks,
> > >>
> > >>     Matt
> > >>
> > >> On Sun, Nov 29, 2015 at 11:42 AM, namu patel <namu.patel7 at gmail.com>
> > >> wrote:
> > >>
> > >>> Hello All,
> > >>>
> > >>> I was trying to configure PETSc with SUNDIALS so that I may use
> > >>> PVODE to solve a stiff hyperbolic PDE of the form
> > >>>
> > >>> A(x) u_tt = K [B(x) u_x - F(x, u, u_x, t)]_x ,  t > 0,  0 < x < L ,
> > >>> u_x(0, t) = u_x(L, t) = 0 ,
> > >>> u(x, 0) = u_t(x, 0) = 0 ,
> > >>>
> > >>> where K >> 1. I was reading around to see what may be a good
> > >>> numerical implementation for such a problem, and it can be tricky
> > >>> here because the stiffness is both in the linear part and the
> > >>> nonlinear forcing term.
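> > >>>
> > >>> (Since PVODE/CVODE integrates first-order systems y' = f(t, y), my
> > >>> plan is to first reduce the equation to first order by introducing
> > >>> v = u_t,
> > >>>
> > >>> u_t = v ,
> > >>> A(x) v_t = K [B(x) u_x - F(x, u, u_x, t)]_x ,
> > >>>
> > >>> and then discretize in x, i.e. a method-of-lines formulation.)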
> > >>>
> > >>> I want to try PVODE available in the SUNDIALS package. When I try to
> > >>> configure PETSc with SUNDIALS, I get the message:
> > >>>
> > >>> Downloaded sundials could not be used. Please check install in
> > >>> /Users/namupatel/Softwares/PETSc/3.6.2/linux-dbg
> > >>>
> > >>>
> > >>>
> > >>> *******************************************************************************
> > >>>
> > >>>   File "./config/configure.py", line 363, in petsc_configure
> > >>>     framework.configure(out = sys.stdout)
> > >>>   File "/Users/namupatel/Softwares/PETSc/3.6.2/config/BuildSystem/config/framework.py", line 1081, in configure
> > >>>     self.processChildren()
> > >>>   File "/Users/namupatel/Softwares/PETSc/3.6.2/config/BuildSystem/config/framework.py", line 1070, in processChildren
> > >>>     self.serialEvaluation(self.childGraph)
> > >>>   File "/Users/namupatel/Softwares/PETSc/3.6.2/config/BuildSystem/config/framework.py", line 1051, in serialEvaluation
> > >>>     child.configure()
> > >>>   File "/Users/namupatel/Softwares/PETSc/3.6.2/config/BuildSystem/config/package.py", line 677, in configure
> > >>>     self.executeTest(self.configureLibrary)
> > >>>   File "/Users/namupatel/Softwares/PETSc/3.6.2/config/BuildSystem/config/base.py", line 126, in executeTest
> > >>>     ret = test(*args,**kargs)
> > >>>   File "/Users/namupatel/Softwares/PETSc/3.6.2/config/BuildSystem/config/package.py", line 592, in configureLibrary
> > >>>     for location, directory, lib, incl in self.generateGuesses():
> > >>>   File "/Users/namupatel/Softwares/PETSc/3.6.2/config/BuildSystem/config/package.py", line 332, in generateGuesses
> > >>>     raise RuntimeError('Downloaded '+self.package+' could not be used. Please check install in '+d+'\n')
> > >>>
> > >>> Two questions:
> > >>>
> > >>> 1. How can I resolve the above error?
> > >>>
> > >>> 2. Are there any recommendations for solving the stiff PDE stated
> > >>> above, so that I can experiment to see what may be an efficient
> > >>> implementation?
> > >>>
> > >>> Thank you,
> > >>>
> > >>> Namu
> > >>>
> > >>>
> > >>>
> > >>
> > >>
> > >> --
> > >> What most experimenters take for granted before they begin their
> > >> experiments is infinitely more interesting than any results to which
> > >> their experiments lead.
> > >> -- Norbert Wiener
> > >>
> > >
> > >
> >
> >
> >
>
>