[petsc-dev] Tao tron and ksp methods
Jason Sarich
jason.sarich at gmail.com
Thu Mar 13 11:50:28 CDT 2014
Hi Corrado,
The TRON solver uses STCG as its default KSP type so that it can get
trust-region information back from the KSP. However, you are correct that it
should at least be able to run with a different KSP without bailing out. I
will get this fixed.
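
In the meantime, here is a minimal, untested sketch of how the inner KSP can
be kept on (or restored to) STCG through the public Tao/KSP interface. It
assumes TaoGetKSP from the merged Tao API, and the jbearing2-style
objective/gradient/Hessian and bound setup is left out, so no TaoSolve is
called here:

/* Minimal sketch: keep TRON's inner KSP on the trust-region STCG solver.
 * The jbearing2-style problem setup (objective, gradient, Hessian, bounds)
 * is omitted, so TaoSolve is not called. */
#include <petsctao.h>

int main(int argc, char **argv)
{
  Tao            tao;
  KSP            ksp;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
  ierr = TaoCreate(PETSC_COMM_WORLD, &tao);CHKERRQ(ierr);
  ierr = TaoSetType(tao, TAOTRON);CHKERRQ(ierr);   /* i.e. -tao_type tron */
  ierr = TaoSetFromOptions(tao);CHKERRQ(ierr);     /* pick up -tao_* options */

  /* TRON's trust-region update needs a KSP that accepts a radius
     (KSPSTCGSetRadius), so keep or restore the STCG default instead of,
     say, GMRES. */
  ierr = TaoGetKSP(tao, &ksp);CHKERRQ(ierr);
  if (ksp) {ierr = KSPSetType(ksp, KSPSTCG);CHKERRQ(ierr);}

  /* With the callbacks from jbearing2.c registered, TaoSolve(tao) would
     go here. */
  ierr = TaoDestroy(&tao);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}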
Jason
On Thu, Mar 13, 2014 at 11:17 AM, Corrado Maurini
<corrado.maurini at upmc.fr> wrote:
> Hi,
>
> Running in petsc/src/tao/bound/examples/tutorials
>
> ./jbearing2 -tao_view -tao_type tron -ksp_type gmres
>
> I get the error below. With TAO 2.2 and PETSc 3.4 and the same options, I
> did not get any error.
>
> Now the TAO TRON method seems to work only with STCG as the KSP. Is there
> a reason for this?
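>
> For reference, the same example runs when the (default) STCG KSP is kept,
> e.g. spelled explicitly as
>
> ./jbearing2 -tao_view -tao_type tron -ksp_type stcg
>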
>
> Corrado
>
> ------
>
> ./jbearing2 -tao_view -tao_type tron -ksp_type gmres
>
> ---- Journal Bearing Problem SHB-----
> mx: 50, my: 50, ecc: 0.1
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: No support for this operation for this object type
> [0]PETSC ERROR: Cannot locate function KSPSTCGSetRadius_C in object
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.4.3-3575-gf3e7060 GIT
> Date: 2014-03-11 19:24:46 -0500
> [0]PETSC ERROR: ./jbearing2 on a arch-darwin-cxx-opt named muspratt by
> Maurini Thu Mar 13 17:16:30 2014
> [0]PETSC ERROR: Configure options --download-blacs=1 --download-hypre=1
> --download-metis=1 --download-ml=1 --download-mumps=1 --download-parmetis=1
> --download-superlu_dist=1 --download-umfpack=1 --with-c-support=1
> --with-clanguage=cxx --with-debugging=0 --with-scalapack-dir=/usr/local/
> --with-scotch-dir=/usr/local/ --with-shared-libraries=1
> --with-x11=1 COPTFLAGS=-O2 PETSC_ARCH=arch-darwin-cxx-opt
> [0]PETSC ERROR: #1 KSPSTCGSetRadius() line 37 in
> /opt/HPC/petsc/src/ksp/ksp/impls/cg/stcg/stcg.c
> [0]PETSC ERROR: #2 TaoSolve_TRON() line 170 in
> /opt/HPC/petsc/src/tao/bound/impls/tron/tron.c
> [0]PETSC ERROR: #3 TaoSolve() line 192 in
> /opt/HPC/petsc/src/tao/interface/taosolver.c
> [0]PETSC ERROR: #4 main() line 169 in
> /opt/HPC/petsc/src/tao/bound/examples/tutorials/jbearing2.c
> [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 56.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>