[petsc-users] ML options
Jed Brown
jedbrown at mcs.anl.gov
Tue Jan 8 08:07:36 CST 2013
On Mon, Jan 7, 2013 at 4:23 PM, Mark F. Adams <mark.adams at columbia.edu> wrote:
> '-pc_ml_reuse_interpolation true' does seem to get ML to reuse some mesh
> setup. The setup time goes from 0.3 s to 0.1 s on one of my tests from the
> first to the second solve.
>
> Hong: this looks like a way to infer ML's RAP times. I think the second
> solves are just redoing the RAP with this flag, like what GAMG does by
> default.
>
Huh, it's changing the sparsity of the coarse grid.
$ ./ex15 -da_grid_x 20 -da_grid_y 20 -p 1.2 -ksp_converged_reason -pc_type ml -pc_ml_reuse_interpolation
Linear solve converged due to CONVERGED_RTOL iterations 6
[0]PETSC ERROR: --------------------- Error Message
------------------------------------
[0]PETSC ERROR: Argument out of range!
[0]PETSC ERROR: New nonzero at (0,8) caused a malloc!
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development HG revision:
cb29460836d903f43276d687c4ba9f5917bf6651 HG Date: Sun Jan 06 14:49:14 2013
-0600
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: ./ex15 on a mpich named batura by jed Tue Jan 8 08:06:37
2013
[0]PETSC ERROR: Libraries linked from /home/jed/petsc/mpich/lib
[0]PETSC ERROR: Configure run at Sun Jan 6 15:19:56 2013
[0]PETSC ERROR: Configure options --download-ams --download-blacs
--download-chaco --download-generator --download-hypre --download-ml
--download-spai --download-spooles --download-sundials --download-superlu
--download-superlu_dist --download-triangle --with-blas-lapack=/usr
--with-c2html --with-cholmod-dir=/usr
--with-clique-dir=/home/jed/usr/clique-mpich
--with-elemental-dir=/home/jed/usr/clique-mpich --with-exodusii-dir=/usr
--with-hdf5-dir=/opt/mpich --with-lgrind
--with-metis-dir=/home/jed/usr/clique-mpich --with-mpi-dir=/opt/mpich
--with-netcdf-dir=/usr --with-openmp
--with-parmetis-dir=/home/jed/usr/clique-mpich --with-pcbddc
--with-pthreadclasses --with-shared-libraries --with-single-library=0
--with-sowing --with-threadcomm --with-umfpack-dir=/usr --with-x
-PETSC_ARCH=mpich
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: MatSetValues_SeqAIJ() line 352 in
/home/jed/petsc/src/mat/impls/aij/seq/aij.c
[0]PETSC ERROR: MatSetValues() line 1083 in
/home/jed/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: MatWrapML_SeqAIJ() line 348 in
/home/jed/petsc/src/ksp/pc/impls/ml/ml.c
[0]PETSC ERROR: PCSetUp_ML() line 639 in
/home/jed/petsc/src/ksp/pc/impls/ml/ml.c
[0]PETSC ERROR: PCSetUp() line 832 in
/home/jed/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 267 in
/home/jed/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 376 in
/home/jed/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: SNES_KSPSolve() line 4460 in
/home/jed/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: SNESSolve_NEWTONLS() line 216 in
/home/jed/petsc/src/snes/impls/ls/ls.c
[0]PETSC ERROR: SNESSolve() line 3678 in
/home/jed/petsc/src/snes/interface/snes.c
[0]PETSC ERROR: main() line 221 in src/snes/examples/tutorials/ex15.c
application called MPI_Abort(MPI_COMM_WORLD, 63) - process 0
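For anyone reading along: the "New nonzero at (0,8) caused a malloc!" message is PETSc's guard against inserting a value at a position outside an assembled AIJ matrix's preallocated nonzero pattern. When the interpolation is reused but the nonlinear problem changes the coarse operator's sparsity, the second RAP tries to write outside the frozen pattern and trips that guard. Here is a minimal Python model of that mechanism (a sketch only, not PETSc's actual AIJ implementation; the class name is made up for illustration):

```python
# Minimal model (not PETSc itself) of why reusing a matrix whose sparsity
# pattern is frozen at first assembly fails when a later operator needs an
# entry outside that pattern -- the failure mode in the traceback above.

class FrozenPatternMatrix:
    """Sparse matrix whose nonzero pattern is fixed after the first assembly."""

    def __init__(self):
        self.values = {}       # (row, col) -> value
        self.assembled = False

    def set_value(self, row, col, val):
        # After assembly, a position not already in the pattern would force
        # a reallocation; error out instead, as AIJ does under
        # MAT_NEW_NONZERO_ALLOCATION_ERR.
        if self.assembled and (row, col) not in self.values:
            raise RuntimeError(f"New nonzero at ({row},{col}) caused a malloc!")
        self.values[(row, col)] = val

    def assemble(self):
        self.assembled = True


# First solve: coarse operator assembled with some sparsity pattern.
A = FrozenPatternMatrix()
A.set_value(0, 0, 2.0)
A.set_value(0, 1, -1.0)
A.assemble()

# Second solve with reused setup: updating existing positions is fine...
A.set_value(0, 0, 2.5)

# ...but a changed coarse-grid sparsity needs an entry outside the frozen
# pattern, which raises the same complaint seen in the PETSc traceback.
try:
    A.set_value(0, 8, -0.3)
except RuntimeError as e:
    print(e)   # New nonzero at (0,8) caused a malloc!
```

The point of the model: reusing the interpolation is only safe when the coarse-grid sparsity is genuinely unchanged between solves.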