[petsc-users] ML options
Sanjay Govindjee
s_g at berkeley.edu
Thu Jan 10 21:08:29 CST 2013
It was version 3.3-p5.
A fix was backported by Jed:
https://bitbucket.org/petsc/petsc-3.3/commits/93bbec421cbaa0b3efc445fb992fecd53db60b61
On 1/10/13 5:26 PM, Tobin Isaac wrote:
> On Mon, Jan 07, 2013 at 09:36:05AM -0600, Jed Brown wrote:
>> On Mon, Jan 7, 2013 at 9:09 AM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>>
>>> ex56 is a simple 3D elasticity problem. There is a runex56 target that
>>> uses GAMG and a runex56_ml target that uses ML. These have generic
>>> parameters, and both ML and GAMG work well.
>>>
>>> The eigenvalue estimates could be bad, and that can kill convergence. I've
>>> found that CG converges to the largest eigenvalue faster than the default
>>> GMRES, so I use:
>>>
>>> -gamg_est_ksp_max_it 10   # this is the default; you could increase it to test
>>> -gamg_est_ksp_type cg
>>>
>>> Jed could tell you how to set this for ML.
>>>
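
For reference, a sketch of a run line combining the estimate options above
with the GAMG options mentioned further down the thread. The MPI launcher,
process count, and program name are placeholders copied from the run lines
quoted below, not something tested here:

-@${MPIEXEC} -n $(NPROC) $(MY_PROGRAM) -ksp_type cg -ksp_monitor -ksp_view \
    -pc_type gamg -pc_gamg_agg_nsmooths 1 \
    -gamg_est_ksp_type cg -gamg_est_ksp_max_it 10
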
>> ML isn't using eigenvalue estimation (it doesn't expose that algorithm).
>> Sanjay is using the default smoother (Richardson + SOR) rather than
>> chebyshev/pbjacobi.
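
A sketch (untested here) of switching ML's smoother to the chebyshev/pbjacobi
combination mentioned above, reusing the -mg_levels_* option pattern that
appears in the GAMG line quoted further down:

-pc_type ml -mg_levels_ksp_type chebyshev -mg_levels_pc_type pbjacobi
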
> Which version of PETSc is being used? I submitted a bug fix for
> Richardson + SOR as a smoother with inodes, but I don't know which
> versions of PETSc have integrated it. It could be the same bug.
>
>>
>>>
>>>
>>> On Jan 7, 2013, at 8:49 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>>>
>>> Could we get an example matrix exhibiting this behavior? If you run with
>>> -ksp_view_binary, the solver will write out the matrix to a file called
>>> 'binaryoutput' (and 'binaryoutput.info') when KSPSolve() returns. I
>>> suppose there could be a "math" reason, with the inodes somehow causing an
>>> incorrect near-null space to be passed to ML, but the interface is not
>>> supposed to work that way. If you are serious about smoothed aggregation
>>> for elasticity, you should use MatSetNearNullSpace() to provide the rigid
>>> body modes.
>>>
>>> As a related matter, does -pc_type gamg -pc_gamg_agg_nsmooths 1
>>> -mg_levels_ksp_type richardson -mg_levels_pc_type sor converge well?
>>>
>>>
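
For illustration, a minimal C sketch of providing the rigid body modes through
MatSetNearNullSpace(). The function name AttachRigidBodyModes and the
coordinate Vec are assumptions for the sketch, not something from this thread:

#include <petscmat.h>

/* Minimal sketch: attach rigid-body modes as the near-null space of the
 * assembled elasticity operator A.  'coords' is assumed to be a Vec of
 * nodal coordinates with block size equal to the spatial dimension.    */
PetscErrorCode AttachRigidBodyModes(Mat A, Vec coords)
{
  MatNullSpace   nearnull;
  PetscErrorCode ierr;

  ierr = MatNullSpaceCreateRigidBody(coords, &nearnull);CHKERRQ(ierr);
  ierr = MatSetNearNullSpace(A, nearnull);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);
  return 0;
}

MatNullSpaceCreateRigidBody() builds the translational and rotational modes
from the nodal coordinates; once attached to the matrix, the near-null space
is what smoothed aggregation (ML or GAMG) should be given for elasticity.
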
>>> On Mon, Jan 7, 2013 at 12:55 AM, Sanjay Govindjee <s_g at berkeley.edu> wrote:
>>>
>>>> I am adding ML as an option to our FEA code and was looking for a bit of
>>>> guidance on options. Generally we solve 1D, 2D, and 3D solids problems
>>>> (nonlinear elasticity), but we also treat shells, thermal problems,
>>>> coupled problems, etc.
>>>>
>>>> My basic run line looks like:
>>>>
>>>> -@${MPIEXEC} -n $(NPROC) $(MY_PROGRAM) -ksp_type cg -ksp_monitor -pc_type
>>>> ml -log_summary -ksp_view -options_left
>>>>
>>>> but this does not work well at all with 3D elasticity, for example --
>>>> in fact it fails to converge after 10K iterations on a rather
>>>> modest problem. However, following ex26 in the KSP tutorials, I also tried:
>>>>
>>>> -@${MPIEXEC} -n $(NPROC) $(FEAPRUN) -ksp_type cg -ksp_monitor -pc_type ml
>>>> -mat_no_inode -log_summary -ksp_view -options_left
>>>>
>>>> And this worked very much better -- it converged in about 10
>>>> iterations. What exactly is -mat_no_inode doing for me? And are there
>>>> other 'important' options that I should be aware of when using ML?
>>>>
>>>> -sanjay
>>>>
>>>
>>>
--
-----------------------------------------------
Sanjay Govindjee, PhD, PE
Professor of Civil Engineering
Vice Chair for Academic Affairs
779 Davis Hall
Structural Engineering, Mechanics and Materials
Department of Civil Engineering
University of California
Berkeley, CA 94720-1710
Voice: +1 510 642 6060
FAX: +1 510 643 5264
s_g at berkeley.edu
http://www.ce.berkeley.edu/~sanjay
-----------------------------------------------