[petsc-users] Preconditioning in matrix-free methods

Konstantinos Kontzialis ckontzialis at lycos.com
Mon Feb 24 21:13:22 CST 2014


On 2/24/2014 5:19 PM, petsc-users-request at mcs.anl.gov wrote:
>
> Today's Topics:
>
>     1. Re:  Error using MUMPS to solve large linear system (Xiaoye S. Li)
>     2.  Preconditioning in matrix-free methods (Konstantinos Kontzialis)
>     3. Re:  From 1D to 3D problem ? Unstructured mesh ? (Aron Roland)
>     4. Re:  Preconditioning in matrix-free methods (Jed Brown)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 24 Feb 2014 12:58:57 -0800
> From: "Xiaoye S. Li" <xsli at lbl.gov>
> To: Hong Zhang <hzhang at mcs.anl.gov>
> Cc: "petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Error using MUMPS to solve large linear
> 	system
>
> Samar:
> If you include the error message from the superlu_dist crash, I can
> probably identify the reason. (Better yet, include the printout leading
> up to the crash.)
>
> Sherry
>
>
> On Mon, Feb 24, 2014 at 9:56 AM, Hong Zhang <hzhang at mcs.anl.gov> wrote:
>
>> Samar:
>> There are limitations to direct solvers.
>> Do not expect that any solver can be used on arbitrarily large problems.
>> Since superlu_dist also crashes, direct solvers may not be able to handle
>> your application.
>> This is why I suggested increasing the size incrementally.
>> You may have to experiment with other types of solvers.
>>
>> Hong
>>
>>   Hi Hong and Jed,
>>>   Many thanks for replying. It would indeed be nice if the error messages
>>> from MUMPS were less cryptic!
>>>
>>>   1) I have tried smaller matrices, although given how my problem is set
>>> up, a jump in size is difficult to avoid. But it is a good idea that I
>>> will try.
>>>
>>>   2) I did try various ordering but not the one you suggested.
>>>
>>>   3) Tracing the error through the MUMPS code suggests a rather abrupt
>>> termination of the program (there should be more error messages if, for
>>> example, memory were the problem). I therefore thought it might be an
>>> interface problem rather than one with MUMPS, and turned to the
>>> petsc-users group first.
>>>
>>>   4) I've tried superlu_dist, but it also crashes (it is also unclear
>>> why), at which point I decided to try MUMPS. The fact that both crash
>>> would again indicate a common (memory?) problem.
>>>
>>>   I'll try a few more things before asking the MUMPS developers.
>>>
>>>   Thanks again for your help!
>>>
>>>   Samar
>>>
>>>   On Feb 24, 2014, at 11:47 AM, Hong Zhang <hzhang at mcs.anl.gov> wrote:
>>>
>>>   Samar:
>>> The crash occurs in
>>>
>>>> ...
>>>> [161]PETSC ERROR: Error in external library!
>>>> [161]PETSC ERROR: Error reported by MUMPS in numerical factorization
>>>> phase: INFO(1)=-1, INFO(2)=48
>>>
>>> For a very large matrix, this is likely a memory problem, as you suspected.
>>> I would suggest:
>>> 1. Run problems with increasing sizes (do not jump from a small one to a
>>> very large one) and observe memory usage using '-ksp_view'.
>>>     I see you use '-mat_mumps_icntl_14 1000', i.e., the percentage
>>> increase in the estimated workspace. Is it too large?
>>>     Anyway, I would guess this input should not cause the crash.
>>> 2. Experiment with different matrix orderings via -mat_mumps_icntl_7 <>
>>> (I usually use sequential ordering 2).
>>>     I see you use the parallel ordering -mat_mumps_icntl_29 2.
>>> 3. Send a bug report to the MUMPS developers for their suggestions.
>>> 4. Try other direct solvers, e.g., superlu_dist.
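
For illustration of suggestions 1 and 2 above, a trial run at a moderate
size might look like the sketch below (the executable name './app' and the
process count are placeholders, not from this thread; option names are as
in the PETSc 3.x era of this discussion):

    mpiexec -n 8 ./app -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package mumps \
        -mat_mumps_icntl_7 2 -mat_mumps_icntl_14 50 -ksp_view

Here -mat_mumps_icntl_7 2 selects the sequential ordering Hong mentions,
-mat_mumps_icntl_14 50 requests a 50% workspace increase over the estimate
(far below the 1000 used above), and -ksp_view reports the factorization
settings so memory behavior can be watched as the problem size grows.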
>>>
>>>
>>>> ...
>>>>
>>>> etc., etc. I can tell that the above error has something to do with
>>>> processor 48 (INFO(2)), but not the previous one.
>>>>
>>>> The full output, enabled with -mat_mumps_icntl_4 3, is in the attached
>>>> file. Any hints as to what could be causing this error would be very
>>>> much appreciated.
>>>>
>>> I do not know how to interpret this output file. The MUMPS developers
>>> would give you better suggestions on it.
>>> I would appreciate learning as well :-)
>>>
>>>   Hong
>>>
>>>
>>>
>
> ------------------------------
>
> Message: 2
> Date: Mon, 24 Feb 2014 22:10:11 +0000 (UTC)
> From: Konstantinos Kontzialis <ckontzialis at lycos.com>
> To: petsc-users at mcs.anl.gov
> Subject: [petsc-users] Preconditioning in matrix-free methods
>
> An HTML attachment was scrubbed...
> URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20140224/ae2eb6a1/attachment-0001.html>
>
> ------------------------------
>
> Message: 3
> Date: Mon, 24 Feb 2014 23:30:29 +0100
> From: Aron Roland <aaronroland at gmx.de>
> To: Christophe Ortiz <christophe.ortiz at ciemat.es>, 	Jed Brown
> 	<jed at jedbrown.org>
> Cc: petsc-users at mcs.anl.gov, Mathieu Dutour <Mathieu.Dutour at irb.hr>,
> 	Thomas Huxhorn <thomas.huxhorn at web.de>
> Subject: Re: [petsc-users] From 1D to 3D problem ? Unstructured mesh ?
>
> Hi,
>
> I can provide you with a nice package for generating unstructured meshes;
> many institutions are using it now. We have also used PETSc to solve a
> nonlinear hyperbolic problem in 2D on unstructured meshes, and it works
> quite well, even if the scaling is still not what it should be, but
> these are other issues ...
>
> Cheers
>
> Aron
>
> On 02/24/2014 09:04 AM, Christophe Ortiz wrote:
>> On Sat, Feb 22, 2014 at 2:33 AM, Jed Brown <jed at jedbrown.org> wrote:
>>
>>      Christophe Ortiz <christophe.ortiz at ciemat.es> writes:
>>
>>      > Hi all,
>>      >
>>      > Recently I have implemented a 1D problem of coupled diffusion
>>      > equations using PETSc. I did it using finite differences for the
>>      > diffusion terms and F(t,U,U_t) = 0. It works pretty well with
>>      > ARKIMEX3. I get a nice timestep variation, and all boundary
>>      > conditions work well.
>>      >
>>      > Now I would like to move to 3D problems to simulate the diffusion
>>      > and interaction of species in a "real material". By real material
>>      > I mean a material made of subregions with internal surfaces where
>>      > species can recombine (meaning Dirichlet). These subregions are
>>      > distributed in a complicated manner, i.e., not Cartesian. A good
>>      > picture of this would be a polycrystal (see the attachment to get
>>      > an idea). Each crystal has a different orientation, and the
>>      > boundary between two small crystals forms an internal surface.
>>      >
>>      > I have several questions on how to implement this:
>>      >
>>      > 1) Since the problem will not be solved on a Cartesian mesh,
>>      > should I use unstructured meshes? If so, how can such a mesh be
>>      > generated? (I have no experience with unstructured meshes; I have
>>      > always worked in 1D.)
>>
>>      Are you intending to mesh the boundaries of the crystals?  Will you be
>>      dynamically remeshing?  (That is very complicated and expensive in
>>      3D.)
>>
>>      What formulation will you be using for grain boundary evolution?
>>
>>
>> No, in principle I will not consider the evolution of grains.
>> Therefore, no dynamic remeshing (in principle).
>> What I want is just the evolution of diffusing and reacting species
>> inside the ensemble of grains, including their interaction with the
>> grain boundaries (trapping, segregation, ...).
>>
>>      I think you should check out phase-field models, such as the
>>      publication below.
>>
>>
>> I have never used phase-field models. According to what I have read,
>> they can model many phenomena; in particular, they replace a boundary
>> condition at an interface with a PDE for the evolution of an auxiliary
>> field (Wikipedia). In this sense, maybe it could be interesting, since I
>> want to simulate the evolution of species inside grains with many
>> internal grain boundaries.
>> But I don't know whether to treat a grain boundary as an infinitely
>> sharp interface or as a thin but finite piece of material with different
>> properties for the species (the diffusion coefficient, for instance).
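
For orientation, a standard phase-field model of this kind evolves an
auxiliary order parameter by gradient flow of a free energy; a common
(Allen-Cahn) form is, in LaTeX notation (symbols generic, not from this
thread):

    \partial_t \phi = -L \frac{\delta F}{\delta \phi},
    \qquad F[\phi] = \int_\Omega \left( f(\phi)
                     + \frac{\kappa}{2} |\nabla \phi|^2 \right) dV

where f is a multi-well potential whose minima correspond to the bulk
states (here, grain orientations) and the gradient term gives the
interface a thin but finite width, which is precisely the modeling choice
being weighed above.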
>>
>>      Perhaps check out the paper below.  The framework (MOOSE) used
>>      for this publication should be released open source on GitHub next
>>      week (check https://github.com/idaholab/).  I don't know if Marmot,
>>      the phase-field component, will be open source any time soon, but
>>      they are typically happy to collaborate.  MOOSE uses PETSc for
>>      solvers, but provides a higher-level interface.
>>
>>      @article{tonks2012object,
>>        title     = {An object-oriented finite element framework for
>>                     multiphysics phase field simulations},
>>        author    = {Tonks, M. R. and Gaston, D. and Millett, P. C. and
>>                     Andrs, D. and Talbot, P.},
>>        journal   = {Computational Materials Science},
>>        volume    = {51},
>>        number    = {1},
>>        pages     = {20--29},
>>        year      = {2012},
>>        publisher = {Elsevier}
>>      }
>>
>>
>> Sorry, I could not download the article. We don't have access. Crisis
>> in Spain :-( !
>>
>
> ------------------------------
>
> Message: 4
> Date: Mon, 24 Feb 2014 18:18:36 -0700
> From: Jed Brown <jed at jedbrown.org>
> To: Konstantinos Kontzialis <ckontzialis at lycos.com>,
> 	petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] Preconditioning in matrix-free methods
>
> Konstantinos Kontzialis <ckontzialis at lycos.com> writes:
>
>> Dear all,
>>   
>> I am trying to use preconditioning in SNES within a matrix-free context.
>> I use PETSc 3.3, and whenever I use the option -snes_mf_operator I get
>> the following error:
> -snes_mf_operator means to use the matrix you assemble for
>   preconditioning, but apply the true Jacobian using matrix-free finite
>   differencing.  It is normally used when the function you pass to
>   TSSetIJacobian() only approximates the true Jacobian, typically by
>   using a lower-order discretization or by discarding some terms.
>
>> Must call DMShellSetMatrix() or DMShellSetCreateMatrix()
>>
>>   
>> My code is as follows:
>>   
>>        call TSCreate (petsc_comm_world, ts_mhd, ierpetsc)
>> c
>>        call TSSetProblemType (ts_mhd, TS_NONLINEAR, ierpetsc)
>> c
>>        call TSSetIFunction ( ts_mhd, res_mhd, residual_mag,
>>       @  PETSC_NULL_OBJECT, ierpetsc )
>> c
>>        call TSSetSolution( ts_mhd, Bmagnetic_pet, ierpetsc )
>> c
>>        call TSSetMaxSNESFailures ( ts_mhd, -1, ierpetsc)
>> c
>>        call TSGetSNES (ts_mhd, snes_mhd, ierpetsc )
>>   
>>        call MatCreateSNESMF ( snes_mhd, J_mf, ierpetsc )
>> c
>>        call SNESSetJacobian ( snes_mhd, J_mf, M_mhd,
>>      @ SNESDefaultComputeJacobianColor, fdcoloring,
>>      @ ierpetsc )
>>   
>> Has anyone any suggestions on what to do?
>>   
>> Kostas
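
For reference, a minimal sketch of the pattern Jed describes, in the same
fixed-form Fortran style as the code above (ts, r_vec, J_approx,
FormIFunction, and FormIJacobian are placeholder names, not taken from the
original code):

       call TSSetIFunction ( ts, r_vec, FormIFunction,
      @  PETSC_NULL_OBJECT, ierpetsc )
c
c      FormIJacobian assembles an approximation of the true Jacobian
c      (e.g., a lower-order discretization) into J_approx.  With
c      -snes_mf_operator that matrix is used only to build the
c      preconditioner, while the true Jacobian is applied by
c      matrix-free finite differencing.
c
       call TSSetIJacobian ( ts, J_approx, J_approx, FormIJacobian,
      @  PETSC_NULL_OBJECT, ierpetsc )

One would then run with -snes_mf_operator together with an
assembled-matrix preconditioner, e.g., -pc_type ilu.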
>
> ------------------------------
>
>
>
> End of petsc-users Digest, Vol 62, Issue 47
> *******************************************
Dear Jed,

    Thank you for your response. I understand what you have written, but
maybe I'm missing something else. I have not understood why PETSc is
giving me this error. With -snes_mf my code runs (though many GMRES
iterations are needed and convergence sometimes stalls), but with
-snes_mf_operator I still get the error I posted on the list. What should
I do to overcome it? Am I missing something? Is there anything I am doing
wrong?

   Regards,

   Kostas

