[petsc-users] Rescaling quantites to remove positivity constraints

Matthew Knepley knepley at gmail.com
Thu Mar 13 21:14:46 CDT 2014


On Thu, Mar 13, 2014 at 9:08 PM, Mani Chandra <mc0710 at gmail.com> wrote:

>
> On Thu, Mar 13, 2014 at 8:51 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Thu, Mar 13, 2014 at 6:12 PM, Mani Chandra <mc0710 at gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I'm trying to solve for fluid flows with large density and pressure
>>> variations using the theta method in the TS module. This inevitably leads
>>> to negative densities and pressures during the course of the evolution. To
>>> deal with this, I am now trying to solve for log(rho) and log(P) instead
>>> of rho and P. To do this, I changed my initial conditions from rho and P
>>> to log(rho) and log(P), and inside the residual evaluation function I keep
>>> the original code unchanged except that I add the following lines above
>>> the original code:
>>>
>>> rho = exp(logrho)
>>> P = exp(logP)
>>>
>>
>> You must do your scaling such that the output of your residual is log() of
>> your variables. If that is the case, it will work.
>>
>>
>
> Could you elaborate a bit? I didn't understand that. I thought that as
> long as the residuals are the same, one can solve for any form of the
> variables.
>

You are using the residual function to form the Jacobian using finite
differences, so it has to represent the variables you use.

   Matt
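
A minimal sketch of one way to read that advice, assuming a 2D DMDA and an
IFunction of the form F(t, U, Udot) = 0; the Field struct and the
FluxDivergence()/PressureTerms() placeholders are hypothetical stand-ins for
the original code. The point is that both the unknowns and every residual
entry are expressed in the log variables, so the colored finite-difference
Jacobian is taken with respect to log(rho) and log(P):

#include <petscts.h>

/* Hypothetical 2-component field: the unknowns stored in the TS solution
   vector are log(rho) and log(P). */
typedef struct { PetscScalar logrho, logP; } Field;

/* Residual for TSSetIFunction(): F(t, U, Udot) = 0.  The physics is still
   written in terms of rho = exp(logrho) and P = exp(logP), but the time
   derivative is converted with the chain rule, d(rho)/dt = rho*d(logrho)/dt,
   so each residual entry is a function of the log variables that the
   finite-difference Jacobian actually perturbs. */
PetscErrorCode FormIFunction(TS ts, PetscReal t, Vec U, Vec Udot, Vec F, void *ctx)
{
  DM             da;
  Field        **u, **udot, **f;
  PetscInt       i, j, xs, ys, xm, ym;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = TSGetDM(ts, &da);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, U, &u);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, Udot, &udot);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(da, F, &f);CHKERRQ(ierr);
  ierr = DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);CHKERRQ(ierr);
  for (j = ys; j < ys + ym; j++) {
    for (i = xs; i < xs + xm; i++) {
      PetscScalar rho = PetscExpScalar(u[j][i].logrho);
      PetscScalar P   = PetscExpScalar(u[j][i].logP);
      /* FluxDivergence() and PressureTerms() stand for the unchanged
         original spatial terms, still evaluated with rho and P. */
      f[j][i].logrho = rho * udot[j][i].logrho; /* + FluxDivergence(rho, P, ...) */
      f[j][i].logP   = P   * udot[j][i].logP;   /* + PressureTerms(rho, P, ...)  */
    }
  }
  ierr = DMDAVecRestoreArray(da, U, &u);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(da, Udot, &udot);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(da, F, &f);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}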


>>> since I expect the solution vector x[j][i] to now hold logrho and logP
>>> instead of rho and P. I run the simulation with a modified Newton-Krylov
>>> method and lag the Jacobian for every 100 linear solves. The Jacobian
>>> assembly uses colored finite differences. This goes smoothly from the
>>> start of the simulation to the first 100 linear solves, after which the
>>> Jacobian needs to be assembled again. At this point the nonlinear solver
>>> crashes, and I think something is wrong with the Jacobian assembly. I
>>> checked that the simulation runs until the second Jacobian assembly by
>>> varying the lagging to, say, 10, 20, 30, ... It fails every time it has
>>> to assemble the second time.
>>>
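
For reference, a minimal sketch of the Jacobian-lagging setup being described;
the SetUpJacobianLagging() helper name and the lag value of 100 are
illustrative only, and the matrix itself is still assembled by the colored
finite differences mentioned above:

#include <petscts.h>

/* Illustrative only: have the SNES inside the TS reuse its Jacobian for 100
   nonlinear iterations before reassembling it, equivalent to the command-line
   option -snes_lag_jacobian 100. */
static PetscErrorCode SetUpJacobianLagging(TS ts)
{
  SNES           snes;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = TSGetSNES(ts, &snes);CHKERRQ(ierr);
  ierr = SNESSetLagJacobian(snes, 100);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}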
>>> When I view the solution using VecView inside TSMonitor while the
>>> simulation is running, the initial conditions are log(rho) and log(P),
>>> but the vector output actually contains rho and P, not the log(rho) and
>>> log(P) that I expect the solver to be solving for. However, when I look
>>> at the .vts files generated by -ts_monitor_solution_vtk, the output is
>>> log(rho) and log(P).
>>>
>>> Should I be doing something else to rescale the quantities that TS/SNES
>>> solves for, instead of what I did above?
>>>
>>> The method that suggests rescaling is described in this paper:
>>> "A Newton-Krylov solver for implicit solution of hydrodynamics in
>>> core-collapse supernova"
>>>
>>> http://iopscience.iop.org/1742-6596/125/1/012085/pdf/1742-6596_125_1_012085.pdf
>>>
>>
>> I personally think this is a bad idea. They must have guesses that are
>> very close to the solutions, and
>> it messes with the accuracy of the discretization.
>>
>> I tried other things like setting floors, but they fail. Any suggestions?
>
>
>>    Thanks,
>>
>>      Matt
>>
>>> P.S. I also tried the variational inequality SNES solver a while back, but
>>> it didn't really work in enforcing the constraints.
>>>
>>> Thanks,
>>> Mani
>>>
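
For reference, a minimal sketch of how variable bounds are normally attached
to the variational inequality solver; the SetUpPositivityBounds() name, the
vector X, and the 1e-10 floor are hypothetical, and in practice only the
density and pressure components would get a finite lower bound:

#include <petscts.h>

/* Hypothetical helper: switch the SNES inside the TS to the reduced-space
   VI Newton solver and impose a small positive floor as the lower bound on
   every component (for brevity; only rho and P need it). */
static PetscErrorCode SetUpPositivityBounds(TS ts, Vec X)
{
  SNES           snes;
  Vec            xl, xu;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = TSGetSNES(ts, &snes);CHKERRQ(ierr);
  ierr = SNESSetType(snes, SNESVINEWTONRSLS);CHKERRQ(ierr);
  ierr = VecDuplicate(X, &xl);CHKERRQ(ierr);
  ierr = VecDuplicate(X, &xu);CHKERRQ(ierr);
  ierr = VecSet(xl, 1.e-10);CHKERRQ(ierr);         /* illustrative positivity floor */
  ierr = VecSet(xu, PETSC_INFINITY);CHKERRQ(ierr); /* no upper bound */
  ierr = SNESVISetVariableBounds(snes, xl, xu);CHKERRQ(ierr);
  ierr = VecDestroy(&xl);CHKERRQ(ierr);
  ierr = VecDestroy(&xu);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}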
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener