I recently made a post to the Discourse forum about trying to find a way to access the approximated Hessian from a converged solution, to use as the preconditioner for a future minimization from a slightly altered initial state; I'm using BFGS to bifurcate from previously found minimizers. After looking at Optim.jl, libLBFGS, and LBFGS++ I couldn't find an interface, so I imagine it is most likely not available in Optim.jl either. I wanted to double-check here whether that's viable with the current interface, whether there would be any interest in adding it to the package, or whether I should try to extend this on my own in Julia somehow.
So far, the only package outside of Julia that implements this is SciPy:
https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.fmin_bfgs.html
Should the state object just be added to MultivariateOptimizationResults?
optimize(d::D, initial_x::Tx, method::M,
         options::Options = Options(;default_options(method)...),
         state = initial_state(method, options, d, initial_x))
Pass in your own state; it'll get mutated and you can get what you want from it after.
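A minimal sketch of that pattern, assuming recent Optim.jl internals (initial_state, the invH field on the BFGS state, and the initial_invH keyword are undocumented details and may change between versions):

using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

x0 = zeros(2)
method = BFGS()
options = Optim.Options()

# Wrap the objective and build the state ourselves, then hand both to optimize.
d = OnceDifferentiable(rosenbrock, x0; autodiff = :forward)
state = Optim.initial_state(method, options, d, x0)
res = Optim.optimize(d, x0, method, options, state)

# After the run, the mutated state holds the current inverse-Hessian approximation.
invH = copy(state.invH)

# Re-use it as the starting inverse Hessian when minimizing from a perturbed point.
res2 = optimize(rosenbrock, x0 .+ 0.01, BFGS(initial_invH = _ -> invH), options)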
Hi All, new here ... We have been using Optim.jl and LBFGS for a while, and for our particular problems the combination of alphaguess = LineSearches.InitialQuadratic() and linesearch = LineSearches.MoreThuente() works best. I thought I would try that for Rosenbrock on the front page, and it beats BFGS pretty handily, at least in terms of operation count, almost 2x. Thought that was worth mentioning.
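For reference, a sketch of that setup on the front-page Rosenbrock (LBFGS here, per the message above; the same keywords work for BFGS):

using Optim, LineSearches

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

method = LBFGS(alphaguess = LineSearches.InitialQuadratic(),
               linesearch = LineSearches.MoreThuente())

res = optimize(rosenbrock, zeros(2), method)
Optim.f_calls(res), Optim.g_calls(res)  # operation counts to compare against the defaults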
Where can I find a link to the Slack channel, please? Cheers
fun(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
For example, we want to get the following Hessian from the above objective function.
function fun_hess!(h, x)
    h[1, 1] = 2.0 - 400.0 * x[2] + 1200.0 * x[1]^2
    h[1, 2] = -400.0 * x[1]
    h[2, 1] = -400.0 * x[1]
    h[2, 2] = 200.0
end
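To pair with it, a hand-written gradient and a Newton solve that consumes both (a sketch; fun_grad! is just an illustrative name):

using Optim

function fun_grad!(g, x)
    g[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    g[2] = 200.0 * (x[2] - x[1]^2)
end

res = optimize(fun, fun_grad!, fun_hess!, zeros(2), Newton())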
@ChrisRackauckas
@segawachobbies_twitter you can use ModelingToolkit to test it
using ModelingToolkit
@variables x[1:2]
result = ModelingToolkit.hessian((1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2, x)
I tried the above code with ModelingToolkit and successfully got the expected result! Thank you very much.
using ModelingToolkit
using Optim

@variables x[1:2]

# Symbolic objective, gradient, and Hessian.
f = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
g = ModelingToolkit.gradient(f, x)
h = ModelingToolkit.hessian(f, x)

# Compile to Julia functions; for array outputs build_function returns an
# (out-of-place, in-place) pair, so [2] picks the mutating version.
buildedF = build_function(f, x[1:2])
buildedG = build_function(g, x[1:2])
buildedH = build_function(h, x[1:2])
newF  = eval(buildedF)
newG! = eval(buildedG[2])
newH! = eval(buildedH[2])

initial_x = zeros(2)
# Newton with ForwardDiff-generated derivatives vs. the compiled symbolic ones.
@time Optim.minimizer(optimize(newF, initial_x, Newton(); autodiff=:forward))
@time Optim.minimizer(optimize(newF, newG!, newH!, initial_x, Newton()))