# Optim initial step length guess
!!! tip
    This example is also available as a Jupyter notebook: optim_initialstep.ipynb
This example shows how to use the initial step length procedures with Optim. We solve the Rosenbrock problem with two different procedures.

First, we run Newton with the (default) initial guess and line search procedures.
```julia
using Optim, LineSearches
import OptimTestProblems.MultivariateProblems
UP = MultivariateProblems.UnconstrainedProblems

prob = UP.examples["Rosenbrock"]

algo_st = Newton(alphaguess = InitialStatic(), linesearch = HagerZhang())
res_st = Optim.optimize(prob.f, prob.g!, prob.h!, prob.initial_x, method=algo_st)
```
```
 * Status: success

 * Candidate solution
    Minimizer: [1.00e+00, 1.00e+00]
    Minimum:   1.109336e-29

 * Found with
    Algorithm:     Newton's Method
    Initial Point: [-1.20e+00, 1.00e+00]

 * Convergence measures
    |x - x'|               = 1.13e-08 ≰ 0.0e+00
    |x - x'|/|x'|          = 1.13e-08 ≰ 0.0e+00
    |f(x) - f(x')|         = 7.05e-16 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 6.35e+13 ≰ 0.0e+00
    |g(x)|                 = 6.66e-15 ≤ 1.0e-08

 * Work counters
    Seconds run:  0  (vs limit Inf)
    Iterations:   23
    f(x) calls:   71
    ∇f(x) calls:  71
    ∇²f(x) calls: 23
```
We can now try the initial step length guess procedure from Hager and Zhang.
```julia
algo_hz = Newton(alphaguess = InitialHagerZhang(α0=1.0), linesearch = HagerZhang())
res_hz = Optim.optimize(prob.f, prob.g!, prob.h!, prob.initial_x, method=algo_hz)
```
```
 * Status: success

 * Candidate solution
    Minimizer: [1.00e+00, 1.00e+00]
    Minimum:   1.719626e-19

 * Found with
    Algorithm:     Newton's Method
    Initial Point: [-1.20e+00, 1.00e+00]

 * Convergence measures
    |x - x'|               = 4.16e-06 ≰ 0.0e+00
    |x - x'|/|x'|          = 4.16e-06 ≰ 0.0e+00
    |f(x) - f(x')|         = 6.27e-10 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 3.65e+09 ≰ 0.0e+00
    |g(x)|                 = 6.63e-10 ≤ 1.0e-08

 * Work counters
    Seconds run:  0  (vs limit Inf)
    Iterations:   24
    f(x) calls:   61
    ∇f(x) calls:  38
    ∇²f(x) calls: 24
```
From the results we see that the Hager–Zhang initial guess has reduced the number of function calls (71 → 61) and gradient calls (71 → 38), at the cost of one extra iteration (23 → 24).
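To compare the two runs programmatically rather than by reading the printed reports, the work counters can be extracted from the result objects with Optim's accessor functions. A minimal sketch (the exact counts depend on package versions):

```julia
# Compare the work counters of the two runs via Optim's accessor functions.
for (name, res) in (("InitialStatic", res_st), ("InitialHagerZhang", res_hz))
    println(name, ":")
    println("  iterations:   ", Optim.iterations(res))
    println("  f(x) calls:   ", Optim.f_calls(res))
    println("  ∇f(x) calls:  ", Optim.g_calls(res))
    println("  ∇²f(x) calls: ", Optim.h_calls(res))
end
```

LineSearches also provides other initial step length procedures, such as `InitialPrevious` and `InitialQuadratic`, which can be passed to `alphaguess` in the same way.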
*This page was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*