Optim line search

Tip: This example is also available as a Jupyter notebook: optim_linesearch.ipynb

This example shows how to use LineSearches with Optim. We solve the Rosenbrock problem with two different line search algorithms.
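For reference, the two-dimensional Rosenbrock function has a long, curved valley that makes line search behavior easy to compare. A minimal sketch of the standard form (an assumption; the problem below is loaded from OptimTestProblems, whose starting point [-1.2, 1.0] matches this classic setup):

# Standard two-dimensional Rosenbrock function (assumed form), minimized at [1, 1]
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2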

First, we run Newton with HagerZhang, the default line search algorithm:

using Optim, LineSearches
import OptimTestProblems.MultivariateProblems
UP = MultivariateProblems.UnconstrainedProblems
prob = UP.examples["Rosenbrock"]

algo_hz = Newton(linesearch = HagerZhang())
res_hz = Optim.optimize(prob.f, prob.g!, prob.h!, prob.initial_x, method=algo_hz)
Results of Optimization Algorithm
 * Algorithm: Newton's Method
 * Starting Point: [-1.2,1.0]
 * Minimizer: [1.0000000000000033,1.0000000000000067]
 * Minimum: 1.109336e-29
 * Iterations: 23
 * Convergence: true
   * |x - x'| ≤ 0.0e+00: false
     |x - x'| = 1.13e-08
   * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
     |f(x) - f(x')| = 6.35e+13 |f(x)|
   * |g(x)| ≤ 1.0e-08: true
     |g(x)| = 6.66e-15
   * Stopped by an increasing objective: false
   * Reached Maximum Number of Iterations: false
 * Objective Calls: 71
 * Gradient Calls: 71
 * Hessian Calls: 23
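The minimizer and the work counters reported above can also be read off the result object programmatically; a minimal sketch, assuming Optim's standard accessor functions:

# Query the result object (Optim accessors; values match the report above)
Optim.minimizer(res_hz)  # ≈ [1.0, 1.0]
Optim.f_calls(res_hz)    # 71 objective evaluations
Optim.g_calls(res_hz)    # 71 gradient evaluations
Optim.h_calls(res_hz)    # 23 Hessian evaluations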

Now we try Newton with the cubic backtracking line search, which reduces the number of objective and gradient calls:

algo_bt3 = Newton(linesearch = BackTracking(order=3))
res_bt3 = Optim.optimize(prob.f, prob.g!, prob.h!, prob.initial_x, method=algo_bt3)
Results of Optimization Algorithm
 * Algorithm: Newton's Method
 * Starting Point: [-1.2,1.0]
 * Minimizer: [1.0,0.9999999999999999]
 * Minimum: 1.232595e-30
 * Iterations: 25
 * Convergence: true
   * |x - x'| ≤ 0.0e+00: false
     |x - x'| = 1.76e-09
   * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
     |f(x) - f(x')| = 9.14e+12 |f(x)|
   * |g(x)| ≤ 1.0e-08: true
     |g(x)| = 4.44e-14
   * Stopped by an increasing objective: false
   * Reached Maximum Number of Iterations: false
 * Objective Calls: 34
 * Gradient Calls: 26
 * Hessian Calls: 25
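To make the comparison explicit, the counters from both runs can be printed side by side; a minimal sketch using the same accessors:

# Both runs converge to [1, 1]; BackTracking(order=3) needs fewer
# objective and gradient calls than HagerZhang() on this problem.
for (name, res) in (("HagerZhang", res_hz), ("BackTracking(3)", res_bt3))
    println(name, ": f_calls = ", Optim.f_calls(res),
            ", g_calls = ", Optim.g_calls(res))
end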

This page was generated using Literate.jl.