# Optim line search

> **Tip:** This example is also available as a Jupyter notebook: `optim_linesearch.ipynb`

This example shows how to use `LineSearches` with Optim. We solve the Rosenbrock problem with two different line search algorithms.
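
The problem data come from the `OptimTestProblems` package. For reference, the Rosenbrock objective, gradient, and Hessian can also be written out by hand; the sketch below is assumed to be mathematically equivalent to the packaged problem (the names `f`, `g!`, `h!`, and `initial_x` are illustrative, not part of the package):

```julia
# Illustrative, self-contained Rosenbrock problem (assumed equivalent to
# UP.examples["Rosenbrock"]): f(x) = (1 - x₁)² + 100 (x₂ - x₁²)²
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

function g!(G, x)  # in-place gradient, as Optim expects
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    G[2] = 200.0 * (x[2] - x[1]^2)
end

function h!(H, x)  # in-place Hessian, as Optim expects
    H[1, 1] = 2.0 - 400.0 * x[2] + 1200.0 * x[1]^2
    H[1, 2] = -400.0 * x[1]
    H[2, 1] = -400.0 * x[1]
    H[2, 2] = 200.0
end

initial_x = [-1.2, 1.0]  # standard Rosenbrock starting point
```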

First, run `Newton` with the default line search algorithm:

```julia
using Optim, LineSearches
import OptimTestProblems.MultivariateProblems
UP = MultivariateProblems.UnconstrainedProblems
prob = UP.examples["Rosenbrock"]

algo_hz = Newton(linesearch = HagerZhang())
res_hz = Optim.optimize(prob.f, prob.g!, prob.h!, prob.initial_x, method=algo_hz)
```
```
 * Status: success

 * Candidate solution
    Minimizer: [1.00e+00, 1.00e+00]
    Minimum:   1.109336e-29

 * Found with
    Algorithm:     Newton's Method
    Initial Point: [-1.20e+00, 1.00e+00]

 * Convergence measures
    |x - x'|               = 1.13e-08 ≰ 0.0e+00
    |x - x'|/|x'|          = 1.13e-08 ≰ 0.0e+00
    |f(x) - f(x')|         = 7.05e-16 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 6.35e+13 ≰ 0.0e+00
    |g(x)|                 = 6.66e-15 ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    23
    f(x) calls:    71
    ∇f(x) calls:   71
    ∇²f(x) calls:  23
```

Now we can try `Newton` with the cubic backtracking line search, which reduces the number of objective and gradient calls:

```julia
algo_bt3 = Newton(linesearch = BackTracking(order=3))
res_bt3 = Optim.optimize(prob.f, prob.g!, prob.h!, prob.initial_x, method=algo_bt3)
```
```
 * Status: success

 * Candidate solution
    Minimizer: [1.00e+00, 1.00e+00]
    Minimum:   1.232595e-30

 * Found with
    Algorithm:     Newton's Method
    Initial Point: [-1.20e+00, 1.00e+00]

 * Convergence measures
    |x - x'|               = 1.76e-09 ≰ 0.0e+00
    |x - x'|/|x'|          = 1.76e-09 ≰ 0.0e+00
    |f(x) - f(x')|         = 1.13e-17 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 9.14e+12 ≰ 0.0e+00
    |g(x)|                 = 4.44e-14 ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    25
    f(x) calls:    34
    ∇f(x) calls:   26
    ∇²f(x) calls:  25
```
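
Rather than reading the two printed traces by eye, the work counters can be compared programmatically. The sketch below uses Optim's result accessors (`Optim.iterations`, `Optim.f_calls`, `Optim.g_calls`) on the `res_hz` and `res_bt3` objects from the runs above:

```julia
# Compare the two runs side by side using Optim's result accessors.
for (name, res) in (("HagerZhang", res_hz), ("BackTracking(order=3)", res_bt3))
    println(rpad(name, 22),
            " iterations: ", Optim.iterations(res),
            "  f calls: ", Optim.f_calls(res),
            "  ∇f calls: ", Optim.g_calls(res))
end
```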