Line search routines

BackTracking specifies a backtracking line-search that uses a quadratic or cubic interpolant to determine the reduction in step-size. E.g., if f(α) > f(0) + c₁ α f'(0), then the quadratic interpolant of f(0), f'(0), f(α) has a minimiser α' in the open interval (0, α). More strongly, there exists a factor ρ = ρ(c₁) such that α' ≦ ρ α.

This is a modification of the algorithm described in Nocedal and Wright (2nd ed.), Sec. 3.5.
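As an illustration of the interpolation step, here is a minimal Julia sketch of quadratic backtracking. It is not the package's BackTracking implementation (which also supports a cubic interpolant); the function name, defaults, and iteration cap are illustrative assumptions only.

```julia
# Minimal sketch of quadratic-interpolation backtracking (illustrative, not
# the package implementation). `phi` is the univariate function α ↦ f(x + α*d);
# `phi_0 = phi(0)` and `dphi_0 = phi'(0) < 0` for a descent direction d.
function quadratic_backtrack(phi, phi_0, dphi_0; c_1 = 1e-4, alpha = 1.0, maxiter = 20)
    for _ in 1:maxiter
        phi_a = phi(alpha)
        # Accept as soon as the Armijo condition f(α) ≤ f(0) + c₁ α f'(0) holds
        if phi_a <= phi_0 + c_1 * alpha * dphi_0
            return alpha
        end
        # Otherwise move to the minimiser of the quadratic interpolating
        # phi(0), phi'(0) and phi(alpha), which lies in the open interval (0, alpha)
        alpha = -dphi_0 * alpha^2 / (2 * (phi_a - phi_0 - dphi_0 * alpha))
    end
    return alpha
end
```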


HagerZhang: the conjugate gradient line search implementation from W. W. Hager and H. Zhang (2006), "Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent", ACM Transactions on Mathematical Software 32: 113–137.
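For illustration, a line search is typically passed to a solver rather than called directly. A hedged sketch using Optim.jl's ConjugateGradient; the Rosenbrock objective and starting point are assumptions made for the example:

```julia
using Optim, LineSearches

# Rosenbrock test function (illustrative choice)
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Pair the Hager–Zhang line search with a conjugate gradient method
res = optimize(f, [-1.2, 1.0], ConjugateGradient(linesearch = HagerZhang()))
```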


MoreThuente: the line search implementation from Moré, Jorge J., and David J. Thuente, "Line search algorithms with guaranteed sufficient decrease", ACM Transactions on Mathematical Software (TOMS) 20.3 (1994): 286–307.
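A brief hedged example of selecting this routine through a solver's linesearch option in Optim.jl; the toy objective is an assumption for illustration:

```julia
using Optim, LineSearches

# e.g. pair the Moré–Thuente line search with (L-)BFGS on a toy problem
optimize(x -> sum(abs2, x .- 1), zeros(3), LBFGS(linesearch = MoreThuente()))
```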


Static: defines a static linesearch which returns the initial step length.

Static is intended for methods with well-scaled updates, e.g. Newton, on well-behaved problems.
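A hedged sketch of that use case with Optim.jl's Newton method; the toy objective and the autodiff keyword are assumptions made for the example:

```julia
using Optim, LineSearches

# A well-behaved convex problem where unit Newton steps are already well scaled,
# so a fixed step length of 1.0 suffices
f(x) = 0.5 * sum(abs2, x)

res = optimize(f, [2.0, -3.0], Newton(linesearch = Static()); autodiff = :forward)
```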

StrongWolfe: This linesearch algorithm guarantees that the step length satisfies the (strong) Wolfe conditions. See Nocedal and Wright, Algorithms 3.5 and 3.6.

This algorithm is mostly of theoretical interest, users should most likely use MoreThuente, HagerZhang or BackTracking.

Parameters (and defaults):

  • c_1 = 1e-4: Armijo (sufficient decrease) condition
  • c_2 = 0.9: second (strong) Wolfe condition
  • ρ = 2.0: bracket growth factor
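A hedged sketch of constructing the routine with these parameters; the keyword names below simply mirror the list above and should be checked against the package's actual constructor:

```julia
using LineSearches

# Defaults written out explicitly; presumably adjustable via keywords
ls = StrongWolfe(c_1 = 1e-4, c_2 = 0.9, ρ = 2.0)
```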