Minimizes a continuous differentiable multivariate function.
Ported to Elixir from the Octave version found in Andrew Ng's machine learning course, (c) Carl Edward Rasmussen.
`f` — cost function that takes two parameters: the current value of `x` and `fParams`. For example, `lr_cost_fun/2`.
`x` — vector of parameters to optimize, so that the cost function returns its minimum value.
`fParams` — this value is passed as the second parameter to the cost function.
`length` — number of iterations to perform.
Returns a column matrix of the found solution, a list of cost function values, and the number of iterations used.
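For illustration, a cost function matching this contract might look like the sketch below. The module name, and the use of plain lists rather than a matrix type, are assumptions for the example; the real numeric representation depends on the surrounding code.

```elixir
defmodule ExampleCost do
  # Hypothetical cost function: f(x) = sum((x_i - t_i)^2),
  # where `f_params` is the target vector `t`.
  # Returns {cost, gradient}, as the minimizer expects.
  def quadratic(x, f_params) do
    diffs = Enum.zip(x, f_params) |> Enum.map(fn {xi, ti} -> xi - ti end)
    cost = Enum.reduce(diffs, 0.0, fn d, acc -> acc + d * d end)
    grad = Enum.map(diffs, fn d -> 2.0 * d end)
    {cost, grad}
  end
end
```

Here the gradient is returned alongside the cost in a single tuple, since the minimizer needs both the function value and the vector of partial derivatives at each evaluation.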
The starting point is given by `x` (D by 1), and the function `f` must
return a function value and a vector of partial derivatives. The Polack-Ribiere
flavour of conjugate gradients is used to compute search directions,
and a line search using quadratic and cubic polynomial approximations and the
Wolfe-Powell stopping criteria is used together with the slope ratio method
for guessing initial step sizes. Additionally, a number of checks are made to
make sure that exploration is taking place and that extrapolation will not
be unboundedly large.
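A usage sketch, under stated assumptions: the module and function name `Fmincg.minimize/4` and the exact return shape are guesses based on the description above, and the cost function uses plain lists for simplicity.

```elixir
# Minimize f(x) = (x1 - 1)^2 + (x2 - 2)^2, starting from the origin.
cost_fun = fn x, target ->
  diffs = Enum.zip(x, target) |> Enum.map(fn {xi, ti} -> xi - ti end)
  cost = Enum.reduce(diffs, 0.0, fn d, acc -> acc + d * d end)
  grad = Enum.map(diffs, fn d -> 2.0 * d end)
  {cost, grad}
end

# Hypothetical call; substitute the actual module/function name.
# Arguments: cost function, initial `x`, `fParams`, number of iterations.
{x_opt, costs, iters} = Fmincg.minimize(cost_fun, [0.0, 0.0], [1.0, 2.0], 50)
```

For this convex quadratic, the returned `x_opt` should converge toward `[1.0, 2.0]`, with `costs` decreasing over the `iters` iterations performed.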