Conjugate Gradient (CG) minimization through the Davidon-Fletcher-Powell approach for function minimization.
The Davidon-Fletcher-Powell (DFP) and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) methods were the first quasi-Newton minimization methods to be developed. The two methods differ only in some details; in general, the BFGS approach is the more robust one.
fletcher_powell(x0, f, g = NULL,
                maxiter = 1000, tol = .Machine$double.eps^(2/3))

The starting point is Newton's method in the multivariate case, in which the estimate of the minimum is updated by the following equation $$x_{new} = x - H^{-1}(x)\, grad(f)(x)$$ where \(H\) is the Hessian and \(grad(f)\) the gradient of \(f\).
The basic idea is to generate a sequence of good approximations to the inverse Hessian matrix, in such a way that the approximations remain positive definite.
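The heart of the method is the rank-two update that turns the current inverse-Hessian approximation into the next one. Below is a minimal R sketch of a single DFP update (an illustration of the formula, not the package's internal code), assuming \(s = x_{new} - x\) and \(y = grad(f)(x_{new}) - grad(f)(x)\):

## One DFP update of the inverse-Hessian approximation B:
##   B_new = B + (s s') / (s' y) - (B y y' B) / (y' B y)
dfp_update <- function(B, s, y) {
    s <- as.matrix(s)                                   # column vector
    y <- as.matrix(y)                                   # column vector
    B + (s %*% t(s)) / c(t(s) %*% y) -                  # rank-one term from the step
        (B %*% y %*% t(y) %*% B) / c(t(y) %*% B %*% y)  # curvature correction
}

If \(B\) is positive definite and the curvature condition \(s^T y > 0\) holds, the updated matrix is again positive definite, which is exactly the property mentioned above.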
List with the following components:
xmin: minimum solution found.
fmin: value of f at the minimum.
niter: number of iterations performed.
J. F. Bonnans, J. C. Gilbert, C. Lemaréchal, and C. A. Sagastizábal. Numerical Optimization: Theoretical and Practical Aspects. Second edition, Springer-Verlag, Berlin Heidelberg, 2006.
Uses some MATLAB code as described in the book “Applied Numerical Analysis Using MATLAB” by L. V. Fausett.
## Rosenbrock function
rosenbrock <- function(x) {
    n <- length(x)
    x1 <- x[2:n]       # components x_2, ..., x_n
    x2 <- x[1:(n-1)]   # components x_1, ..., x_{n-1}
    sum(100*(x1 - x2^2)^2 + (1 - x2)^2)
}
fletcher_powell(c(0, 0), rosenbrock)
#> $xmin
#> [1] 1 1
#>
#> $fmin
#> [1] 1.55316e-17
#>
#> $niter
#> [1] 14
#>
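Since the argument g allows an analytic gradient to be supplied, the same problem can be solved without numerical differentiation. The gradient below is derived by hand for the two-dimensional Rosenbrock function (it is not part of the package documentation):

## Hand-derived gradient of the 2-D Rosenbrock function
rosenbrock_grad <- function(x) {
    c(-400*x[1]*(x[2] - x[1]^2) - 2*(1 - x[1]),
       200*(x[2] - x[1]^2))
}
fletcher_powell(c(0, 0), rosenbrock, g = rosenbrock_grad)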