Optimization means seeking the minima or maxima of a function within a given domain.

If a differentiable function attains a maximum or minimum, the derivative at that point is 0. So applying the Newton-Raphson root-finding method to f' locates a stationary point of f, and the sign of f'' there tells us whether that point is a minimum or a maximum. The helper f2df below uses R's symbolic differentiation function D() to turn a one-argument function into its derivative.
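
As a quick illustration of the symbolic differentiation that f2df relies on (using the example function from the session at the end of this section):

> D(expression((x - 1/3)^2), "x")
2 * (x - 1/3)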

f2df <- function(fun) {
    # Build the derivative of a one-argument function by symbolic differentiation
    fun.list <- as.list(fun)
    var <- names(fun.list)[1]          # name of the single argument
    fun.exp <- fun.list[[2]]           # body of the function
    diff.fun <- D(fun.exp, var)        # derivative expression
    df <- list(0, diff.fun)            # formal argument (dummy default) plus new body
    names(df) <- c(var, "")            # keep the original argument name
    df <- as.function(df)
    return(df)
}
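
A quick sanity check (g is just a made-up test function, not part of the code above): f2df should return a function whose body is exactly the derivative that D() produces.

> g <- function(x) x^3 - 2*x
> dg <- f2df(g)
> body(dg)
3 * x^2 - 2
> dg(1)
[1] 1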

newton <- function(fun, x0, tol=1e-7, niter=100) {
    # Newton-Raphson root finding: x_{n+1} = x_n - fun(x_n)/fun'(x_n)
    df <- f2df(fun)
    for (i in 1:niter) {
        x <- x0 - fun(x0)/df(x0)
        if (abs(fun(x)) < tol)
            return(x)
        x0 <- x
    }
    stop("exceeded allowed number of iterations")
}
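
newton can also be used on its own for plain root finding, for example (h is a made-up test function) the positive root of x^2 - 2:

> h <- function(x) x^2 - 2
> newton(h, 1)
[1] 1.414214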
    
newton_optimize <- function(fun, x0, tol=1e-7, niter=100) {
    # Find a stationary point of fun by applying Newton-Raphson to f'
    df <- f2df(fun)
    x <- newton(df, x0, tol, niter)
    # Second-derivative test: f'' > 0 means a minimum, f'' < 0 a maximum
    ddf <- f2df(df)
    if (ddf(x) > 0) {
        cat("minimum:\t", x, "\n")
    } else {
        cat("maximum:\t", x, "\n")
    }
    return(x)
}
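
To exercise the maximum branch of the second-derivative test (p is a hypothetical example function, not from the session below):

> p <- function(x) -(x - 1)^2
> newton_optimize(p, 0)
maximum:  1 
[1] 1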

The golden-section method does not need f'. It is similar to the root-bracketing technique for root finding: it maintains a bracket (x1, x2, x3) with x2 between the endpoints and f(x2) smaller than both f(x1) and f(x3), and repeatedly shrinks the bracket until it is narrower than the tolerance.

gSection <- function(f, x1, x2, x3, tol=1e-7) {
    # Golden-section search for a minimum bracketed by (x1, x2, x3),
    # assuming f(x2) < f(x1) and f(x2) < f(x3)
    r <- 2 - (1+sqrt(5))/2            # 2 minus the golden ratio, approx 0.382
    x4 <- x2 + (x3-x2)*r              # probe point between x2 and x3
    if (abs(x3-x1) < tol) {
        return(x2)
    }
    if (f(x4) < f(x2)) {
        gSection(f, x2, x4, x3, tol)  # shrink to the bracket (x2, x4, x3)
    } else {
        gSection(f, x4, x2, x1, tol)  # shrink to (x4, x2, x1); reversing the order puts the next probe on the other side of x2
    }
}
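
gSection as written only searches for a minimum; to find a maximum, minimize the negated function instead (q is a hypothetical example). The built-in optimize(), used below, supports this directly through its maximum argument.

> q <- function(x) -(x - 1/3)^2
> gSection(function(x) -q(x), 0, 0.5, 1)
[1] 0.3333333
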
> f <- function(x) (x-1/3)^2
> newton_optimize(f, 0, tol=1e-7)
minimum:  0.3333333 
[1] 0.3333333
> gSection(f, 0,0.5,1)
[1] 0.3333333
> optimize(f, c(0,1), tol=1e-7)
$minimum
[1] 0.3333333

$objective
[1] 0