Lecture 17: Deterministic, Unconstrained Optimization. The trade-off of approximation accuracy versus computing time. Basics from calculus about minima. The gradient-descent method, its pros and cons. Taylor expansion as the motivation for Newton's method; Newton's method as gradient descent with an adaptive step size; pros and cons. Coordinate descent as an alternative to simultaneous multivariate optimization. The Nelder-Mead (simplex) method for derivative-free optimization. Peculiarities of optimizing statistical functionals: don't bother optimizing much within the margin of error; asymptotic calculation of that margin, using Taylor expansion and the rules for adding and multiplying variances. Illustrations with R's optim.
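A minimal sketch of fixed-step gradient descent, in Python rather than the class's R, on a hypothetical quadratic test function (the function, step size, and tolerance here are my choices for illustration, not from the lecture):

```python
# Minimize f(x, y) = (x - 1)^2 + 10 * (y + 2)^2 by fixed-step gradient descent.
# The gradient is worked out by hand from calculus.

def grad_f(x, y):
    """Gradient of f(x, y) = (x - 1)^2 + 10 * (y + 2)^2."""
    return (2 * (x - 1), 20 * (y + 2))

def gradient_descent(x, y, step=0.04, tol=1e-8, max_iter=10_000):
    """Take steps of fixed size against the gradient; stop when the
    step barely moves us (a crude but common convergence test)."""
    for _ in range(max_iter):
        gx, gy = grad_f(x, y)
        new_x, new_y = x - step * gx, y - step * gy
        if abs(new_x - x) + abs(new_y - y) < tol:
            break
        x, y = new_x, new_y
    return x, y

x_min, y_min = gradient_descent(0.0, 0.0)
```

Note the "con" this example exhibits: the fixed step must be small enough for the most curved direction (the y-axis, with curvature 20), so progress along the flatter x-axis is slow.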
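Newton's method in one dimension, again as a Python sketch: the second-order Taylor expansion of f around x suggests the update x - f'(x)/f''(x), i.e., gradient descent whose step size 1/f''(x) adapts to the local curvature. The test function f(x) = (x - 2)^4 + (x - 2)^2 is a hypothetical example, chosen so f'' > 0 everywhere:

```python
# Newton's method for minimizing f(x) = (x - 2)^4 + (x - 2)^2.
# Derivatives computed by hand from calculus.

def d1(x):
    """First derivative f'(x)."""
    return 4 * (x - 2) ** 3 + 2 * (x - 2)

def d2(x):
    """Second derivative f''(x); positive for all x, so Newton steps
    always move downhill here."""
    return 12 * (x - 2) ** 2 + 2

def newton_minimize(x, tol=1e-10, max_iter=50):
    """Iterate x <- x - f'(x)/f''(x) until the gradient is ~zero."""
    for _ in range(max_iter):
        g = d1(x)
        if abs(g) < tol:
            break
        x = x - g / d2(x)
    return x

x_star = newton_minimize(0.0)
```

Compared with the fixed-step method, this converges in a handful of iterations, but each step needs the second derivative, which is the usual cost of Newton's method in higher dimensions (the Hessian).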
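The "margin of error" point can be illustrated with the delta method: a first-order Taylor expansion gives Var(g(X̄)) ≈ g'(μ)² σ²/n. A Python sketch with a hypothetical example (X exponential with rate 1, g = log; the simulation sizes are arbitrary choices), checked by Monte Carlo:

```python
import math
import random

random.seed(42)

# Delta method: Var(g(X_bar)) ~ g'(mu)^2 * sigma^2 / n, from a first-order
# Taylor expansion of g around mu plus the rule Var(X_bar) = sigma^2 / n.
# Example: X ~ Exponential(rate 1), so mu = 1 and sigma^2 = 1; g(x) = log(x),
# so g'(mu) = 1/mu = 1.
n = 100
mu, sigma2 = 1.0, 1.0
delta_var = (1.0 / mu) ** 2 * sigma2 / n  # = 0.01

# Monte Carlo check: simulate many sample means, apply g, take the variance.
reps = 5_000
vals = []
for _ in range(reps):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    vals.append(math.log(xbar))
m = sum(vals) / reps
mc_var = sum((v - m) ** 2 for v in vals) / (reps - 1)
```

The practical moral from the lecture: if the statistical uncertainty in the objective is on the order of delta_var's square root, there is no point in driving the numerical optimizer's tolerance far below that.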