# Review Questions for Optimization

1. What are the necessary and sufficient conditions for a point to be a local minimum in one dimension?
2. What are the necessary and sufficient conditions for a point to be a local minimum in $$n$$ dimensions?
3. How do you classify extrema as minima, maxima, or saddle points?
4. What is the difference between a local and a global minimum?
5. What does it mean for a function to be unimodal?
6. What special attribute does a function need to have for golden section search to find a minimum?
7. Run one iteration of golden section search.
8. Calculate the gradient of a function (function has many inputs, one output).
9. Calculate the Jacobian of a function (function has many inputs, many outputs).
10. Calculate the Hessian of a function.
11. Find the search direction in steepest/gradient descent.
12. Why must you perform a line search at each step of gradient descent?
13. Run one step of Newton's method in one dimension.
14. Run one step of Newton's method in $$n$$ dimensions.
15. When does Newton's method fail to converge to a minimum?
16. What operations do you need to perform in each iteration of golden section search?
17. What operations do you need to perform in each iteration of Newton's method in one dimension?
18. What operations do you need to perform in each iteration of Newton's method in $$n$$ dimensions?
19. What is the convergence rate of Newton's method?
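For questions 5–7 and 16, one iteration of golden section search can be sketched in Python as follows (the bracket, iteration count, and test function below are illustrative choices, not from the course):

```python
import math

PHI = (math.sqrt(5) - 1) / 2  # inverse golden ratio, ~0.618

def golden_section_step(f, a, b):
    """One iteration of golden section search on a unimodal f over [a, b].

    Evaluates f at the two interior golden-ratio points and discards
    the subinterval that cannot contain the minimum; the bracket
    shrinks by the factor PHI each iteration.
    """
    x1 = b - PHI * (b - a)   # left interior point
    x2 = a + PHI * (b - a)   # right interior point
    if f(x1) < f(x2):
        return a, x2         # minimum must lie in [a, x2]
    else:
        return x1, b         # minimum must lie in [x1, b]

# Shrink a bracket around the minimum of (x - 2)^2 at x = 2.
a, b = 0.0, 5.0
for _ in range(30):
    a, b = golden_section_step(lambda x: (x - 2.0) ** 2, a, b)
```

Note that this sketch recomputes both interior points for clarity; a careful implementation reuses one point and needs only a single new function evaluation per iteration, which is part of the answer to question 16.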
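Questions 8–10 can be made concrete with a small example; the functions $$f$$ and $$g$$ below are made up for illustration:

```python
import numpy as np

# Scalar-valued function of two inputs: f(x, y) = x^2 y + y^3.
def f(v):
    x, y = v
    return x**2 * y + y**3

def grad_f(v):
    # Gradient: vector of first partials, (df/dx, df/dy).
    x, y = v
    return np.array([2 * x * y, x**2 + 3 * y**2])

def hess_f(v):
    # Hessian: symmetric matrix of second partials.
    x, y = v
    return np.array([[2 * y, 2 * x],
                     [2 * x, 6 * y]])

# Vector-valued function g(x, y) = (x + y, x * y): its Jacobian has
# one row per output, one column per input.
def jac_g(v):
    x, y = v
    return np.array([[1.0, 1.0],
                     [y,   x]])

v0 = np.array([1.0, 2.0])
```

At `v0 = (1, 2)` the gradient is $$(4, 13)$$, the Hessian is $$\begin{pmatrix} 4 & 2 \\ 2 & 12 \end{pmatrix}$$, and the Jacobian of $$g$$ is $$\begin{pmatrix} 1 & 1 \\ 2 & 1 \end{pmatrix}$$.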
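Question 13 asks for one step of Newton's method in one dimension; a minimal sketch, using an illustrative quartic objective, is:

```python
def newton_step_1d(fp, fpp, x):
    """One 1-D Newton step for minimization: x_new = x - f'(x) / f''(x).

    Each iteration needs one first-derivative and one second-derivative
    evaluation (question 17). If f''(x) <= 0 the step can head toward
    a maximum or blow up, which is one way Newton's method fails to
    converge to a minimum (question 15).
    """
    return x - fp(x) / fpp(x)

# Illustrative objective f(x) = x^4 - 3x^2 + x:
fp = lambda x: 4 * x**3 - 6 * x + 1    # f'(x)
fpp = lambda x: 12 * x**2 - 6          # f''(x)
x = 1.5                                # start near a local minimum
for _ in range(10):
    x = newton_step_1d(fp, fpp, x)
```

Starting from 1.5, the iterates converge quadratically to the local minimum near $$x \approx 1.13$$ where $$f'(x) = 0$$ and $$f''(x) > 0$$.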
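Finally, for questions 14 and 18, one Newton step in $$n$$ dimensions, demonstrated here on the Rosenbrock function (a standard test problem, chosen for illustration):

```python
import numpy as np

def newton_step(grad_f, hess_f, x):
    """One Newton step in n dimensions: solve H(x) s = -grad f(x), take x + s.

    Each iteration costs a gradient, a Hessian, and one linear solve
    (question 18) -- never form the explicit inverse. If the Hessian is
    not positive definite, the step can head toward a saddle point or
    maximum, or fail outright when H is singular (question 15).
    """
    s = np.linalg.solve(hess_f(x), -grad_f(x))
    return x + s

# Rosenbrock: f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, minimum at (1, 1).
def grad(v):
    x, y = v
    return np.array([-2 * (1 - x) - 400 * x * (y - x**2),
                     200 * (y - x**2)])

def hess(v):
    x, y = v
    return np.array([[2 - 400 * (y - 3 * x**2), -400 * x],
                     [-400 * x, 200.0]])

x = np.array([-1.2, 1.0])   # classic starting point
for _ in range(20):
    x = newton_step(grad, hess, x)
```

From this start the iterates reach the minimizer $$(1, 1)$$ in a handful of steps, illustrating the quadratic local convergence asked about in question 19.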