Publication Date


Degree Program

Department of Mathematics and Computer Science

Degree Type

Master of Science


This thesis begins with the history of operations research and introduces two of its major branches, linear and nonlinear optimization. While other methods are mentioned, the focus is on analytical methods used to solve nonlinear optimization problems. We briefly survey some of the most effective constrained methods for nonlinear optimization and then show how unconstrained methods often play a role in developing effective constrained optimization algorithms. In particular, we examine Newton and steepest descent methods, focusing primarily on Newton and quasi-Newton methods. Because Newton's method is primarily viewed as a root-finding method, we start with the basic root-finding algorithm for single-variable functions and show its progression into a useful, and often efficient, multivariable optimization algorithm. Comparisons are made between a pure Newton algorithm and a modified Newton algorithm, as well as between a pure steepest descent algorithm and a modified steepest descent algorithm. In examining nonlinear functions of varying complexity, we note some of the considerations that must be made when choosing an optimization program, as well as some of the difficulties that arise when using Newton's method or steepest descent methods for the optimization of a nonlinear function.
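The progression the abstract describes, from Newton's root-finding iteration to an optimizer, rests on the observation that a stationary point of f is a root of f'. As a minimal illustrative sketch (not the thesis's own code; the function names and the test function are hypothetical choices), the single-variable case can be written in Python, with a fixed-step steepest descent routine for comparison:

```python
def newton_root(f, df, x0, tol=1e-10, max_iter=50):
    """Find a root of f via the Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=50):
    """Minimize f by applying Newton root-finding to its derivative:
    a stationary point of f is a root of f'."""
    return newton_root(df, d2f, x0, tol, max_iter)

def steepest_descent(df, x0, alpha=0.05, tol=1e-10, max_iter=1000):
    """Minimize f with a fixed step length alpha along the negative gradient."""
    x = x0
    for _ in range(max_iter):
        step = alpha * df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Hypothetical test function: f(x) = x^4 - 3x^2 + 2,
# so f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6.
# The local minimizer near x = 1 is x* = sqrt(3/2).
df = lambda x: 4 * x**3 - 6 * x
d2f = lambda x: 12 * x**2 - 6

x_newton = newton_minimize(df, d2f, x0=1.0)
x_sd = steepest_descent(df, x0=1.0)
```

Both routines converge to the same stationary point here, but Newton's method does so in a handful of iterations at the cost of requiring second-derivative information, which previews the trade-offs the thesis examines.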


