Mathematical optimization is the study of selecting the best option from a set of available alternatives according to one or more criteria, subject to whatever restrictions apply. It arises across many areas of science and mathematics, and in its simplest form it asks for the minimum or maximum value of a function defined on some domain.

Problem formulation

An optimization problem typically specifies

  • an objective function that assigns a numeric value to each candidate solution (to be minimized or maximized),
  • decision variables whose values determine a candidate solution, and
  • a set of constraints that restrict the allowable values of the variables.
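The three ingredients can be made concrete with a small made-up problem (the objective, bounds, and constraint below are invented for illustration); a brute-force grid search stands in for a real solver:

```python
# Toy problem (hypothetical numbers): minimize the objective
#   f(x, y) = (x - 1)^2 + (y - 2)^2
# over decision variables x, y, subject to the constraints
#   x + y <= 2,  x >= 0,  y >= 0.

def objective(x, y):
    return (x - 1) ** 2 + (y - 2) ** 2

def feasible(x, y):
    return x >= 0 and y >= 0 and x + y <= 2

def grid_search(step_count=20):
    """Enumerate a grid over [0, 2] x [0, 2]; keep the best feasible point."""
    best_point, best_value = None, float("inf")
    for i in range(step_count + 1):
        for j in range(step_count + 1):
            x, y = 2 * i / step_count, 2 * j / step_count
            if feasible(x, y):
                value = objective(x, y)
                if value < best_value:
                    best_point, best_value = (x, y), value
    return best_point, best_value

point, value = grid_search()
print(point, value)  # constrained minimum at (0.5, 1.5) with value 0.5
```

Grid search is used here only to make the roles of objective, variables, and constraints explicit; it scales poorly, and the methods discussed later exploit problem structure instead of enumerating candidates.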

Problems differ by the nature of the objective and constraints (linear or nonlinear), by whether the variables are continuous or discrete, and by whether there is a single objective or a trade-off among several objectives.

Key distinctions

  • Convex vs nonconvex: convex problems have the property that any local optimum is global, which simplifies analysis and algorithms.
  • Continuous vs discrete: continuous problems allow variables to take any real values in an interval, while discrete or integer problems restrict some variables to integer values.
  • Single- vs multi-objective: when several criteria must be balanced, solutions are often described by trade-offs (Pareto optimality) rather than a single best value.
  • Deterministic vs stochastic: in stochastic optimization some problem data are random or uncertain and objectives involve expected values or risk measures.
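For the multi-objective case, Pareto optimality can be sketched briefly (the candidate points below are invented): a candidate is Pareto-optimal when no other candidate is at least as good in every objective and strictly better in at least one.

```python
# Hypothetical candidates scored on two objectives to minimize: (cost, time).
candidates = [(1, 9), (2, 7), (3, 8), (4, 4), (5, 5), (7, 2)]

def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return (all(ai <= bi for ai, bi in zip(a, b))
            and any(ai < bi for ai, bi in zip(a, b)))

# Keep exactly the undominated candidates: the Pareto front of trade-offs.
pareto_front = [c for c in candidates
                if not any(dominates(other, c) for other in candidates)]
print(pareto_front)  # [(1, 9), (2, 7), (4, 4), (7, 2)]
```

No point on this front is simply "best": moving between them improves one objective only at the expense of the other.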

Solution methods

Techniques vary with problem type. Common approaches include:

  • Analytic methods using calculus (setting derivatives or gradients to zero) for smooth unconstrained problems.
  • Iterative numerical algorithms such as gradient descent, Newton's method, and quasi-Newton methods for differentiable problems.
  • Specialized algorithms: the simplex method and interior-point methods for linear programming; branch-and-bound and cutting planes for integer programming.
  • Dynamic programming for problems with a sequential or stagewise structure.
  • Heuristic and metaheuristic methods (genetic algorithms, simulated annealing, particle swarm) when exact solutions are hard to obtain or the objective is noisy or discontinuous.
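A minimal sketch of the simplest of these, gradient descent, applied to a smooth one-dimensional function (the function, step size, and iteration count are chosen arbitrarily for illustration):

```python
def gradient_descent(grad, x0, learning_rate=0.1, iterations=100):
    """Repeatedly step against the gradient; converges for suitable step sizes."""
    x = x0
    for _ in range(iterations):
        x = x - learning_rate * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # approaches 3, the unique minimizer
```

Each update here contracts the distance to the minimizer by a constant factor; for harder problems the step size must be tuned or adapted, which is what the Newton and quasi-Newton variants automate.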

Applications and considerations

Optimization models are used to design experiments and systems, schedule resources, plan logistics, fit models to data, and control processes. When applying methods in practice, important issues include computational cost, sensitivity to data and modeling assumptions, feasibility of constraints, and whether a found solution is a global optimum or only a local one.
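One common practical response to the local-versus-global issue is to restart a local method from several initial points and keep the best result. A sketch on an invented nonconvex function (the function, starts, and step size are all assumptions for the example):

```python
# Hypothetical nonconvex objective f(x) = (x^2 - 1)^2 + 0.3 x:
# a shallow local minimum lies near x = +1, the global minimum near x = -1.

def f(x):
    return (x ** 2 - 1) ** 2 + 0.3 * x

def f_prime(x):
    return 4 * x * (x ** 2 - 1) + 0.3

def local_minimize(x0, learning_rate=0.05, iterations=200):
    """Plain gradient descent: finds only the minimum of the basin around x0."""
    x = x0
    for _ in range(iterations):
        x = x - learning_rate * f_prime(x)
    return x

# A single start at x = 2 falls into the inferior minimum near +1;
# restarting from several points and keeping the best recovers the global one.
single = local_minimize(2.0)
multi = min((local_minimize(x0) for x0 in [-2.0, -1.0, 0.0, 1.0, 2.0]), key=f)
print(single, multi)
```

Multi-start is only a heuristic guard: it improves the odds of finding the global optimum but, unlike convexity or exhaustive methods, it offers no guarantee.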

Because optimization is applied across disciplines, work in science and mathematics continues to expand algorithmic tools and theoretical understanding. At its core, the subject remains the formal study of choosing best values of variables for a given function under specified limits.