Convex Optimization

A mathematical optimization problem can be expressed as

\begin{align*} \min \quad & f_0(x) \\ \text{subject to} \quad & f_i(x) \le b_i, \quad i=1, \ldots, m \end{align*}

where

  • \(x = (x_1, \ldots, x_n)\) is the optimization variable,
  • \(f_0 : \mathbf{R}^n \to \mathbf{R}\) is the objective function,
  • \(f_i : \mathbf{R}^n \to \mathbf{R}\), \(i = 1, \ldots, m\), are the constraint functions, and
  • \(b_1, \ldots, b_m\) are the limits of the constraints.

An optimal solution \(x^\star\) has the smallest value of \(f_0\) among all vectors that satisfy the constraints.

In general, an optimization problem is very difficult to solve: a solution cannot always be found, and even when it can be, finding it may involve some compromise, e.g. very long computation time.

Fortunately, the following three classes of problems can be solved efficiently and reliably.

Least squares (LS)

\begin{align*} \min \|Ax - b\|_2^2 \end{align*}
  • Analytical solution \(x^\star = (A^TA)^{-1}A^Tb\), assuming \(A\) has full column rank (see the sketch below)
  • Easy to recognize
  • A mature technology with reliable and efficient algorithms and software
  • Computation time proportional to \(n^2k\) (\(A\in \mathbf{R}^{k\times n}\))
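
As a minimal sketch (assuming NumPy; the matrix \(A\) and vector \(b\) are randomly generated placeholders), the analytical solution can be computed from the normal equations and cross-checked against a library solver:

```python
import numpy as np

# Hypothetical overdetermined system: k = 100 observations, n = 3 unknowns.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3))
b = rng.standard_normal(100)

# Analytical solution via the normal equations: x* = (A^T A)^{-1} A^T b.
# Requires A to have full column rank so that A^T A is invertible.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# np.linalg.lstsq solves the same problem and is numerically more stable.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

assert np.allclose(x_normal, x_lstsq)
```

Note that forming \(A^TA\) explicitly squares the condition number of the problem, which is why library routines factorize \(A\) directly.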

Linear programming (LP)

\begin{align*} \min \quad & c^Tx \\ \text{subject to} \quad & a_i^Tx \le b_i,\quad i=1, \ldots, m \end{align*}
  • No analytical solution
  • Not as easy to recognize as LS problems
  • A mature technology with reliable and efficient algorithms and software (see the sketch below)
  • Computation time proportional to \(n^2m\)
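
As a sketch using SciPy's linprog (the data below are made up for illustration; linprog additionally enforces \(x \ge 0\) by default, which is kept here):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance: minimize c^T x  s.t.  a_i^T x <= b_i,  x >= 0.
c = np.array([-1.0, -2.0])           # equivalent to maximizing x1 + 2*x2
A_ub = np.array([[1.0, 1.0],         # x1 + x2 <= 4
                 [0.0, 1.0]])        # x2 <= 2
b_ub = np.array([4.0, 2.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub)  # default bounds enforce x >= 0
print(res.x, res.fun)                   # expected: [2. 2.] -6.0
```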

Convex optimization

\begin{align*} \min \quad & f_0(x) \\ \text{subject to} \quad & f_i(x) \le b_i, \quad i=1, \ldots, m \end{align*}

where

  • Objective and constraint functions are convex, i.e. for all \(x, y\) and all \(\alpha, \beta \ge 0\) with \(\alpha + \beta = 1\), \(f_i(\alpha x + \beta y) \le \alpha f_i(x) + \beta f_i(y)\), \(i = 0, \ldots, m\) (checked numerically in the sketch below).
  • LS and LP are included as special cases.
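
As a quick numerical sanity check of the defining inequality, here is a made-up example with \(f(x) = \|x\|_2^2\), which is convex:

```python
import numpy as np

# f(x) = ||x||_2^2 is convex; spot-check the two-point (Jensen) inequality.
f = lambda x: np.sum(x ** 2)

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.standard_normal(5), rng.standard_normal(5)
    alpha = rng.uniform()                # alpha >= 0 and beta = 1 - alpha >= 0
    lhs = f(alpha * x + (1 - alpha) * y)
    rhs = alpha * f(x) + (1 - alpha) * f(y)
    assert lhs <= rhs + 1e-12            # small tolerance for rounding error
```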

Generally, for convex optimization problems,

  • No analytical solution
  • Often difficult to recognize
  • Almost a technology, with reliable and efficient algorithms (see the CVXPY sketch below)
  • Computation time roughly proportional to \(\max\{n^3, n^2m, F\}\), where \(F\) is the cost of evaluating the \(f_i\)'s and their first and second derivatives.
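
As a sketch of how such problems are posed and solved in practice (assuming the CVXPY package; the problem data are random placeholders), here is a constrained norm-minimization problem that is neither LS nor LP but is convex:

```python
import cvxpy as cp
import numpy as np

# Hypothetical data for a convex problem:
#   minimize ||Ax - b||_2  subject to  0 <= x <= 1 (elementwise).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x = cp.Variable(5)
prob = cp.Problem(cp.Minimize(cp.norm(A @ x - b, 2)),
                  [x >= 0, x <= 1])
prob.solve()

print(x.value, prob.value)
```

CVXPY checks convexity of the stated problem via its disciplined convex programming rules and then hands it to a backend solver.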