\[
\begin{aligned}
& \underset{x}{\text{minimize}} & & f_0(x) \\
& \text{subject to} & & f_i(x) \le 0,\quad i=1,\dots,m,\\
& & & A x = b.
\end{aligned}
\]
Variable: \(x\in\mathbb R^n\)
Equality constraints: linear (\(A x = b\))
Inequality constraints: \(f_1,\dots,f_m\) are convex functions
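As a concrete sketch, SciPy's SLSQP solver can handle a tiny instance of this standard form. The problem data below (objective, constraint, and right-hand side) is invented purely for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# A small instance of the standard form above (all data is illustrative):
#   minimize   f0(x) = x1^2 + x2^2
#   subject to f1(x) = 1 - x1 <= 0    (i.e. x1 >= 1)
#              A x = b with A = [1, 1], b = [3]
f0 = lambda x: x[0]**2 + x[1]**2

constraints = [
    # SciPy expects g(x) >= 0, so f1(x) <= 0 is passed as -f1(x) >= 0
    {'type': 'ineq', 'fun': lambda x: -(1 - x[0])},
    # Linear equality constraint A x = b
    {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 3},
]

res = minimize(f0, x0=np.array([0.0, 0.0]), constraints=constraints)
print(res.x)  # optimum at roughly [1.5, 1.5]
```

Because the objective and constraints are convex, the solver's local optimum is also the global one, matching the guarantee discussed below.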
A function \(f\) is convex if, for all \(x,y\) and \(\theta\in[0,1]\), \[
f\bigl(\theta x+(1-\theta)y\bigr)\;\le\;\theta\,f(x)\;+\;(1-\theta)\,f(y),
\] i.e. it has non-negative (upward) curvature.
However, equating "hard" with "nonlinear" is too simplistic. The key distinction is actually convexity:
Convex problems (nonnegative curvature) can be solved reliably in polynomial time using interior-point methods, gradient-based schemes with global convergence guarantees, and other efficient algorithms. The feasible set and objective have no ‘valleys’ that trap algorithms in suboptimal points.
Nonconvex problems (regions of negative curvature) are generally NP-hard. They can exhibit multiple local minima, saddle points, and complex landscape features, so finding the global minimum is computationally intractable in the worst case (unless P=NP).
Thus, the presence of negative curvature (nonconvexity) — not merely ‘nonlinearity’ — is what makes an optimization problem hard to solve.
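The defining inequality can be spot-checked numerically. The sketch below (function names and sampling ranges are my own choices, not from the original) draws random triples \((x, y, \theta)\) and tests whether \(f(\theta x + (1-\theta)y) \le \theta f(x) + (1-\theta)f(y)\) ever fails:

```python
import numpy as np

# Example functions: x^2 is convex; adding a sine term breaks convexity
convex_f = lambda x: x**2
nonconvex_f = lambda x: x**2 + 5 * np.sin(3 * x)

def violates_convexity(f, n=10_000, seed=0):
    """Return True if some sampled (x, y, theta) violates the convexity inequality."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-3, 3, n)
    y = rng.uniform(-3, 3, n)
    t = rng.uniform(0, 1, n)
    lhs = f(t * x + (1 - t) * y)
    rhs = t * f(x) + (1 - t) * f(y)
    return bool(np.any(lhs > rhs + 1e-9))

print(violates_convexity(convex_f))     # False: x^2 passes every sampled test
print(violates_convexity(nonconvex_f))  # True: the sine term creates violations
```

A random check like this can only refute convexity, never prove it, but it makes the geometric picture concrete before the plots below.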
Code

```python
import numpy as np
import matplotlib.pyplot as plt

# Define convex and nonconvex functions
convex = lambda x: x**2
nonconvex = lambda x: x**2 + 5 * np.sin(3 * x)

# Domain
x = np.linspace(-3, 3, 500)

# Plot both functions
plt.figure()
plt.plot(x, convex(x), label='Convex: $f(x)=x^2$')
plt.plot(x, nonconvex(x), label=r'Nonconvex: $f(x)=x^2+5\sin(3x)$')

# Mark minima
# For convex, the unique minimum is at x = 0
plt.scatter(0, convex(0), color='black', label='Convex minimum')

# For nonconvex, approximate local minima via sign changes of the discrete slope
d = nonconvex(x)
idx = (np.diff(np.sign(np.diff(d))) > 0).nonzero()[0] + 1
plt.scatter(x[idx], d[idx], color='red', s=20, label='Nonconvex local minima')

plt.title('Convex vs Nonconvex Landscapes')
plt.xlabel('x')
plt.ylabel('f(x)')
plt.legend()
plt.tight_layout()
plt.show()
```