Slater's condition

In mathematics, Slater's condition (or Slater condition) is a sufficient condition for strong duality to hold for a convex optimization problem, named after Morton L. Slater.[1] Informally, Slater's condition states that the feasible region must have an interior point (see technical details below).

Slater's condition is a specific example of a constraint qualification.[2] In particular, if Slater's condition holds for the primal problem, then the duality gap is 0, and if the dual value is finite then it is attained.[3]

Formulation

Let [math]\displaystyle{ f_1,\ldots,f_m }[/math] be real-valued functions on some subset D of R^n. We say that the functions satisfy the Slater condition if there exists some x in the relative interior of D for which f_i(x) < 0 for all i in 1, ..., m. We say that the functions satisfy the relaxed Slater condition if:[4]

  • Some k functions (say f_1, ..., f_k) are affine;
  • There exists some x in the relative interior of D for which f_i(x) ≤ 0 for all i in 1, ..., k, and f_i(x) < 0 for all i in k+1, ..., m (a numerical sketch of both conditions follows this list).
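
The following is a minimal numerical sketch of the two conditions, assuming Python with NumPy; the constraint functions f1 (quadratic) and f2 (affine) are hypothetical and chosen purely for illustration, not taken from the cited sources.

```python
import numpy as np

# Hypothetical constraints on D = R^2 (illustrative only):
#   f1(x) = x1^2 + x2^2 - 1   (convex, nonlinear)
#   f2(x) = x1 + x2 - 1       (affine)
def f1(x):
    return x[0]**2 + x[1]**2 - 1.0

def f2(x):
    return x[0] + x[1] - 1.0

def satisfies_slater(x, fs):
    """Slater condition: every f_i is strictly negative at x."""
    return all(f(x) < 0 for f in fs)

def satisfies_relaxed_slater(x, affine_fs, other_fs):
    """Relaxed Slater condition: affine constraints may hold with equality (<= 0),
    while the remaining constraints must hold strictly (< 0)."""
    return all(f(x) <= 0 for f in affine_fs) and all(f(x) < 0 for f in other_fs)

x = np.array([0.0, 0.0])                        # candidate point; relint(D) = R^2
print(satisfies_slater(x, [f1, f2]))            # True: f1(x) = -1 < 0, f2(x) = -1 < 0

y = np.array([0.5, 0.5])                        # f2(y) = 0, so only the relaxed condition holds at y
print(satisfies_slater(y, [f1, f2]))            # False
print(satisfies_relaxed_slater(y, [f2], [f1]))  # True: f2(y) = 0 <= 0, f1(y) = -0.5 < 0
```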

Application to convex optimization

Consider the optimization problem

[math]\displaystyle{ \text{Minimize }\; f_0(x) }[/math]
[math]\displaystyle{ \text{subject to: }\ }[/math]
[math]\displaystyle{ f_i(x) \le 0 , i = 1,\ldots,m }[/math]
[math]\displaystyle{ Ax = b }[/math]

where [math]\displaystyle{ f_0,\ldots,f_m }[/math] are convex functions. This is an instance of convex programming. Slater's condition for convex programming states that there exists an [math]\displaystyle{ x^* }[/math] that is strictly feasible, that is, a point at which all constraints are satisfied and the nonlinear constraints are satisfied with strict inequalities.

If a convex program satisfies Slater's condition (or the relaxed Slater condition) and is bounded from below, then strong duality holds; a small worked example is sketched below. Mathematically, this means that strong duality holds if there exists an [math]\displaystyle{ x^* \in \operatorname{relint}(D) }[/math] (where relint denotes the relative interior of the convex set [math]\displaystyle{ D := \cap_{i = 0}^m \operatorname{dom}(f_i) }[/math]) such that

[math]\displaystyle{ f_i(x^*) \lt 0, i = 1,\ldots,m, }[/math] (the convex, nonlinear constraints)
[math]\displaystyle{ Ax^* = b.\, }[/math][5]
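
As a small worked illustration (assuming Python with NumPy; the program below is hypothetical and not taken from the cited sources), consider minimizing [math]\displaystyle{ x_1^2 + x_2^2 }[/math] subject to [math]\displaystyle{ 1 - x_1 - x_2 \le 0 }[/math]. The point (1, 1) is strictly feasible, so Slater's condition holds, and the primal and dual optimal values coincide:

```python
import numpy as np

# Hypothetical convex program (illustrative only):
#   minimize   f0(x) = x1^2 + x2^2
#   subject to f1(x) = 1 - x1 - x2 <= 0
# Slater's condition holds: x = (1, 1) gives f1(x) = -1 < 0.

# Primal optimum (by symmetry): x* = (1/2, 1/2), so p* = 1/2.
p_star = 0.5

# Lagrangian: L(x, lam) = x1^2 + x2^2 + lam * (1 - x1 - x2).
# Minimizing over x gives x1 = x2 = lam / 2, hence the dual function
#   g(lam) = lam - lam^2 / 2.
lams = np.linspace(0.0, 5.0, 50_001)
g = lams - lams**2 / 2.0
d_star = g.max()                      # attained near lam* = 1

print(p_star, d_star)                 # 0.5 0.5 -> zero duality gap
print(abs(p_star - d_star) < 1e-6)    # True (up to grid resolution)
```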

Generalized inequalities

Given the problem

[math]\displaystyle{ \text{Minimize }\; f_0(x) }[/math]
[math]\displaystyle{ \text{subject to: }\ }[/math]
[math]\displaystyle{ f_i(x) \le_{K_i} 0 , i = 1,\ldots,m }[/math]
[math]\displaystyle{ Ax = b }[/math]

where [math]\displaystyle{ f_0 }[/math] is convex and [math]\displaystyle{ f_i }[/math] is [math]\displaystyle{ K_i }[/math]-convex for each [math]\displaystyle{ i }[/math], Slater's condition states that if there exists an [math]\displaystyle{ x^* \in \operatorname{relint}(D) }[/math] such that

[math]\displaystyle{ f_i(x^*) \lt_{K_i} 0, i = 1,\ldots,m }[/math] and
[math]\displaystyle{ Ax^* = b }[/math]

then strong duality holds.[5]
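
For instance, when [math]\displaystyle{ K_i }[/math] is the cone of positive semidefinite matrices, strict feasibility [math]\displaystyle{ f_i(x^*) \lt_{K_i} 0 }[/math] means that [math]\displaystyle{ -f_i(x^*) }[/math] lies in the interior of the cone, i.e. is positive definite. The sketch below (assuming Python with NumPy) checks this for a hypothetical linear matrix inequality; the matrices A1, A2 and B are illustrative only and do not come from the cited sources:

```python
import numpy as np

# Hypothetical semidefinite constraint (illustrative only):
#   f1(x) = x1*A1 + x2*A2 - B  <=_K  0,   K = cone of 2x2 PSD matrices
# Strict feasibility (f1(x) <_K 0) means the slack B - x1*A1 - x2*A2
# is in the interior of K, i.e. positive definite.
A1 = np.array([[1.0, 0.0], [0.0, 0.0]])
A2 = np.array([[0.0, 0.0], [0.0, 1.0]])
B  = np.array([[2.0, 0.5], [0.5, 2.0]])

def strictly_feasible(x):
    slack = B - x[0] * A1 - x[1] * A2
    return np.linalg.eigvalsh(slack).min() > 0   # positive definite <=> all eigenvalues > 0

print(strictly_feasible(np.array([1.0, 1.0])))   # True: slack eigenvalues are 0.5 and 1.5
print(strictly_feasible(np.array([2.0, 2.0])))   # False: slack = [[0, .5], [.5, 0]] is indefinite
```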

References

  1. Slater, Morton (1950). "Lagrange Multipliers Revisited". Cowles Commission Discussion Paper No. 403. https://cowles.yale.edu/sites/default/files/files/pub/d00/d0080.pdf. Reprinted in Giorgi, Giorgio; Kjeldsen, Tinne Hoff, eds. (2014). Traces and Emergence of Nonlinear Programming. Basel: Birkhäuser. pp. 293–306. ISBN 978-3-0348-0438-7. https://books.google.com/books?id=q4PBBAAAQBAJ&pg=PA293.
  2. Takayama, Akira (1985). Mathematical Economics. New York: Cambridge University Press. pp. 66–76. ISBN 0-521-25707-7. https://archive.org/details/mathematicalecon00taka/page/66. 
  3. Borwein, Jonathan; Lewis, Adrian (2006). Convex Analysis and Nonlinear Optimization: Theory and Examples (2nd ed.). Springer. ISBN 0-387-29570-4. 
  4. Nemirovsky and Ben-Tal (2023). "Optimization III: Convex Optimization". http://www2.isye.gatech.edu/~nemirovs/OPTIIILN2023Spring.pdf. 
  5. Boyd, Stephen; Vandenberghe, Lieven (2004). Convex Optimization (PDF). Cambridge University Press. ISBN 978-0-521-83378-3. https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf#page=240. Retrieved October 3, 2011.