Perturbation function

In mathematical optimization, the perturbation function is any function which relates to primal and dual problems. The name comes from the fact that any such function defines a perturbation of the initial problem. In many cases this takes the form of shifting the constraints.[1]

In some texts the value function is called the perturbation function, and the perturbation function is called the bifunction.[2]

Definition

Given two dual pairs of separated locally convex spaces $\left(X, X^{*}\right)$ and $\left(Y, Y^{*}\right)$, and a function $f\colon X \to \mathbb{R} \cup \{+\infty\}$, we can define the primal problem by

$$\inf_{x \in X} f(x).$$

If there are constraint conditions, these can be built into the function $f$ by letting $f \leftarrow f + I_{\mathrm{constraints}}$, where $I$ is the characteristic (indicator) function of the constraint set. Then $F\colon X \times Y \to \mathbb{R} \cup \{+\infty\}$ is a perturbation function if and only if $F(x, 0) = f(x)$.[1][3]
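As a rough numerical illustration (a toy one-dimensional instance invented here, not taken from the references), the following sketch folds an inequality constraint into the objective via the characteristic function and checks that the constraint-shifting choice of $F$ indeed satisfies $F(x, 0) = f(x)$:

```python
import numpy as np

# Toy instance chosen for illustration (not from the references):
# minimize f0(x) = (x - 2)**2 subject to g(x) = x - 1 <= 0.
f0 = lambda x: (x - 2.0) ** 2
g  = lambda x: x - 1.0

def indicator(condition):
    """Characteristic function in the convex-analysis sense: 0 if the condition holds, +inf otherwise."""
    return 0.0 if condition else np.inf

def f(x):
    """Objective with the constraint folded in: f = f0 + I_{constraints}."""
    return f0(x) + indicator(g(x) <= 0.0)

def F(x, y):
    """Constraint-shifting perturbation: the feasible set becomes {x : g(x) <= y}."""
    return f0(x) + indicator(g(x) <= y)

# F(., 0) recovers the constrained objective, so this F is a valid perturbation function.
xs = np.linspace(-1.0, 3.0, 9)
assert all(F(x, 0.0) == f(x) for x in xs)
```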

Use in duality

The duality gap is the difference between the right- and left-hand sides of the inequality

$$\sup_{y^{*} \in Y^{*}} -F^{*}(0, y^{*}) \leq \inf_{x \in X} F(x, 0),$$

where $F^{*}$ is the convex conjugate in both variables.[3][4]

For any choice of perturbation function $F$, weak duality holds. There are a number of conditions which, if satisfied, imply strong duality.[3] For instance, if $F$ is proper, jointly convex, and lower semi-continuous with $0 \in \operatorname{core}\left({\Pr}_{Y}(\operatorname{dom} F)\right)$ (where $\operatorname{core}$ is the algebraic interior and ${\Pr}_{Y}$ is the projection onto $Y$ defined by ${\Pr}_{Y}(x, y) = y$), and $X$, $Y$ are Fréchet spaces, then strong duality holds.[1]
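A minimal sketch of the weak duality inequality for the same toy instance, approximating the infimum and supremum on crude grids (the grid ranges and tolerance below are arbitrary choices), might look as follows:

```python
import numpy as np

# Same hypothetical toy instance as above: f0(x) = (x - 2)**2, g(x) = x - 1 <= 0,
# with the constraint-shifting perturbation F(x, y) = f0(x) + I_{g(x) <= y}.
f0 = lambda x: (x - 2.0) ** 2
g  = lambda x: x - 1.0

xs     = np.linspace(-4.0, 6.0, 4001)        # crude grid standing in for X = R
ystars = np.linspace(-8.0, 0.0, 801)         # dual grid; F*(0, y*) = +inf for y* > 0 in this example

def neg_conjugate_at_zero(ystar):
    """-F*(0, y*): for y* <= 0 the supremum over admissible y sits at y = g(x),
    so only a maximisation over x remains."""
    return -np.max(ystar * g(xs) - f0(xs))

dual   = max(neg_conjugate_at_zero(t) for t in ystars)   # sup_{y*} -F*(0, y*)
primal = np.min(np.where(g(xs) <= 0.0, f0(xs), np.inf))  # inf_x F(x, 0)

print(dual, primal)          # weak duality: dual <= primal (both come out near 1.0 here)
assert dual <= primal + 1e-6
```

For this instance the two values coincide, which is consistent with the strong duality conditions stated above.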

Examples

Lagrangian

Let $(X, X^{*})$ and $(Y, Y^{*})$ be dual pairs. Given a primal problem (minimize $f(x)$) and a related perturbation function $F(x, y)$, the Lagrangian $L\colon X \times Y^{*} \to \mathbb{R} \cup \{\pm\infty\}$ is the negative conjugate of $F$ with respect to $y$ (i.e. the concave conjugate). That is, the Lagrangian is defined by

$$L(x, y^{*}) = \inf_{y \in Y} \left\{ F(x, y) - y^{*}(y) \right\}.$$

In particular, the weak duality minimax inequality can be shown to be

$$\sup_{y^{*} \in Y^{*}} -F^{*}(0, y^{*}) = \sup_{y^{*} \in Y^{*}} \inf_{x \in X} L(x, y^{*}) \leq \inf_{x \in X} \sup_{y^{*} \in Y^{*}} L(x, y^{*}) = \inf_{x \in X} F(x, 0).$$
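This chain can be illustrated numerically for the toy instance above by computing $L$ directly from its defining infimum on a grid; the grids below are arbitrary, and the comparison is only an approximation away from the grid points:

```python
import numpy as np

# Continuing the hypothetical toy instance: f0(x) = (x - 2)**2, g(x) = x - 1,
# F(x, y) = f0(x) + I_{g(x) <= y}.  All infima and suprema are crude grid approximations.
xs     = np.linspace(-4.0, 6.0, 201)
ys     = np.linspace(-10.0, 10.0, 401)
ystars = np.linspace(-10.0, 0.0, 101)    # for y* > 0 the infimum defining L is -inf, so it is omitted

f0 = (xs - 2.0) ** 2                     # f0 evaluated on the x grid
gx = xs - 1.0                            # g  evaluated on the x grid

# F on the (x, y) grid: +inf wherever the shifted constraint g(x) <= y fails.
F = np.where(gx[:, None] <= ys[None, :], f0[:, None], np.inf)

# L(x, y*) = inf_y { F(x, y) - y*(y) }, one column per value of y*.
L = np.stack([np.min(F - t * ys[None, :], axis=1) for t in ystars], axis=1)

sup_inf = np.max(np.min(L, axis=0))      # sup_{y*} inf_x L(x, y*)
inf_sup = np.min(np.max(L, axis=1))      # inf_x sup_{y*} L(x, y*)
primal  = np.min(np.where(gx <= 0.0, f0, np.inf))   # inf_x F(x, 0)

print(sup_inf, inf_sup, primal)          # all three come out near 1.0 for this instance
assert sup_inf <= inf_sup <= primal      # the weak duality chain
```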

If the primal problem is given by

$$\inf_{x : g(x) \leq 0} f(x) = \inf_{x \in X} \tilde{f}(x)$$

where $\tilde{f}(x) = f(x) + I_{\mathbb{R}_{+}^{d}}(-g(x))$. If the perturbed problem is given by

$$\inf_{x : g(x) \leq y} f(x)$$

then the perturbation function is

$$F(x, y) = f(x) + I_{\mathbb{R}_{+}^{d}}(y - g(x)).$$

Thus the connection to Lagrangian duality becomes apparent, as $L$ can readily be computed to be

$$L(x, y^{*}) = {\begin{cases} f(x) - y^{*}(g(x)) & {\text{if }} y^{*} \in \mathbb{R}_{-}^{d}, \\ -\infty & {\text{else}}. \end{cases}}$$
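This case formula can be checked against the defining infimum for a scalar instance ($d = 1$); the test problem and grid below are hypothetical choices for illustration only:

```python
import numpy as np

# Same hypothetical scalar instance: f(x) = (x - 2)**2, g(x) = x - 1, d = 1,
# with F(x, y) = f(x) + I_{R_+}(y - g(x)).
f = lambda x: (x - 2.0) ** 2
g = lambda x: x - 1.0

ys = np.linspace(-50.0, 50.0, 100001)    # wide, fine grid so the infimum over y is well approximated

def L_from_definition(x, ystar):
    Fy = np.where(g(x) <= ys, f(x), np.inf)     # F(x, y) along the y grid
    return np.min(Fy - ystar * ys)

def L_closed_form(x, ystar):
    return f(x) - ystar * g(x) if ystar <= 0.0 else -np.inf

# The y* <= 0 branch agrees with the grid infimum up to grid resolution; for y* > 0
# the grid value keeps decreasing as the y grid widens, matching the -inf branch.
for x in (-1.0, 0.5, 3.0):
    for ystar in (-4.0, -1.0, 0.0):
        assert abs(L_from_definition(x, ystar) - L_closed_form(x, ystar)) < 1e-2
```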

Fenchel duality

Let $(X, X^{*})$ and $(Y, Y^{*})$ be dual pairs. Assume there exists a linear map $T\colon X \to Y$ with adjoint operator $T^{*}\colon Y^{*} \to X^{*}$. Assume that the primal objective function $f(x)$ (including the constraints by way of the indicator function) can be written as $f(x) = J(x, Tx)$ for some $J\colon X \times Y \to \mathbb{R} \cup \{+\infty\}$. Then the perturbation function is given by

$$F(x, y) = J(x, Tx - y).$$

In particular, if the primal objective is $f(x) + g(Tx)$, then the perturbation function is given by $F(x, y) = f(x) + g(Tx - y)$, which is the traditional definition of Fenchel duality.[5]
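A small sketch of this set-up, with the hypothetical choices $X = Y = \mathbb{R}$, $Tx = 2x$, $f(x) = x^{2}$ and $g(z) = |z - 1|$, checks weak duality for the resulting perturbation function on a grid (grid ranges and tolerance are arbitrary):

```python
import numpy as np

# Toy Fenchel instance chosen for illustration: X = Y = R, T x = 2 x,
# f(x) = x**2, g(z) = |z - 1|, so F(x, y) = f(x) + g(T x - y).
T = lambda x: 2.0 * x
f = lambda x: x ** 2
g = lambda z: np.abs(z - 1.0)

xs     = np.linspace(-5.0, 5.0, 1001)
ys     = np.linspace(-20.0, 20.0, 2001)
ystars = np.linspace(-3.0, 3.0, 121)

primal = np.min(f(xs) + g(T(xs)))                  # inf_x F(x, 0) = inf_x f(x) + g(Tx)

def neg_conjugate_at_zero(ystar):
    """-F*(0, y*) = -sup_{x, y} ( y*(y) - f(x) - g(Tx - y) ), approximated on the grids."""
    vals = ystar * ys[None, :] - f(xs)[:, None] - g(T(xs)[:, None] - ys[None, :])
    return -np.max(vals)

dual = max(neg_conjugate_at_zero(t) for t in ystars)   # sup_{y*} -F*(0, y*)

print(primal, dual)        # both come out near 0.25 here; weak duality guarantees dual <= primal
assert dual <= primal + 1e-6
```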

References

  1. ^ a b c Radu Ioan Boţ; Gert Wanka; Sorin-Mihai Grad (2009). Duality in Vector Optimization. Springer. ISBN 978-3-642-02885-4.
  2. ^ J. P. Ponstein (2004). Approaches to the Theory of Optimization. Cambridge University Press. ISBN 978-0-521-60491-8.
  3. ^ a b c Zălinescu, C. (2002). Convex analysis in general vector spaces. River Edge, NJ: World Scientific Publishing Co., Inc. pp. 106–113. ISBN 981-238-067-1. MR 1921556.
  4. ^ Ernö Robert Csetnek (2010). Overcoming the failure of the classical generalized interior-point regularity conditions in convex optimization. Applications of the duality theory to enlargements of maximal monotone operators. Logos Verlag Berlin GmbH. ISBN 978-3-8325-2503-3.
  5. ^ Radu Ioan Boţ (2010). Conjugate Duality in Convex Optimization. Springer. p. 68. ISBN 978-3-642-04899-9.