Polynomial SOS

In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms {\displaystyle g_{1}(x),\ldots ,g_{k}(x)} of degree m such that

{\displaystyle h(x)=\sum _{i=1}^{k}g_{i}(x)^{2}.}
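For instance, the binary quartic {\displaystyle x_{1}^{4}+2x_{1}^{2}x_{2}^{2}+x_{2}^{4}=(x_{1}^{2}+x_{2}^{2})^{2}} is SOS with k = 1 and {\displaystyle g_{1}(x)=x_{1}^{2}+x_{2}^{2}}.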

Every form that is SOS is also a nonnegative polynomial, and although the converse is not always true, Hilbert proved that for n = 2, for 2m = 2, or for n = 3 and 2m = 4 a form is SOS if and only if it is nonnegative.[1] The same is also valid for the analogous problem on nonnegative symmetric forms.[2][3]

Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.[4][5] Moreover, every real nonnegative form can be approximated as closely as desired (in the {\displaystyle l_{1}}-norm of its coefficient vector) by a sequence of forms {\displaystyle \{f_{\epsilon }\}} that are SOS.[6]
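A classical example of a form that is not SOS is the Motzkin form {\displaystyle x_{1}^{4}x_{2}^{2}+x_{1}^{2}x_{2}^{4}-3x_{1}^{2}x_{2}^{2}x_{3}^{2}+x_{3}^{6}}, which is nonnegative by the arithmetic–geometric mean inequality but admits no SOS decomposition.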

Square matricial representation (SMR)

Establishing whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as

{\displaystyle h(x)=x^{\{m\}'}\left(H+L(\alpha )\right)x^{\{m\}}}
where {\displaystyle x^{\{m\}}} is a vector containing a basis for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying
{\displaystyle h(x)=x^{\{m\}'}Hx^{\{m\}}}
and {\displaystyle L(\alpha )} is a linear parameterization of the linear space
{\displaystyle {\mathcal {L}}=\left\{L=L':~x^{\{m\}'}Lx^{\{m\}}=0\right\}.}

The dimension of the vector {\displaystyle x^{\{m\}}} is given by

{\displaystyle \sigma (n,m)={\binom {n+m-1}{m}},}
whereas the dimension of the vector {\displaystyle \alpha } is given by
{\displaystyle \omega (n,2m)={\frac {1}{2}}\sigma (n,m)\left(1+\sigma (n,m)\right)-\sigma (n,2m).}
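
The two dimension formulas can be evaluated directly. The following is a minimal sketch in Python, using only the standard library; the function names sigma and omega are chosen here simply to mirror the notation above.

    from math import comb

    def sigma(n, m):
        # Number of monomials of degree m in n variables: C(n + m - 1, m).
        return comb(n + m - 1, m)

    def omega(n, m):
        # Dimension of the parameter vector alpha for a form of degree 2m.
        s = sigma(n, m)
        return s * (s + 1) // 2 - sigma(n, 2 * m)

    print(sigma(2, 2), omega(2, 2))  # 3 1  (first example below)
    print(sigma(3, 2), omega(3, 2))  # 6 6  (second example below)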

Then, h(x) is SOS if and only if there exists a vector {\displaystyle \alpha } such that

{\displaystyle H+L(\alpha )\geq 0,}
meaning that the matrix {\displaystyle H+L(\alpha )} is positive semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression {\displaystyle h(x)=x^{\{m\}'}\left(H+L(\alpha )\right)x^{\{m\}}} was introduced in [7] with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix representation.[8]

Examples

  • Consider the form of degree 4 in two variables {\displaystyle h(x)=x_{1}^{4}-x_{1}^{2}x_{2}^{2}+x_{2}^{4}}. We have
    {\displaystyle m=2,~x^{\{m\}}={\begin{pmatrix}x_{1}^{2}\\x_{1}x_{2}\\x_{2}^{2}\end{pmatrix}}\!,~H+L(\alpha )={\begin{pmatrix}1&0&-\alpha _{1}\\0&-1+2\alpha _{1}&0\\-\alpha _{1}&0&1\end{pmatrix}}\!.}
    Since there exists α such that {\displaystyle H+L(\alpha )\geq 0}, namely {\displaystyle \alpha =1}, it follows that h(x) is SOS (a numerical version of this LMI feasibility test is sketched after these examples).
  • Consider the form of degree 4 in three variables {\displaystyle h(x)=2x_{1}^{4}-2.5x_{1}^{3}x_{2}+x_{1}^{2}x_{2}x_{3}-2x_{1}x_{3}^{3}+5x_{2}^{4}+x_{3}^{4}}. We have
    {\displaystyle m=2,~x^{\{m\}}={\begin{pmatrix}x_{1}^{2}\\x_{1}x_{2}\\x_{1}x_{3}\\x_{2}^{2}\\x_{2}x_{3}\\x_{3}^{2}\end{pmatrix}},~H+L(\alpha )={\begin{pmatrix}2&-1.25&0&-\alpha _{1}&-\alpha _{2}&-\alpha _{3}\\-1.25&2\alpha _{1}&0.5+\alpha _{2}&0&-\alpha _{4}&-\alpha _{5}\\0&0.5+\alpha _{2}&2\alpha _{3}&\alpha _{4}&\alpha _{5}&-1\\-\alpha _{1}&0&\alpha _{4}&5&0&-\alpha _{6}\\-\alpha _{2}&-\alpha _{4}&\alpha _{5}&0&2\alpha _{6}&0\\-\alpha _{3}&-\alpha _{5}&-1&-\alpha _{6}&0&1\end{pmatrix}}.}
    Since {\displaystyle H+L(\alpha )\geq 0} for {\displaystyle \alpha =(1.18,-0.43,0.73,1.13,-0.37,0.57)}, it follows that h(x) is SOS.
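
For the first example, the LMI feasibility test can be run numerically. The following is a minimal sketch, assuming the third-party Python packages numpy and cvxpy (with an SDP-capable solver such as the bundled SCS or Clarabel) are available; the variable names are ours.

    import numpy as np
    import cvxpy as cp

    # First example: h(x) = x1^4 - x1^2*x2^2 + x2^4 with basis (x1^2, x1*x2, x2^2).
    H = np.array([[1.0,  0.0, 0.0],
                  [0.0, -1.0, 0.0],
                  [0.0,  0.0, 1.0]])      # h(x) = x{m}' H x{m}
    L1 = np.array([[ 0.0, 0.0, -1.0],
                   [ 0.0, 2.0,  0.0],
                   [-1.0, 0.0,  0.0]])    # spans the space L (omega(2, 4) = 1)

    alpha = cp.Variable()
    problem = cp.Problem(cp.Minimize(0), [H + alpha * L1 >> 0])  # LMI feasibility
    problem.solve()
    print(problem.status, alpha.value)    # "optimal" (feasible), so h(x) is SOS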

Generalizations

Matrix SOS

A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms {\displaystyle G_{1}(x),\ldots ,G_{k}(x)} of degree m such that

{\displaystyle F(x)=\sum _{i=1}^{k}G_{i}(x)'G_{i}(x).}

Matrix SMR

Establishing whether a matrix form F(x) is SOS likewise amounts to solving a convex optimization problem. Indeed, similarly to the scalar case, any F(x) can be written according to the SMR as

{\displaystyle F(x)=\left(x^{\{m\}}\otimes I_{r}\right)'\left(H+L(\alpha )\right)\left(x^{\{m\}}\otimes I_{r}\right)}
where {\displaystyle \otimes } is the Kronecker product of matrices, H is any symmetric matrix satisfying
{\displaystyle F(x)=\left(x^{\{m\}}\otimes I_{r}\right)'H\left(x^{\{m\}}\otimes I_{r}\right)}
and {\displaystyle L(\alpha )} is a linear parameterization of the linear space
{\displaystyle {\mathcal {L}}=\left\{L=L':~\left(x^{\{m\}}\otimes I_{r}\right)'L\left(x^{\{m\}}\otimes I_{r}\right)=0\right\}.}

The dimension of the vector {\displaystyle \alpha } is given by

{\displaystyle \omega (n,2m,r)={\frac {1}{2}}r\left(\sigma (n,m)\left(r\sigma (n,m)+1\right)-(r+1)\sigma (n,2m)\right).}
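
This reduces to the scalar-case dimension for r = 1. Continuing the earlier Python sketch (the function name omega_matrix is ours):

    def omega_matrix(n, m, r):
        # Dimension of alpha for an r x r matrix form of degree 2m.
        s = sigma(n, m)
        return r * (s * (r * s + 1) - (r + 1) * sigma(n, 2 * m)) // 2

    print(omega_matrix(2, 2, 1))  # 1, matching omega(2, 2) in the scalar case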

Then, F(x) is SOS if and only if there exists a vector {\displaystyle \alpha } such that the following LMI holds:

{\displaystyle H+L(\alpha )\geq 0.}

The expression {\displaystyle F(x)=\left(x^{\{m\}}\otimes I_{r}\right)'\left(H+L(\alpha )\right)\left(x^{\{m\}}\otimes I_{r}\right)} was introduced in [9] in order to establish whether a matrix form is SOS via an LMI.

Noncommutative polynomial SOS

Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X1, ..., Xn) and equipped with the involution T, such that T fixes R and X1, ..., Xn and reverses words formed by X1, ..., Xn. By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form f = fT. A symmetric noncommutative polynomial f is said to be matrix-positive if, for every dimension r and every n-tuple of real symmetric r × r matrices substituted for the letters, the resulting matrix is positive semidefinite.

A noncommutative polynomial is SOS if there exist noncommutative polynomials {\displaystyle h_{1},\ldots ,h_{k}} such that

{\displaystyle f(X)=\sum _{i=1}^{k}h_{i}(X)^{T}h_{i}(X).}
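For instance, under the conventions above, {\displaystyle (X_{1}+X_{2})^{T}(X_{1}+X_{2})=X_{1}^{2}+X_{1}X_{2}+X_{2}X_{1}+X_{2}^{2}} is SOS with k = 1; the mixed terms X1X2 and X2X1 cannot be merged because the letters do not commute.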

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[10] Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.[11]

References

  1. ^ Hilbert, David (September 1888). "Ueber die Darstellung definiter Formen als Summe von Formenquadraten". Mathematische Annalen. 32 (3): 342–350. doi:10.1007/bf01443605. S2CID 177804714.
  2. ^ Choi, M. D.; Lam, T. Y. (1977). "An old question of Hilbert". Queen's Papers in Pure and Applied Mathematics. 46: 385–405.
  3. ^ Goel, Charu; Kuhlmann, Salma; Reznick, Bruce (May 2016). "On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms". Linear Algebra and Its Applications. 496: 114–120. arXiv:1505.08145. doi:10.1016/j.laa.2016.01.024. S2CID 17579200.
  4. ^ Lasserre, Jean B. (2007). "Sufficient conditions for a real polynomial to be a sum of squares". Archiv der Mathematik. 89 (5): 390–398. arXiv:math/0612358. CiteSeerX 10.1.1.240.4438. doi:10.1007/s00013-007-2251-y. S2CID 9319455.
  5. ^ Powers, Victoria; Wörmann, Thorsten (1998). "An algorithm for sums of squares of real polynomials" (PDF). Journal of Pure and Applied Algebra. 127 (1): 99–104. doi:10.1016/S0022-4049(97)83827-3.
  6. ^ Lasserre, Jean B. (2007). "A Sum of Squares Approximation of Nonnegative Polynomials". SIAM Review. 49 (4): 651–669. arXiv:math/0412398. Bibcode:2007SIAMR..49..651L. doi:10.1137/070693709.
  7. ^ Chesi, G.; Tesi, A.; Vicino, A.; Genesio, R. (1999). "On convexification of some minimum distance problems". Proceedings of the 5th European Control Conference. Karlsruhe, Germany: IEEE. pp. 1446–1451.
  8. ^ Choi, M.; Lam, T.; Reznick, B. (1995). "Sums of squares of real polynomials". Proceedings of Symposia in Pure Mathematics. pp. 103–125.
  9. ^ Chesi, G.; Garulli, A.; Tesi, A.; Vicino, A. (2003). "Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions". Proceedings of the 42nd IEEE Conference on Decision and Control. Maui, Hawaii: IEEE. pp. 4670–4675. doi:10.1109/CDC.2003.1272307.
  10. ^ Helton, J. William (September 2002). ""Positive" Noncommutative Polynomials Are Sums of Squares". The Annals of Mathematics. 156 (2): 675–694. doi:10.2307/3597203. JSTOR 3597203.
  11. ^ Burgdorf, Sabine; Cafuta, Kristijan; Klep, Igor; Povh, Janez (25 October 2012). "Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials". Computational Optimization and Applications. 55 (1): 137–153. CiteSeerX 10.1.1.416.543. doi:10.1007/s10589-012-9513-8. S2CID 254416733.
