# Polynomial SOS

In mathematics, a form (i.e. a homogeneous polynomial) *h*(*x*) of degree 2*m* in the real *n*-dimensional vector *x* is a sum of squares of forms (SOS) if and only if there exist forms $g_1(x), \ldots, g_k(x)$ of degree *m* such that

$$ h(x) = \sum_{i=1}^{k} g_i(x)^2. $$

Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for *n* = 2, 2*m* = 2, or *n* = 3 and 2*m* = 4, a form is SOS if and only if it is positive.[1] The same is also valid for the analogous problem on positive *symmetric* forms.[2][3]
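Outside Hilbert's cases the converse indeed fails; the classical counterexample (*n* = 3, 2*m* = 6) is the Motzkin form $M(x) = x_1^4 x_2^2 + x_1^2 x_2^4 - 3 x_1^2 x_2^2 x_3^2 + x_3^6$, which is nonnegative by the AM–GM inequality applied to its first two monomials and $x_3^6$, yet is not SOS. A minimal numerical spot-check of its nonnegativity (a sketch, not a proof):

```python
# Spot-check nonnegativity of the Motzkin form (nonnegative but not SOS):
#   M(x1, x2, x3) = x1^4 x2^2 + x1^2 x2^4 - 3 x1^2 x2^2 x3^2 + x3^6
# Nonnegativity follows from AM-GM on the monomials x1^4 x2^2, x1^2 x2^4,
# and x3^6; the code below only samples a grid of points.

def motzkin(x1, x2, x3):
    return x1**4 * x2**2 + x1**2 * x2**4 - 3 * x1**2 * x2**2 * x3**2 + x3**6

grid = [i / 4 for i in range(-8, 9)]  # sample points in [-2, 2]
values = [motzkin(a, b, c) for a in grid for b in grid for c in grid]
print(min(values) >= -1e-12)  # no negative value found on the grid
```

The minimum value 0 is attained, e.g., at $x_1 = x_2 = x_3 = 1$.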

Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.[4][5] Moreover, every real nonnegative form can be approximated as closely as desired (in the $l_1$-norm of its coefficient vector) by a sequence of forms that are SOS.[6]

## Square matricial representation (SMR)

To establish whether a form *h*(*x*) is SOS amounts to solving a convex optimization problem. Indeed, any *h*(*x*) can be written as

$$ h(x) = x^{\{m\}'} \left( H + L(\alpha) \right) x^{\{m\}} $$

where $x^{\{m\}}$ is a vector containing a base for the forms of degree *m* in *x* (such as all monomials of degree *m* in *x*), the prime ′ denotes the transpose, *H* is any symmetric matrix satisfying

$$ h(x) = x^{\{m\}'} H x^{\{m\}} $$

and $L(\alpha)$ is a linear parameterization of the linear space

$$ \mathcal{L} = \left\{ L = L' : \; x^{\{m\}'} L \, x^{\{m\}} = 0 \right\}. $$

The dimension of the vector $x^{\{m\}}$ is given by

$$ \sigma(n,m) = \binom{n+m-1}{m} $$

whereas the dimension of the vector $\alpha$ is given by

$$ \omega(n,2m) = \frac{1}{2}\,\sigma(n,m)\left(1+\sigma(n,m)\right) - \sigma(n,2m). $$
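As a sketch, the two counting formulas above can be evaluated directly (the helper names `sigma` and `omega` are illustrative, not standard API):

```python
from math import comb

def sigma(n, m):
    """Number of monomials of degree m in n variables, i.e. dim of x^{m}."""
    return comb(n + m - 1, m)

def omega(n, m):
    """Dimension of the parameter vector alpha for forms of degree 2m."""
    s = sigma(n, m)
    return s * (s + 1) // 2 - sigma(n, 2 * m)

# Degree-4 forms in two variables: the base (x1^2, x1*x2, x2^2) has
# length 3, and the space L is parameterized by a single scalar alpha.
print(sigma(2, 2), omega(2, 2))  # 3 1
```

The count $\omega$ is the dimension of the symmetric matrices of size $\sigma(n,m)$ minus the number of independent coefficient constraints imposed by $x^{\{m\}'} L\, x^{\{m\}} = 0$.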

Then, *h*(*x*) is SOS if and only if there exists a vector $\alpha$ such that

$$ H + L(\alpha) \ge 0, $$

meaning that the matrix $H + L(\alpha)$ is positive-semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression $h(x) = x^{\{m\}'} \left( H + L(\alpha) \right) x^{\{m\}}$ was introduced in [7] with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix representation.[8]
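A minimal sketch of the SMR, using the binary quartic $h(x) = x_1^4 - x_1^2 x_2^2 + x_2^4$ as a worked instance (the matrix entries below are specific to this form): with base vector $x^{\{m\}} = (x_1^2,\, x_1 x_2,\, x_2^2)'$, the value $x^{\{m\}'}(H + L(\alpha))\,x^{\{m\}}$ reproduces *h*(*x*) for every $\alpha$, because the base vector annihilates $L(\alpha)$:

```python
# Sketch: SMR/Gram representation of h(x) = x1^4 - x1^2*x2^2 + x2^4
# with base vector x^{m} = (x1^2, x1*x2, x2^2)'. The matrix L(alpha)
# satisfies x^{m}' L(alpha) x^{m} = 0, so it never changes the polynomial.

def smr_value(x1, x2, alpha):
    base = [x1 * x1, x1 * x2, x2 * x2]
    # H + L(alpha) for this particular form:
    M = [[1.0,    0.0,               -alpha],
         [0.0,    -1.0 + 2 * alpha,   0.0],
         [-alpha, 0.0,                1.0]]
    return sum(base[i] * M[i][j] * base[j] for i in range(3) for j in range(3))

def h(x1, x2):
    return x1**4 - x1**2 * x2**2 + x2**4

# The SMR value matches h for any choice of alpha:
for alpha in (-1.0, 0.0, 0.7, 3.0):
    assert abs(smr_value(1.5, -0.5, alpha) - h(1.5, -0.5)) < 1e-9
print("L(alpha) is annihilated by the base vector: OK")
```

The free parameter $\alpha$ moves weight between the $(1,3)$ off-diagonal entry and the $(2,2)$ entry, both of which multiply the same monomial $x_1^2 x_2^2$.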

### Examples

- Consider the form of degree 4 in two variables $h(x) = x_1^4 - x_1^2 x_2^2 + x_2^4$. We have

$$ x^{\{m\}} = \begin{pmatrix} x_1^2 \\ x_1 x_2 \\ x_2^2 \end{pmatrix}, \qquad H + L(\alpha) = \begin{pmatrix} 1 & 0 & -\alpha_1 \\ 0 & -1 + 2\alpha_1 & 0 \\ -\alpha_1 & 0 & 1 \end{pmatrix}. $$

- Since there exists $\alpha$ such that $H + L(\alpha) \ge 0$, namely $\alpha_1 = 1$, it follows that
*h*(*x*) is SOS.
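For the quartic $h(x) = x_1^4 - x_1^2 x_2^2 + x_2^4$, the feasible choice $\alpha_1 = 1$ gives the positive-semidefinite Gram matrix $\begin{pmatrix}1 & 0 & -1\\ 0 & 1 & 0\\ -1 & 0 & 1\end{pmatrix}$ (eigenvalues 0, 1, 2), hence the explicit decomposition $h = (x_1^2 - x_2^2)^2 + (x_1 x_2)^2$. A sketch verifying this identity numerically:

```python
# Sketch: the PSD Gram matrix at alpha = 1 yields the decomposition
#   h(x) = (x1^2 - x2^2)^2 + (x1*x2)^2,
# which is checked against h on a grid of sample points.

def h(x1, x2):
    return x1**4 - x1**2 * x2**2 + x2**4

def sos(x1, x2):
    return (x1**2 - x2**2) ** 2 + (x1 * x2) ** 2

pts = [i / 3 for i in range(-6, 7)]
ok = all(abs(h(a, b) - sos(a, b)) < 1e-9 for a in pts for b in pts)
print(ok)  # True
```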

- Consider a form of degree 4 in three variables. Proceeding as in the previous example with the base vector $x^{\{m\}} = (x_1^2,\, x_1 x_2,\, x_1 x_3,\, x_2^2,\, x_2 x_3,\, x_3^2)'$, one searches for a vector $\alpha$ (here of dimension $\omega(3,4) = 6$) such that $H + L(\alpha) \ge 0$. If such an $\alpha$ exists, it follows that
*h*(*x*) is SOS.

## Generalizations

### Matrix SOS

A matrix form *F*(*x*) (i.e., a matrix whose entries are forms) of dimension *r* and degree 2*m* in the real *n*-dimensional vector *x* is SOS if and only if there exist matrix forms $G_1(x), \ldots, G_k(x)$ of degree *m* such that

$$ F(x) = \sum_{i=1}^{k} G_i(x)' G_i(x). $$

#### Matrix SMR

To establish whether a matrix form *F*(*x*) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case any *F*(*x*) can be written according to the SMR as

$$ F(x) = \left( x^{\{m\}} \otimes I_r \right)' \left( H + L(\alpha) \right) \left( x^{\{m\}} \otimes I_r \right) $$

where $\otimes$ is the Kronecker product of matrices, *H* is any symmetric matrix satisfying

$$ F(x) = \left( x^{\{m\}} \otimes I_r \right)' H \left( x^{\{m\}} \otimes I_r \right) $$

and $L(\alpha)$ is a linear parameterization of the linear space

$$ \mathcal{L} = \left\{ L = L' : \; \left( x^{\{m\}} \otimes I_r \right)' L \left( x^{\{m\}} \otimes I_r \right) = 0 \right\}. $$

The dimension of the vector $\alpha$ is given by

$$ \omega(n,2m,r) = \frac{1}{2} r \left( \sigma(n,m) \left( r\,\sigma(n,m) + 1 \right) - (r+1)\,\sigma(n,2m) \right). $$
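A sketch checking this counting formula (the helper names `sigma`, `omega_scalar`, and `omega_matrix` are illustrative), including that *r* = 1 recovers the scalar-case dimension:

```python
from math import comb

def sigma(n, m):
    # number of monomials of degree m in n variables
    return comb(n + m - 1, m)

def omega_scalar(n, m):
    s = sigma(n, m)
    return s * (s + 1) // 2 - sigma(n, 2 * m)

def omega_matrix(n, m, r):
    # dim of the symmetric matrices of size r*sigma(n,m), minus the number
    # of independent constraints from the r x r symmetric form of degree 2m
    s = sigma(n, m)
    return r * (s * (r * s + 1) - (r + 1) * sigma(n, 2 * m)) // 2

# r = 1 reduces to the scalar SMR parameter count:
print(all(omega_matrix(n, m, 1) == omega_scalar(n, m)
          for n in range(1, 5) for m in range(1, 4)))  # True
```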

Then, *F*(*x*) is SOS if and only if there exists a vector $\alpha$ such that the following LMI holds:

$$ H + L(\alpha) \ge 0. $$

The expression $F(x) = \left( x^{\{m\}} \otimes I_r \right)' \left( H + L(\alpha) \right) \left( x^{\{m\}} \otimes I_r \right)$ was introduced in [9] in order to establish whether a matrix form is SOS via an LMI.

### Noncommutative polynomial SOS

Consider the free algebra *R*⟨*X*⟩ generated by the *n* noncommuting letters *X* = (*X*₁, ..., *X*_{n}) and equipped with the involution ^{T}, such that ^{T} fixes *R* and *X*₁, ..., *X*_{n} and reverses words formed by *X*₁, ..., *X*_{n}. By analogy with the commutative case, the noncommutative symmetric polynomials *f* are the noncommutative polynomials of the form *f* = *f*^{T}. If evaluating a symmetric noncommutative polynomial *f* at any tuple of real matrices of any dimension *r* × *r* always yields a positive semidefinite matrix, then *f* is said to be matrix-positive.

A noncommutative polynomial *f* is SOS if there exist noncommutative polynomials $h_1, \ldots, h_k$ such that

$$ f(X) = \sum_{i=1}^{k} h_i(X)^{T} h_i(X). $$

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[10] Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.[11]
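A minimal sketch of matrix-positivity for the simplest SOS polynomial, $f(X) = X^{T} X$: evaluating it at any real matrix *A* gives $A^{T} A$, which is always positive semidefinite (the matrix *A* below is an arbitrary example value):

```python
# Sketch: f(X) = X^T X is SOS in R<X>, so evaluating it at any real
# matrix A must give a positive semidefinite matrix A^T A.

def transpose(M):
    return [[M[j][i] for j in range(len(M))] for i in range(len(M))]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_psd_2x2(S):
    # A symmetric 2x2 matrix is PSD iff its diagonal entries and
    # determinant are nonnegative.
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    return S[0][0] >= 0 and S[1][1] >= 0 and det >= -1e-12

A = [[1.0, -2.0], [3.0, 0.5]]   # an arbitrary (not even symmetric) matrix
S = matmul(transpose(A), A)     # f evaluated at A
print(is_psd_2x2(S))  # True
```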

## References

- Hilbert, David (September 1888). "Ueber die Darstellung definiter Formen als Summe von Formenquadraten". *Mathematische Annalen*. **32** (3): 342–350. doi:10.1007/bf01443605.
- Choi, M. D.; Lam, T. Y. (1977). "An old question of Hilbert". *Queen's Papers in Pure and Applied Mathematics*. **46**: 385–405.
- Goel, Charu; Kuhlmann, Salma; Reznick, Bruce (May 2016). "On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms". *Linear Algebra and Its Applications*. **496**: 114–120. arXiv:1505.08145. doi:10.1016/j.laa.2016.01.024.
- Lasserre, Jean B. (2007). "Sufficient conditions for a real polynomial to be a sum of squares". *Archiv der Mathematik*. **89** (5): 390–398. arXiv:math/0612358. CiteSeerX 10.1.1.240.4438. doi:10.1007/s00013-007-2251-y.
- Powers, Victoria; Wörmann, Thorsten (1998). "An algorithm for sums of squares of real polynomials". *Journal of Pure and Applied Algebra*. **127** (1): 99–104. doi:10.1016/S0022-4049(97)83827-3.
- Lasserre, Jean B. (2007). "A Sum of Squares Approximation of Nonnegative Polynomials". *SIAM Review*. **49** (4): 651–669. arXiv:math/0412398. doi:10.1137/070693709.
- Chesi, G.; Tesi, A.; Vicino, A.; Genesio, R. (1999). "On convexification of some minimum distance problems". *Proceedings of the 5th European Control Conference*. Karlsruhe, Germany: IEEE. pp. 1446–1451.
- Choi, M.; Lam, T.; Reznick, B. (1995). "Sums of squares of real polynomials". *Proceedings of Symposia in Pure Mathematics*. pp. 103–125.
- Chesi, G.; Garulli, A.; Tesi, A.; Vicino, A. (2003). "Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions". *Proceedings of the 42nd IEEE Conference on Decision and Control*. Maui, Hawaii: IEEE. pp. 4670–4675.
- Helton, J. William (September 2002). ""Positive" Noncommutative Polynomials Are Sums of Squares". *The Annals of Mathematics*. **156** (2): 675–694. doi:10.2307/3597203. JSTOR 3597203.
- Burgdorf, Sabine; Cafuta, Kristijan; Klep, Igor; Povh, Janez (25 October 2012). "Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials". *Computational Optimization and Applications*. **55** (1): 137–153. CiteSeerX 10.1.1.416.543. doi:10.1007/s10589-012-9513-8.