Convention. $O$ denotes a zero matrix of appropriate size.
Theorem (Singular Value Decomposition).
Every matrix $A \in M_{m\times n}(\mathbb{C})$ has an SVD:
$$A = U \Sigma V^*,$$
where $U \in M_m(\mathbb{C})$ and $V \in M_n(\mathbb{C})$ are unitary and $\Sigma = \operatorname{diag}(\sigma_1, \dots, \sigma_p) \in M_{m\times n}(\mathbb{R})$, $p = \min(m, n)$, is "diagonal".
Furthermore, the singular values $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_p \ge 0$
are uniquely determined.
If $A$ is square and the $\sigma_i$'s are distinct, the left and right singular vectors $u_i$ and $v_i$ are unique up to a multiplicative constant with modulus 1.
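As a quick numerical illustration of the theorem (not part of the notes' argument; the random test matrix and the seeds are my own), numpy's `svd` returns exactly the objects above, and the singular values agree with the eigenvalue route through $A^*A$, consistent with their uniqueness:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

U, s, Vh = np.linalg.svd(A)            # full SVD: U is 4x4, Vh is 3x3
Sigma = np.zeros((4, 3))
Sigma[:3, :3] = np.diag(s)

assert np.allclose(U @ Sigma @ Vh, A)              # A = U Sigma V*
assert np.allclose(U.conj().T @ U, np.eye(4))      # U unitary
assert np.allclose(Vh @ Vh.conj().T, np.eye(3))    # V unitary
assert np.all(np.diff(s) <= 0) and np.all(s >= 0)  # descending, nonnegative

# uniqueness: the sigma_i^2 are the eigenvalues of A*A, independent of U, V
eigs = np.sort(np.linalg.eigvalsh(A.conj().T @ A))[::-1]
assert np.allclose(s**2, eigs)
```

The last two lines are the usual reformulation: whatever unitary factors one picks, $V^* (A^* A) V = \Sigma^* \Sigma$, so the $\sigma_i^2$ are pinned down as eigenvalues.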
Remark. The technique in the proof below remains valid when all $\mathbb{C}$'s are replaced by $\mathbb{R}$'s.
Proof of Part (1). The case that $A = O$ is simple, so let's assume $A \neq O$. Let $\sigma_1 = \max_{\|x\| = 1} \|Ax\|$; then due to compactness of the unit sphere in $\mathbb{C}^n$ and the continuity of the map $x \mapsto \|Ax\|$, there must be $v_1$ with $\|v_1\| = 1$ s.t. $\|Av_1\| = \sigma_1$, so there is $u_1$, $\|u_1\| = 1$, $Av_1 = \sigma_1 u_1$. Hence $\sigma_1 = \|A\|_2$ is our first singular value.
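The compactness argument says the maximum of $\|Ax\|$ over the unit sphere is attained. A numerical illustration (my own, with a random real matrix): no sampled unit vector beats the operator norm, and the top right singular vector attains it.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 4))
sigma1 = np.linalg.norm(A, 2)          # operator 2-norm = sigma_1

# no unit vector beats sigma_1 ...
for _ in range(1000):
    x = rng.standard_normal(4)
    x /= np.linalg.norm(x)
    assert np.linalg.norm(A @ x) <= sigma1 + 1e-12

# ... and the maximum is attained at the top right singular vector v_1
v1 = np.linalg.svd(A)[2][0]            # first row of Vh, i.e. v_1
assert np.isclose(np.linalg.norm(A @ v1), sigma1)
u1 = A @ v1 / sigma1                   # A v_1 = sigma_1 u_1 with |u_1| = 1
assert np.isclose(np.linalg.norm(u1), 1.0)
```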
Extend $v_1$ to an o.n. basis $\{v_1, \dots, v_n\}$ of $\mathbb{C}^n$ and $u_1$ to an o.n. basis $\{u_1, \dots, u_m\}$ of $\mathbb{C}^m$. Let $V_1$ be the matrix with columns $v_1, \dots, v_n$ and $U_1$ be that with columns $u_1, \dots, u_m$, then (see remark next page for the explanation of $w^*$)
$$U_1^* A V_1 = \begin{pmatrix} \sigma_1 & w^* \\ O & B \end{pmatrix} =: A_1. \tag{1}$$
Now
$$\left\| A_1 \begin{pmatrix} \sigma_1 \\ w \end{pmatrix} \right\| = \left\| \begin{pmatrix} \sigma_1^2 + w^* w \\ Bw \end{pmatrix} \right\| \ge \sigma_1^2 + w^* w = \left\| \begin{pmatrix} \sigma_1 \\ w \end{pmatrix} \right\|^2,$$
so $\|A_1\|_2 \ge (\sigma_1^2 + w^* w)^{1/2}$; this implies $w = O$. Note that we have $\|A_1\|_2 = \|U_1^* A V_1\|_2 = \|A\|_2 = \sigma_1$, and the only assumption to derive this result is $\|Av_1\| = \|A\|_2$, with $\|v_1\| = 1$. We extract this as a technical corollary.
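The block structure forced by this argument can be checked numerically. In the sketch below (the helper `extend_to_onb` and the test matrix are my own), $v_1$ and $u_1$ are extended to o.n. bases via QR, and the resulting $U_1^* A V_1$ indeed has first row and column $(\sigma_1, 0, \dots, 0)$:

```python
import numpy as np

def extend_to_onb(x, rng):
    """O.n. basis, as a matrix, whose first column is the unit vector x."""
    M = np.column_stack([x, rng.standard_normal((x.size, x.size - 1))])
    Q, R = np.linalg.qr(M)
    return Q * np.sign(R[0, 0])          # fix the sign so column 0 equals +x

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
Uf, s, Vhf = np.linalg.svd(A)
sigma1, u1, v1 = s[0], Uf[:, 0], Vhf[0]

U1, V1 = extend_to_onb(u1, rng), extend_to_onb(v1, rng)
A1 = U1.T @ A @ V1

assert np.isclose(A1[0, 0], sigma1)      # top-left entry is sigma_1
assert np.allclose(A1[0, 1:], 0)         # w* = 0: first row is (sigma_1, 0, ..., 0)
assert np.allclose(A1[1:, 0], 0)         # first column below sigma_1 is 0
assert np.linalg.norm(A1[1:, 1:], 2) <= sigma1 + 1e-12   # ||B|| <= sigma_1
```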
Corollary. Let $A \in M_{m\times n}(\mathbb{C})$, $v \in \mathbb{C}^n$ with $\|v\| = 1$. Then if $\|Av\| = \|A\|_2$,
$$A\big(\{v\}^\perp\big) \subseteq \{Av\}^\perp.$$
The same is true when $\mathbb{C}$ is replaced by $\mathbb{R}$.
Proof. Repeat what we have done so far, i.e., replace $v_1$ by $v$ and $u_1$ by $Av/\|Av\|$ in the argument preceding the corollary. Then once $w = O$, one has $Av_j \perp Av$ for $j = 2, \dots, n$, i.e., $A(\{v\}^\perp) \subseteq \{Av\}^\perp$.
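A numerical illustration of the corollary (my own, random real matrix): taking $v$ to be a maximizer of $\|Ax\|$ on the unit sphere, $A$ maps every vector orthogonal to $v$ into the orthogonal complement of $Av$.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))
v = np.linalg.svd(A)[2][0]               # a maximizer of |Ax| on the unit sphere
assert np.isclose(np.linalg.norm(A @ v), np.linalg.norm(A, 2))  # |Av| = |A|

# every w orthogonal to v has Aw orthogonal to Av
for _ in range(100):
    w = rng.standard_normal(3)
    w -= (w @ v) * v                     # project w onto {v}^perp
    assert abs((A @ w) @ (A @ v)) < 1e-9
```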
To finish the proof let's induct on $p = \min(m, n)$. Suppose any matrix with $\min(m, n) = p - 1$ has SVD with uniquely determined singular values in descending order. Then for $A$ with $\min(m, n) = p$, by induction hypothesis and according to equation (1), $B = U_2 \Sigma_2 V_2^*$ with unique $\Sigma_2$, and the existence of SVD follows from the formula:
$$A = U_1 \begin{pmatrix} \sigma_1 & O \\ O & B \end{pmatrix} V_1^* = U_1 \begin{pmatrix} 1 & O \\ O & U_2 \end{pmatrix} \begin{pmatrix} \sigma_1 & O \\ O & \Sigma_2 \end{pmatrix} \begin{pmatrix} 1 & O \\ O & V_2 \end{pmatrix}^* V_1^*.$$
Note $\sigma_1 = \|A\|_2 \ge \|B\|_2$, so the diagonal of the middle factor is in descending order.
Although $\sigma_1$ is unique, it is dependent on $A$, while $B$ is dependent on the choice of basis. Fortunately under any changes of $\{u_2, \dots, u_m\}$ and $\{v_2, \dots, v_n\}$ to other o.n. bases, $U_1$ and $V_1$ will be replaced by other unitary matrices and $\Sigma$ remains unchanged, hence singular values of $A$ are unique. The proof is almost completed by induction, except for the base case $p = 1$, which is obvious by (1).
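The induction is in fact an algorithm: deflate off one singular triple at a time and reassemble by the formula above. Below is a sketch under my own assumptions (real matrices only, helper names `top_triple`/`extend_to_onb` are mine, and power iteration on $A^T A$ stands in for the compactness argument), not a production SVD:

```python
import numpy as np

def top_triple(A, iters=5000):
    """(sigma_1, u_1, v_1) via power iteration on A^T A; assumes A real, nonzero."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = A.T @ (A @ v)
        v /= np.linalg.norm(v)
    s = np.linalg.norm(A @ v)
    return s, A @ v / s, v

def extend_to_onb(x):
    """O.n. basis, as a matrix, whose first column is the unit vector x."""
    rng = np.random.default_rng(1)
    M = np.column_stack([x, rng.standard_normal((x.size, x.size - 1))])
    Q, R = np.linalg.qr(M)
    return Q * np.sign(R[0, 0])          # fix the sign so column 0 equals +x

def svd_by_deflation(A):
    """A = U @ S @ V.T following the induction in the proof (real case)."""
    m, n = A.shape
    s1, u1, v1 = top_triple(A)
    U1, V1 = extend_to_onb(u1), extend_to_onb(v1)
    if min(m, n) == 1:                   # base case p = 1: (1) is already diagonal
        return U1, U1.T @ A @ V1, V1
    B = (U1.T @ A @ V1)[1:, 1:]          # w = 0, so only the block B remains
    U2, S2, V2 = svd_by_deflation(B)     # induction hypothesis on B
    S = np.zeros((m, n)); S[0, 0] = s1; S[1:, 1:] = S2
    U = U1 @ np.block([[np.eye(1), np.zeros((1, m - 1))],
                       [np.zeros((m - 1, 1)), U2]])
    V = V1 @ np.block([[np.eye(1), np.zeros((1, n - 1))],
                       [np.zeros((n - 1, 1)), V2]])
    return U, S, V

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
U, S, V = svd_by_deflation(A)
assert np.allclose(U @ S @ V.T, A, atol=1e-6)
# uniqueness of the singular values: they match numpy's, whatever route we took
assert np.allclose(np.sort(np.diag(S[:3, :3]))[::-1],
                   np.linalg.svd(A, compute_uv=False), atol=1e-6)
```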
Proof of Part (2). Let's assume $A \in M_n(\mathbb{C})$ is square. It is clear that $\sigma_1 = \|A\|_2$ since $\sigma_1$ is the largest possible singular value of $A$. We first prove that if the right singular vector $v_1$ of $\sigma_1$ is not ``unique", then $\sigma_1$ is not simple, i.e., $\sigma_1$ is repeated in $\Sigma$.
Let $Av_1 = \sigma_1 u_1$, $\|v_1\| = \|u_1\| = 1$. Suppose there are other vectors $v$, $u$ with $\|v\| = \|u\| = 1$ s.t. $Av = \sigma_1 u$. For the sake of contradiction, let's assume $v \notin \operatorname{span}\{v_1\}$, then the unit vector $v' := \dfrac{v - \langle v, v_1 \rangle v_1}{\| v - \langle v, v_1 \rangle v_1 \|}$ is orthogonal to $v_1$. Now $\|Av'\| \le \sigma_1$, the inequality cannot be strict, otherwise since $v = \alpha v_1 + \beta v'$ with $|\alpha|^2 + |\beta|^2 = 1$, $\beta \ne 0$, and $Av' \perp Av_1$ by the corollary, we have
$$\sigma_1^2 = \|Av\|^2 = |\alpha|^2 \|Av_1\|^2 + |\beta|^2 \|Av'\|^2 < \left(|\alpha|^2 + |\beta|^2\right) \sigma_1^2 = \sigma_1^2,$$
absurd.
absurd. We conclude , for some unit vector . Now by the corollary one observes that
and thus we can get a complete list of singular values in which $\sigma_1$ appears twice, a contradiction. Hence if the $\sigma_i$'s are distinct, $v \in \operatorname{span}\{v_1\}$, i.e., $v$ and $v_1$ differ by a multiplicative constant with modulus 1. It follows that $u = Av/\sigma_1$ is unique up to a complex sign. Finally since $A(\{v_1\}^\perp) \subseteq \{u_1\}^\perp$, by choosing o.n. bases of these two spaces, the uniqueness follows from induction on the dimension of the square matrix.
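Part (2) can also be watched numerically (my own illustration, random complex matrix with generically distinct singular values): right singular vectors computed by two different routes agree only up to a unimodular constant, exactly as the uniqueness statement predicts.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

_, s, Vh = np.linalg.svd(A)
assert np.all(np.diff(s) < 0)          # distinct sigma's (generic for random A)

# second route: eigenvectors of A*A are right singular vectors
w, Q = np.linalg.eigh(A.conj().T @ A)
Q = Q[:, ::-1]                         # descending order, to match Vh

for i in range(4):
    v1, v2 = Vh[i].conj(), Q[:, i]     # rows of Vh are the v_i^*
    c = np.vdot(v1, v2)                # the constant relating the two vectors
    assert np.isclose(abs(c), 1.0)     # it has modulus 1 ...
    assert np.allclose(v2, c * v1, atol=1e-6)  # ... and v2 = c * v1
```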