Quiz: Matrix Decompositions
Test your understanding of SVD, QR, and other matrix decompositions.
1. The Singular Value Decomposition (SVD) factors a matrix \(A\) as:
- A. \(A = U\Sigma V\)
- B. \(A = U\Sigma V^T\)
- C. \(A = \Sigma UV\)
- D. \(A = U + \Sigma + V\)
Show Answer
The correct answer is B. The SVD decomposes any \(m \times n\) matrix as \(A = U\Sigma V^T\), where \(U\) and \(V\) are orthogonal matrices and \(\Sigma\) is a diagonal matrix of singular values.
Concept Tested: Singular Value Decomposition
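A minimal NumPy sketch of this factorization; the example matrix below is arbitrary and chosen only for illustration.

```python
import numpy as np

# Verify A = U Σ V^T for a small arbitrary matrix.
A = np.array([[3.0, 1.0], [1.0, 3.0], [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)   # s holds the singular values
A_rebuilt = U @ np.diag(s) @ Vt
print(np.allclose(A, A_rebuilt))                   # True: the factors reproduce A
```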
2. Singular values of a matrix are always:
- A. Complex numbers
- B. Negative or zero
- C. Non-negative real numbers
- D. Equal to the eigenvalues
Show Answer
The correct answer is C. Singular values are always non-negative real numbers, typically arranged in decreasing order: \(\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0\). They are the square roots of eigenvalues of \(A^TA\).
Concept Tested: Singular Values
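A short NumPy check of both facts (non-negativity and the link to \(A^TA\)), again on an arbitrary matrix.

```python
import numpy as np

# Singular values are non-negative and equal the square roots of the eigenvalues of A^T A.
A = np.array([[2.0, 0.0], [1.0, 3.0]])
s = np.linalg.svd(A, compute_uv=False)              # returned sorted in decreasing order
eigvals = np.linalg.eigvalsh(A.T @ A)               # eigenvalues of the symmetric matrix A^T A
print(np.all(s >= 0))                               # True
print(np.allclose(np.sort(s**2), np.sort(eigvals))) # True
```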
3. The rank of a matrix equals:
- A. The number of rows
- B. The number of non-zero singular values
- C. The largest singular value
- D. The trace
Show Answer
The correct answer is B. The rank of a matrix equals the number of non-zero singular values. This provides a robust numerical way to determine rank, especially when using a tolerance for "effectively zero" values.
Concept Tested: Rank and SVD
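A sketch of rank counting with a tolerance; the matrix below is constructed to have rank 1.

```python
import numpy as np

# Count singular values above a small tolerance to estimate rank.
A = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])  # second column is twice the first: rank 1
s = np.linalg.svd(A, compute_uv=False)
tol = max(A.shape) * np.finfo(float).eps * s[0]     # tolerance in the spirit of NumPy's default
print(int(np.sum(s > tol)), np.linalg.matrix_rank(A))   # both report 1
```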
4. In the low-rank approximation via SVD, keeping only the top \(k\) singular values minimizes:
- A. The Frobenius norm of the error
- B. The rank of the matrix
- C. The number of computations
- D. The trace of the matrix
Show Answer
The correct answer is A. The truncated SVD gives the best rank-\(k\) approximation in both the Frobenius norm and the spectral norm. This is the Eckart-Young-Mirsky theorem.
Concept Tested: Low-Rank Approximation
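A NumPy sketch of the Eckart-Young-Mirsky statement on a random matrix; the sizes and the choice of \(k\) are arbitrary.

```python
import numpy as np

# Truncate the SVD to rank k and check the Frobenius error formula.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]          # keep only the top-k singular triplets
# The Frobenius error equals the square root of the sum of the discarded squared singular values.
print(np.isclose(np.linalg.norm(A - A_k, 'fro'), np.sqrt(np.sum(s[k:]**2))))
```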
5. QR decomposition factors a matrix as:
- A. \(A = QR\) where \(Q\) is orthogonal and \(R\) is upper triangular
- B. \(A = QR\) where \(Q\) is diagonal and \(R\) is symmetric
- C. \(A = R^TQ\)
- D. \(A = Q + R\)
Show Answer
The correct answer is A. QR decomposition expresses a matrix as \(A = QR\), where \(Q\) has orthonormal columns (an orthogonal matrix when square) and \(R\) is upper triangular. It is used for solving least-squares problems and for computing eigenvalues via the QR algorithm.
Concept Tested: QR Decomposition
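A sketch of a QR-based least-squares solve in NumPy; the data are made up, and the matrix is assumed to have full column rank.

```python
import numpy as np

# Solve min ||Ax - b|| via QR: A = QR, then R x = Q^T b by a triangular solve.
A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
Q, R = np.linalg.qr(A)                   # Q has orthonormal columns, R is upper triangular
x = np.linalg.solve(R, Q.T @ b)
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # matches NumPy's least-squares solver
```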
6. The Gram-Schmidt process produces:
- A. A diagonal matrix
- B. An orthonormal basis from a set of vectors
- C. The inverse of a matrix
- D. The determinant
Show Answer
The correct answer is B. The Gram-Schmidt process takes a set of linearly independent vectors and produces an orthonormal set spanning the same space. It is the foundation of QR decomposition.
Concept Tested: Gram-Schmidt Process
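A minimal classical Gram-Schmidt sketch; it assumes linearly independent columns and is not numerically robust, so production code would use a modified variant or `np.linalg.qr`.

```python
import numpy as np

def gram_schmidt(A):
    """Return a matrix whose columns are an orthonormal basis for the column space of A."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # subtract projections onto earlier basis vectors
        Q[:, j] = v / np.linalg.norm(v)          # normalize (independence keeps the norm nonzero)
    return Q

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(A.shape[1])))  # columns are orthonormal
```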
7. Cholesky decomposition applies to:
- A. Any square matrix
- B. Positive definite symmetric matrices
- C. Singular matrices only
- D. Rectangular matrices
Show Answer
The correct answer is B. Cholesky decomposition factors a symmetric positive definite matrix as \(A = LL^T\), where \(L\) is lower triangular. For matrices where it applies, it requires roughly half the operations of LU decomposition.
Concept Tested: Cholesky Decomposition
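A sketch with an arbitrary symmetric positive definite matrix.

```python
import numpy as np

# Cholesky factorization A = L L^T for a symmetric positive definite matrix.
A = np.array([[4.0, 2.0], [2.0, 3.0]])
L = np.linalg.cholesky(A)        # lower-triangular factor
print(np.allclose(L @ L.T, A))   # True
# np.linalg.cholesky raises LinAlgError if the matrix is not positive definite.
```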
8. The left singular vectors (columns of \(U\) in SVD) are eigenvectors of:
- A. \(A\)
- B. \(A^TA\)
- C. \(AA^T\)
- D. \(A + A^T\)
Show Answer
The correct answer is C. The left singular vectors (columns of \(U\)) are eigenvectors of \(AA^T\), while the right singular vectors (columns of \(V\)) are eigenvectors of \(A^TA\). The squared singular values are the eigenvalues.
Concept Tested: SVD and Eigenvalues
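A quick NumPy check of this relationship on an arbitrary 3×2 matrix, comparing the squared singular values with the eigenvalues of \(AA^T\).

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
eig_AAt = np.linalg.eigvalsh(A @ A.T)                 # eigenvalues of A A^T, in ascending order
# The largest eigenvalues of A A^T are the squared singular values (the rest are zero).
print(np.allclose(np.sort(s**2), np.sort(eig_AAt)[-len(s):]))
```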
9. The pseudoinverse \(A^+\) computed via SVD satisfies:
- A. \(AA^+ = I\) always
- B. \(AA^+A = A\)
- C. \(A^+ = A^T\)
- D. \(A^+ = A^{-1}\) always
Show Answer
The correct answer is B. The Moore-Penrose pseudoinverse satisfies \(AA^+A = A\) (one of the four Penrose conditions). It generalizes the inverse to non-square and singular matrices and yields the minimum-norm least-squares solution of \(Ax = b\).
Concept Tested: Pseudoinverse
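A sketch of the Penrose identity on a deliberately singular matrix.

```python
import numpy as np

# The pseudoinverse exists even when the ordinary inverse does not.
A = np.array([[1.0, 2.0], [2.0, 4.0]])            # rank 1, so A has no inverse
A_pinv = np.linalg.pinv(A)                        # computed internally via the SVD
print(np.allclose(A @ A_pinv @ A, A))             # A A+ A = A
print(np.allclose(A_pinv @ A @ A_pinv, A_pinv))   # another Penrose condition
```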
10. Which decomposition is most useful for image compression?
- A. LU decomposition
- B. QR decomposition
- C. SVD with truncation
- D. Cholesky decomposition
Show Answer
The correct answer is C. Truncated SVD is ideal for image compression because it provides the optimal low-rank approximation. Keeping only the largest singular values captures the most important image features while reducing storage.
Concept Tested: SVD Applications
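A sketch of the idea using a random array in place of a real image; the size and the value of \(k\) are arbitrary.

```python
import numpy as np

# Keep the top-k singular triplets of a 2-D array and compare storage costs.
rng = np.random.default_rng(1)
img = rng.random((64, 64))                           # stands in for a grayscale image
U, s, Vt = np.linalg.svd(img, full_matrices=False)
k = 10
img_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]        # rank-k approximation of the image
stored = k * (U.shape[0] + Vt.shape[1] + 1)          # numbers kept: k left vectors, k right vectors, k values
print(stored, img.size)                              # 1290 vs 4096 entries for this example
```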