Quiz: Eigenvalues and Eigenvectors
Test your understanding of eigenvalues, eigenvectors, and their applications.
1. An eigenvector \(\mathbf{v}\) of matrix \(A\) satisfies:
- \(A\mathbf{v} = \mathbf{0}\)
- \(A\mathbf{v} = \lambda\mathbf{v}\) for some scalar \(\lambda\)
- \(A + \mathbf{v} = \lambda\)
- \(\mathbf{v}^T A = \lambda\)
Show Answer
The correct answer is B. An eigenvector \(\mathbf{v}\) of matrix \(A\) is a nonzero vector satisfying \(A\mathbf{v} = \lambda\mathbf{v}\), where \(\lambda\) is the corresponding eigenvalue. The matrix simply scales the eigenvector by \(\lambda\) (possibly reversing it) rather than rotating it off its line.
Concept Tested: Eigenvector
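As a quick numerical check of the defining relation (the matrix and vector here are illustrative values, not from the quiz):

```python
import numpy as np

# A diagonal matrix makes the eigenstructure obvious: each standard
# basis vector is an eigenvector with the matching diagonal entry.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])   # an eigenvector of A
lam = 2.0                  # its eigenvalue

# A merely scales v: A v equals lam * v.
print(np.allclose(A @ v, lam * v))  # → True
```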
2. The characteristic polynomial of matrix \(A\) is:
- \(\det(A)\)
- \(\det(A - \lambda I)\)
- \(\det(A + \lambda I)\)
- \(\text{tr}(A) - \lambda\)
Show Answer
The correct answer is B. The characteristic polynomial is \(p(\lambda) = \det(A - \lambda I)\). Setting this equal to zero gives the characteristic equation, whose roots are the eigenvalues.
Concept Tested: Characteristic Polynomial
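A small sketch connecting the two views (the \(2 \times 2\) matrix is an illustrative choice): for a \(2 \times 2\) matrix, \(\det(A - \lambda I) = \lambda^2 - \operatorname{tr}(A)\lambda + \det(A)\), and its roots should match the eigenvalues NumPy computes directly.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic polynomial of a 2x2 matrix: λ² - tr(A)λ + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(coeffs))

eigvals = np.sort(np.linalg.eigvals(A))
print(np.allclose(roots, eigvals))  # roots of det(A - λI) are the eigenvalues
```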
3. The eigenvalues of a symmetric matrix are always:
- Complex numbers
- Real numbers
- Positive numbers
- Zero
Show Answer
The correct answer is B. A key property of symmetric matrices is that all eigenvalues are real numbers. Additionally, eigenvectors corresponding to distinct eigenvalues are orthogonal.
Concept Tested: Eigenvalues of Symmetric Matrices
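Both properties can be checked numerically with `numpy.linalg.eigh`, which is designed for symmetric matrices and returns real eigenvalues (the matrix below is an illustrative example):

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eigh(S)   # eigh exploits symmetry; vals is real
print(vals)                      # real eigenvalues, in ascending order

# Eigenvectors for the two distinct eigenvalues are orthogonal.
print(np.isclose(vecs[:, 0] @ vecs[:, 1], 0.0))
```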
4. The sum of all eigenvalues of a matrix equals:
- The determinant
- The trace
- The rank
- Zero
Show Answer
The correct answer is B. The sum of all eigenvalues (counting multiplicities) equals the trace of the matrix. Similarly, the product of all eigenvalues equals the determinant.
Concept Tested: Eigenvalue Properties
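Both identities are easy to verify numerically (illustrative matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
vals = np.linalg.eigvals(A)

print(np.isclose(vals.sum(), np.trace(A)))        # sum of eigenvalues = trace
print(np.isclose(vals.prod(), np.linalg.det(A)))  # product = determinant
```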
5. If \(\lambda\) is an eigenvalue of \(A\), then the eigenvalue of \(A^2\) for the same eigenvector is:
- \(\lambda\)
- \(2\lambda\)
- \(\lambda^2\)
- \(\sqrt{\lambda}\)
Show Answer
The correct answer is C. If \(A\mathbf{v} = \lambda\mathbf{v}\), then \(A^2\mathbf{v} = A(A\mathbf{v}) = A(\lambda\mathbf{v}) = \lambda(A\mathbf{v}) = \lambda^2\mathbf{v}\). Raising a matrix to a power raises its eigenvalues to the same power.
Concept Tested: Powers of Matrices
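The derivation above can be confirmed numerically: an eigenvector of \(A\) is also an eigenvector of \(A^2\), with eigenvalue \(\lambda^2\) (illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
vals, vecs = np.linalg.eig(A)

# Same eigenvector, squared eigenvalue under A².
v, lam = vecs[:, 0], vals[0]
print(np.allclose((A @ A) @ v, lam**2 * v))
```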
6. Eigendecomposition of matrix \(A\) (when possible) is:
- \(A = V + D\)
- \(A = VDV^{-1}\)
- \(A = V^T D V\)
- \(A = D - V\)
Show Answer
The correct answer is B. Eigendecomposition expresses a matrix as \(A = VDV^{-1}\), where \(V\) is the matrix of eigenvectors and \(D\) is a diagonal matrix of eigenvalues. For symmetric matrices, \(V\) can be chosen orthogonal, so \(A = VDV^T\).
Concept Tested: Eigendecomposition
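A minimal reconstruction check (the matrix is an illustrative one with distinct eigenvalues, so \(V\) is invertible):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, V = np.linalg.eig(A)
D = np.diag(vals)

# Rebuild A from its eigendecomposition A = V D V⁻¹.
print(np.allclose(V @ D @ np.linalg.inv(V), A))
```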
7. A matrix is diagonalizable if:
- It is symmetric
- It has \(n\) linearly independent eigenvectors
- All eigenvalues are positive
- It is invertible
Show Answer
The correct answer is B. An \(n \times n\) matrix is diagonalizable if and only if it has \(n\) linearly independent eigenvectors. Symmetric matrices always satisfy this condition, but symmetry is sufficient, not necessary.
Concept Tested: Diagonalizability
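A classic counterexample is a Jordan block: its eigenvalue has algebraic multiplicity 2 but only one independent eigenvector, so no eigenvector basis exists. This can be observed numerically (the near-zero determinant of the eigenvector matrix shows its columns are linearly dependent):

```python
import numpy as np

# Jordan block: eigenvalue 1 repeated, but only one eigenvector direction.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
vals, vecs = np.linalg.eig(J)

# The eigenvector matrix is singular, so J is not diagonalizable.
print(abs(np.linalg.det(vecs)) < 1e-8)
```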
8. The spectral theorem states that for a real symmetric matrix:
- All eigenvalues are complex
- The matrix has an orthonormal basis of eigenvectors
- The matrix is not diagonalizable
- Eigenvalues form a spectrum of colors
Show Answer
The correct answer is B. The spectral theorem guarantees that real symmetric matrices have real eigenvalues and an orthonormal basis of eigenvectors. This enables the decomposition \(A = Q\Lambda Q^T\) where \(Q\) is orthogonal.
Concept Tested: Spectral Theorem
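Both parts of the theorem, orthonormality of \(Q\) and the decomposition \(A = Q\Lambda Q^T\), can be checked with `numpy.linalg.eigh` (illustrative symmetric matrix):

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, Q = np.linalg.eigh(S)   # columns of Q are orthonormal eigenvectors

print(np.allclose(Q.T @ Q, np.eye(2)))          # Q is orthogonal
print(np.allclose(Q @ np.diag(vals) @ Q.T, S))  # A = QΛQᵀ
```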
9. Why are eigenvalues important in stability analysis?
- They determine the color of the system
- Negative real parts indicate stable equilibria
- They measure the matrix size
- They count the number of solutions
Show Answer
The correct answer is B. In dynamical systems \(\dot{\mathbf{x}} = A\mathbf{x}\), eigenvalues determine stability. If all eigenvalues have negative real parts, perturbations decay over time and the system is stable. Positive real parts indicate instability.
Concept Tested: Stability Analysis
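As a sketch, consider a damped oscillator written as \(\dot{\mathbf{x}} = A\mathbf{x}\) (the system matrix below is an illustrative example): both eigenvalues have negative real parts, so the origin is a stable equilibrium.

```python
import numpy as np

# Damped oscillator x'' + x' + 2x = 0 in first-order form.
A = np.array([[0.0, 1.0],
              [-2.0, -1.0]])
vals = np.linalg.eigvals(A)

# Eigenvalues are -0.5 ± i√7/2: negative real parts, so perturbations decay.
print(np.all(vals.real < 0))
```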
10. In Principal Component Analysis (PCA), eigenvectors of the covariance matrix represent:
- Random noise directions
- Directions of maximum variance in the data
- The mean of the data
- Outlier locations
Show Answer
The correct answer is B. In PCA, eigenvectors of the covariance matrix (principal components) point in directions of maximum variance. The corresponding eigenvalues indicate how much variance is captured by each direction.
Concept Tested: Principal Component Analysis
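A minimal PCA sketch on synthetic data (the data and its stretch factors are illustrative): after stretching 2-D noise along the x-axis, the top eigenvector of the covariance matrix should point roughly along that axis.

```python
import numpy as np

# Toy 2-D data with much more variance along x than along y.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0],
                                          [0.0, 0.5]])

C = np.cov(X, rowvar=False)      # 2x2 covariance matrix
vals, vecs = np.linalg.eigh(C)   # eigenvalues in ascending order

# The eigenvector with the largest eigenvalue is the first principal
# component: here it aligns (up to sign) with the high-variance x-axis.
pc1 = vecs[:, -1]
print(abs(pc1[0]) > abs(pc1[1]))
```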