Jose's Linear Algebra Times 2
Across
- 1. A method to find the best approximate solution to an overdetermined system.
- 3. A matrix G satisfying AGA = A and GAG = G.
- 7. Gauss-Seidel method with relaxation to speed up convergence.
- 9. A square matrix whose rows and columns are orthonormal, satisfying QQ^T = I.
- 11. Decomposition of a matrix into UΣVᵀ, where U and V are orthogonal, and Σ is diagonal.
- 13. Scalar λ such that Av = λv for some nonzero vector v (the eigenvector).
- 15. SVD where only nonzero singular values and their corresponding vectors are considered.
- 16. Algorithm to compute eigenvalues by repeated QR decomposition and matrix multiplication.
- 18. Iterative solvers like GMRES or CG based on the subspace span{b, Ab, A^2 b, ...}.
- 19. Solving problems at multiple levels of resolution to accelerate convergence.
- 20. Iterative algorithm using previously computed results within the same iteration (a minimal sketch follows these Across clues).
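To make Across 20 concrete, here is a minimal sketch of a sweep that reuses components already updated within the same iteration, written in Python with NumPy. The function name `gauss_seidel` and the parameters `tol` and `max_iter` are illustrative assumptions, not part of the puzzle or any particular library.

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=500):
    """Sketch of a sweep that updates x component by component, reusing the
    entries of x refreshed earlier in the same iteration (Across 20)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Newest values x[:i] plus the previous sweep's values x_old[i+1:].
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x_old[i+1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# Small diagonally dominant test system; the result should approach
# np.linalg.solve(A, b).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
print(gauss_seidel(A, b))
```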
Down
- 2. Square roots of the eigenvalues of A^T A for a matrix A.
- 4. Decomposition of a matrix into an orthogonal matrix (Q) and an upper triangular matrix (R).
- 5. R(x) = (x^T A x)/(x^T x), an estimate for an eigenvalue.
- 6. A partial differential equation, Δu = f, common in physical systems.
- 8. Simplified matrix form (e.g., diagonal or Jordan) used to represent a matrix.
- 10. Iterative algorithm solving diagonally dominant linear systems by splitting A = D + R (diagonal plus remainder).
- 11. A matrix A where A = A^T; when real, its eigenvalues are always real.
- 12. Ratio of the largest to smallest singular value, indicating sensitivity to perturbations.
- 14. Iterative method to find the eigenvalue of largest magnitude of a matrix (see the sketch after these Down clues).
- 17. Iterative method to find an eigenvalue close to a given guess.
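For Down 14, combined with the estimate from Down 5, here is a minimal Python/NumPy sketch: repeated multiplication by A drives a vector toward the dominant eigenvector, and the Rayleigh quotient supplies the eigenvalue estimate. The name `power_iteration` and the parameters `num_iters` and `seed` are illustrative assumptions.

```python
import numpy as np

def power_iteration(A, num_iters=200, seed=0):
    """Repeatedly multiply by A and renormalize (Down 14); return the
    Rayleigh quotient R(x) = (x^T A x)/(x^T x) as the eigenvalue estimate
    (Down 5) together with the current vector."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(num_iters):
        x = A @ x
        x /= np.linalg.norm(x)  # renormalize to avoid overflow/underflow
    return (x @ A @ x) / (x @ x), x

# Symmetric test matrix (Down 11), so the eigenvalues are real; the dominant
# eigenvalue of this A is (5 + sqrt(5))/2 ≈ 3.618.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, v = power_iteration(A)
print(lam)
```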