Jose's Linear Algebra Times 2

[Crossword grid with squares numbered 1–20]
Across
  1. A method to find the best approximate solution to an overdetermined system.
  3. A matrix G satisfying AGA = A and GAG = G.
  7. Gauss-Seidel method with relaxation to speed up convergence.
  9. A square matrix whose rows and columns are orthonormal, satisfying Q^T Q = QQ^T = I.
  11. Decomposition of a matrix into UΣV^T, where U and V are orthogonal and Σ is diagonal.
  13. Scalar λ such that Av = λv, where v is a nonzero eigenvector.
  15. SVD in which only the nonzero singular values and their corresponding vectors are kept.
  16. Algorithm that computes eigenvalues by repeatedly factoring A = QR and forming RQ.
  18. Iterative solvers such as GMRES or CG built on the subspace span{b, Ab, A^2 b, ...}.
  19. Solving a problem at multiple levels of resolution to accelerate convergence.
  20. Iterative algorithm that uses previously computed results within the same iteration.
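As a concrete illustration of the least-squares clue (1 Across), here is a minimal NumPy sketch; the matrix and right-hand side are made-up examples, and the normal equations A^T A x = A^T b are just one way to compute the answer:

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns -- no exact solution in general.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Least squares: minimize ||Ax - b||_2 by solving the normal equations.
x = np.linalg.solve(A.T @ A, A.T @ b)

# The residual of the least-squares solution is orthogonal to the columns of A.
residual = b - A @ x
print(np.allclose(A.T @ residual, 0.0))
```

In practice `np.linalg.lstsq(A, b)` (which uses an SVD-based solver) is preferred over forming A^T A, since the normal equations square the condition number (12 Down).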
Down
  2. Square roots of the eigenvalues of A^T A for a matrix A.
  4. Decomposition of a matrix into an orthogonal matrix Q and an upper triangular matrix R.
  5. R(x) = (x^T Ax)/(x^T x), an estimate for an eigenvalue.
  6. A partial differential equation, Δu = f, common in physical systems.
  8. Simplified matrix form (e.g., diagonal or Jordan) used to represent a matrix.
  10. Iterative algorithm that solves diagonally dominant linear systems by splitting A = D + R.
  11. A matrix A with A = A^T, which always has real eigenvalues.
  12. Ratio of the largest to smallest singular value, indicating sensitivity to perturbations.
  14. Iterative method to find the largest-magnitude eigenvalue of a matrix.
  17. Iterative method to find an eigenvalue close to a given guess.
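The power-method clue (14 Down) and the Rayleigh-quotient clue (5 Down) combine naturally; this is a minimal sketch on a made-up symmetric 2x2 matrix, with a fixed iteration count rather than a real convergence test:

```python
import numpy as np

def power_iteration(A, iters=200):
    """Power method (14 Down): repeatedly apply A and renormalize, so the
    iterate converges to the eigenvector of the largest-magnitude eigenvalue."""
    v = np.ones(A.shape[0])
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)
    # Rayleigh quotient (5 Down) of the unit vector v estimates the eigenvalue.
    return v @ A @ v

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
# Eigenvalues of A are (5 ± sqrt(5)) / 2; the method picks out the larger one.
print(round(power_iteration(A), 6))
```

Shifting and inverting, i.e. applying (A - σI)^{-1} in place of A, turns this into the inverse-iteration idea of 17 Down: convergence is then to the eigenvalue nearest the guess σ.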