MA4248 Computational Linear Algebra

Development of algorithms for matrix computations. Rounding errors and introduction to stability analysis. Stable algorithms for solving systems of linear equations, linear least squares problems, and eigenvalue problems. Iterative methods for linear systems. Structured problems from applications in various disciplines.

Prerequisite

MA3046 or consent of the instructor, and advanced MATLAB programming

Lecture Hours

4

Lab Hours

1

Course Learning Outcomes

After successfully completing this course, the student will be able to:

  • Understand the difference between sparse and dense matrices and why exploiting matrix sparsity is critical for solving large-scale problems.
  • Know the differences between row- and column-major storage (for dense matrices) and between COO, CSR, and CSC formats (for sparse matrices); a small storage sketch follows this list.
  • Describe minimum-degree, (reverse) Cuthill-McKee, and nested dissection orderings, the differences between them, and the scenarios in which each is likely to be most effective.
  • Describe the difference between direct and iterative methods for sparse linear systems and when each class of methods is most useful.
  • Understand the basic mathematical framework that governs the convergence of these iterative methods, including the concepts of iteration matrices, spectral radii, matrix norms, and diagonal dominance (a short Jacobi sketch follows this list).
  • Possess a basic knowledge of the ideas behind multigrid, including the concepts of residual equations, prolongation and restriction, coarse-grid operators, smoothing, and multigrid cycles. Know how multigrid arises from the classical iterations via the idea of “smoothing the error.”
  • Understand the idea of seeking approximate solutions to a system of linear equations from a chosen subspace and the mathematical concepts used to describe this idea, including test and trial spaces, Galerkin and Petrov–Galerkin formulations, and orthogonal and oblique projections.
  • Define Krylov subspaces and explain why they are a natural and useful class of subspaces in which to seek approximate solutions to linear algebra problems, from both theoretical and practical perspectives (the definition is restated after this list).
  • Identify the different projections that define/characterize FOM, CG, MINRES, GMRES, and BiCG.
  • Describe how CG relates to polynomial approximation and know the basic CG convergence theorem (stated after this list).
  • Describe how GMRES relates to polynomial approximation and explain why its convergence properties are subtler than those of CG.
  • Define preconditioning and explain why it is necessary. Know the differences between left, right, and two-sided preconditioning; the three variants are written out after this list.
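
As an illustration of the sparse storage formats named above, the following MATLAB sketch (a minimal example, not part of the course materials; the matrix size and entries are chosen arbitrarily) assembles a tridiagonal matrix from coordinate (COO) triplets with sparse and compares its memory footprint with a dense copy. MATLAB itself stores sparse matrices in a compressed-column (CSC-style) format.

    % Coordinate (COO) description of a 1000-by-1000 tridiagonal matrix:
    % row indices, column indices, and values of the nonzero entries only.
    n = 1000;
    i = [1:n, 1:n-1, 2:n];                           % row indices
    j = [1:n, 2:n,   1:n-1];                         % column indices
    v = [2*ones(1,n), -ones(1,n-1), -ones(1,n-1)];   % nonzero values

    A_sparse = sparse(i, j, v, n, n);   % assembled from the COO triplets
    A_dense  = full(A_sparse);          % the same matrix stored densely

    fprintf('nonzeros: %d of %d entries\n', nnz(A_sparse), n^2);
    whos A_sparse A_dense               % compare the memory use of the two formats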
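A minimal MATLAB sketch of the convergence framework for classical iterations, using Jacobi on an assumed strictly diagonally dominant tridiagonal model problem: the iteration matrix is M = I - inv(D)*A, and the iteration converges for every starting vector exactly when the spectral radius of M is below 1.

    % Classical (Jacobi) iteration for A*x = b, written as x_{k+1} = M*x_k + c.
    n = 50;
    A = full(gallery('tridiag', n, -1, 4, -1));   % strictly diagonally dominant model problem
    b = ones(n, 1);

    D = diag(diag(A));          % diagonal part of A
    M = eye(n) - D \ A;         % Jacobi iteration matrix M = I - inv(D)*A
    c = D \ b;

    rho = max(abs(eig(M)));     % spectral radius; convergence iff rho < 1
    fprintf('spectral radius of the Jacobi iteration matrix: %.3f\n', rho);

    x = zeros(n, 1);
    for k = 1:100               % fixed-point iteration x_{k+1} = M*x_k + c
        x = M * x + c;
    end
    fprintf('residual norm after 100 iterations: %.2e\n', norm(b - A*x));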
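For reference, the k-th Krylov subspace generated by A and the initial residual r_0 = b - A x_0 (standard notation, not fixed by the catalog text) is

    \[
    \mathcal{K}_k(A, r_0) = \operatorname{span}\{\, r_0,\; A r_0,\; A^2 r_0,\; \dots,\; A^{k-1} r_0 \,\}.
    \]

Each enlargement of this subspace requires only a matrix-vector product with A, which is one practical reason these subspaces suit large sparse problems.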
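The basic CG convergence theorem referred to above is usually stated as the following bound, with x_* the exact solution, x_k the k-th CG iterate, and kappa the 2-norm condition number of the symmetric positive definite matrix A (notation assumed here):

    \[
    \| x_* - x_k \|_A \;\le\; 2 \left( \frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1} \right)^{k} \| x_* - x_0 \|_A .
    \]

The bound comes from minimizing the A-norm of the error over polynomials of degree k that equal 1 at the origin, estimated with Chebyshev polynomials on an interval containing the spectrum; this is the link to polynomial approximation mentioned in the outcome.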
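Written out with an assumed nonsingular preconditioner M (and a split M = M_1 M_2 for the two-sided case), the three preconditioning variants transform A x = b as follows:

    \begin{align*}
    \text{left:} \quad & M^{-1} A \, x = M^{-1} b, \\
    \text{right:} \quad & A M^{-1} y = b, \qquad x = M^{-1} y, \\
    \text{two-sided } (M = M_1 M_2): \quad & M_1^{-1} A M_2^{-1} y = M_1^{-1} b, \qquad x = M_2^{-1} y.
    \end{align*}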