Linear Algebra 2

This is the courseware for MATH 235: Linear Algebra 2 for Honours Mathematics at the University of Waterloo.

Topics include orthogonal and unitary matrices and transformations; orthogonal projections; the Gram-Schmidt procedure; and best approximations and the method of least squares. Inner products; angles and orthogonality; orthogonal diagonalization; singular value decomposition; and other applications will also be explored.
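As a brief preview of the Gram-Schmidt procedure named above, one orthogonalization step can be sketched as follows (the notation w_k for the original vectors and v_k for the orthogonal ones is only an illustrative choice; the procedure is developed carefully later in the courseware):

\[
v_k = w_k - \sum_{i=1}^{k-1} \frac{\langle w_k, v_i \rangle}{\| v_i \|^2}\, v_i .
\]

Each step subtracts from w_k its projection onto the vectors already produced, leaving a vector orthogonal to all of them.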

Fundamental Subspaces

In this module, we will look at the fundamental subspaces of a matrix and of a linear mapping, and prove some useful results. Part of the purpose of this module is to help review and recall many of the concepts from Linear Algebra I that are needed for this course.
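As a rough preview (the notation Col, Row, and Null is assumed here and is defined precisely in the module), the four fundamental subspaces of an m × n matrix A are

\[
\operatorname{Col}(A), \qquad \operatorname{Row}(A), \qquad \operatorname{Null}(A), \qquad \operatorname{Null}(A^T),
\]

and their dimensions are tied together by \(\operatorname{rank}(A) = \dim \operatorname{Col}(A) = \dim \operatorname{Row}(A)\) and \(\operatorname{rank}(A) + \dim \operatorname{Null}(A) = n\).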

Linear Mappings

In this module, we will extend the concept of a linear mapping from R^n to R^m to linear mappings from a vector space V to a vector space W. It may be helpful to briefly review linear mappings from R^n to R^m, including the matrix of a linear mapping and diagonalization.
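As one small illustrative example (the choice of polynomial spaces here is just an assumption for this sketch), the differentiation map

\[
D : P_2(\mathbb{R}) \to P_1(\mathbb{R}), \qquad D(a + bx + cx^2) = b + 2cx,
\]

is a linear mapping between vector spaces that are not of the form R^n, and with respect to the standard bases \(\{1, x, x^2\}\) and \(\{1, x\}\) its matrix is

\[
[D] = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 2 \end{bmatrix}.
\]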

Inner Products

Previously, we studied the important concepts of length, orthogonality, and projections. Since these concepts are so useful in R^n, in this module we will generalize all of them to general vector spaces. To do this, we will first need to generalize the dot product on R^n to the concept of an inner product on a general vector space. This will lead us to an extremely important theorem in linear algebra. Finally, we will use the theory developed in this module to look at a real-world application: the method of least squares.
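To give a flavour of what is ahead (the particular space and formulas below are only one standard example, chosen for this sketch), an inner product on the space C[0,1] of continuous functions on [0,1] can be defined by

\[
\langle f, g \rangle = \int_0^1 f(x)\, g(x)\, dx,
\]

and for an inconsistent system Ax = b the method of least squares leads to the normal equations

\[
A^T A x = A^T b,
\]

whose solutions minimize \(\| Ax - b \|\).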

Applications of Orthogonal Matrices

In this module, we will use the theory we have previously developed to extend the idea of diagonalization to something even better: orthogonal diagonalization. This will lead us to the extremely useful and important topic of quadratic forms. Finally, using all of the theory we have developed, we will look at how to mimic diagonalization for non-square matrices.
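In symbols, stated informally here (the precise hypotheses appear in the module), orthogonal diagonalization writes a symmetric matrix A as

\[
A = P D P^T, \qquad P^T P = I, \qquad D \text{ diagonal},
\]

a quadratic form is an expression of the form \(Q(x) = x^T A x\), and the analogue of diagonalization for an m × n matrix is the singular value decomposition

\[
A = U \Sigma V^T,
\]

with U and V orthogonal and \(\Sigma\) an m × n matrix whose only nonzero entries lie on its main diagonal.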

Complex Vector Spaces

In this module, we will revisit many of the concepts covered previously, now allowing the use of complex numbers. We will see that much of the theory remains the same, but there will be some differences. Moreover, we will see that many of the computations will now be a little more complex.
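As one small illustration of the kind of change to expect (the convention of conjugating the second argument is an assumption here; conventions vary and the module fixes one), the standard inner product on C^n is

\[
\langle z, w \rangle = \sum_{j=1}^{n} z_j \overline{w_j},
\]

so that \(\langle z, z \rangle = \sum_{j=1}^{n} |z_j|^2\) is real and non-negative, and the role of the transpose A^T is taken over by the conjugate transpose \(A^* = \overline{A}^T\).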