## Chapter M: Matrices

We have made frequent use of matrices for solving systems of equations, and we have begun to investigate a few of their properties, such as the null space and nonsingularity. In this chapter, we will take a more systematic approach to the study of matrices.

## Annotated Acronyms M.

### Theorem VSPM.

These are the fundamental rules for working with the addition and scalar multiplication of matrices. We saw something very similar in the previous chapter (Theorem VSPCV). Together, these two theorems will provide the model for the key definition of this course, Definition VS.
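A quick sanity check of two of these rules, using numpy with arbitrary (hypothetical) matrices; this is an illustration, not a proof.

```python
import numpy as np

# Two arbitrary 2x3 matrices and a scalar, to spot-check a couple of
# the properties listed in Theorem VSPM.
A = np.array([[1, 2, 3], [4, 5, 6]])
B = np.array([[7, 8, 9], [0, 1, 2]])
alpha = 3

# Commutativity of matrix addition: A + B = B + A
assert np.array_equal(A + B, B + A)

# Distributivity across matrix addition: alpha(A + B) = alpha A + alpha B
assert np.array_equal(alpha * (A + B), alpha * A + alpha * B)
```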

### Theorem SLEMM.

Theorem SLSLC connected linear combinations with systems of equations. Theorem SLEMM connects the matrix-vector product (Definition MVP) and column vector equality (Definition CVE) with systems of equations. We will see this one regularly.
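A numerical sketch of the connection, with hypothetical numbers: a solution of the system is exactly a vector whose matrix-vector product with the coefficient matrix equals the vector of constants.

```python
import numpy as np

# Coefficient matrix and vector of constants for a small nonsingular
# system (hypothetical entries).
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])

# Solve the linear system.
x = np.linalg.solve(A, b)   # x is [1., 3.]

# Theorem SLEMM: x is a solution exactly when the matrix-vector
# product A x equals b (entrywise equality, Definition CVE).
assert np.allclose(A @ x, b)
```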

### Theorem EMP.

This theorem is a workhorse in Section MM and will continue to make regular appearances. If you want to get better at formulating proofs, the application of this theorem can be a key step in many of the arguments to come, so practice using it. While it might be hard to imagine Theorem EMP as a definition of matrix multiplication, we will see in Exercise MR.T80 that, in theory, it is actually a better definition of matrix multiplication in the long run.
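The entry-by-entry formula of Theorem EMP can be checked directly against a full matrix product; here is a small sketch with hypothetical matrices.

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])   # 3x2
B = np.array([[7, 8, 9], [1, 0, 2]])     # 2x3

# Theorem EMP: the (i, j) entry of AB is the sum over k of
# A[i, k] * B[k, j].
def entry_of_product(A, B, i, j):
    return sum(A[i, k] * B[k, j] for k in range(A.shape[1]))

# Every entry computed by the formula matches the full product.
AB = A @ B
for i in range(AB.shape[0]):
    for j in range(AB.shape[1]):
        assert entry_of_product(A, B, i, j) == AB[i, j]
```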

### Theorem CINM.

The inverse of a matrix is key. Here is how you can get one if you know how to row-reduce.
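A sketch of the procedure using sympy, with a hypothetical nonsingular matrix: row-reduce the matrix augmented by the identity, and read the inverse off the right half.

```python
from sympy import Matrix, eye

# A small nonsingular matrix (hypothetical entries, determinant -1).
A = Matrix([[1, 2], [3, 5]])

# Theorem CINM: row-reduce the augmented matrix [A | I]; when A is
# nonsingular the left half becomes I and the right half is A inverse.
augmented = A.row_join(eye(2))
rref, _ = augmented.rref()

left, right = rref[:, :2], rref[:, 2:]
assert left == eye(2)      # A is nonsingular, so the left half is I
assert right == A.inv()    # and the right half is the inverse
```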

### Theorem NPNT.

This theorem is a fundamental tool for proving subsequent important theorems, such as Theorem NI. It may also be the best explanation for the term “nonsingular.”

### Theorem NI.

“Nonsingularity” or “invertibility”? Pick your favorite, or show your versatility by using one or the other in the right context. They mean the same thing.

### Theorem CSCS.

Given a coefficient matrix, which vectors of constants create consistent systems? This theorem tells us that the answer is exactly those column vectors in the column space. Conversely, we also use this theorem to test for membership in the column space by checking the consistency of the appropriate system of equations.
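The membership test can be sketched in sympy (hypothetical matrix, whose second column is twice the first, so its column space is a line): consistency is equivalent to the augmented matrix having the same rank as the coefficient matrix.

```python
from sympy import Matrix

# Coefficient matrix whose column space we probe (hypothetical entries).
A = Matrix([[1, 2], [2, 4]])

def in_column_space(A, b):
    # Theorem CSCS: b is in C(A) exactly when the system LS(A, b) is
    # consistent, i.e. augmenting with b does not raise the rank.
    return A.row_join(b).rank() == A.rank()

assert in_column_space(A, Matrix([3, 6]))       # on the line: consistent
assert not in_column_space(A, Matrix([1, 0]))   # off the line: not
```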

### Theorem BCS.

Another theorem that provides a linearly independent set of vectors whose span equals some set of interest (a column space this time).
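A sketch with a hypothetical rank-2 matrix: row-reduce to locate the pivot columns, then take those columns of the *original* matrix as the basis of the column space.

```python
from sympy import Matrix

# Hypothetical 3x3 matrix; its second column is twice the first.
A = Matrix([[1, 2, 0], [2, 4, 1], [3, 6, 1]])

# Theorem BCS: the pivot columns of A (located by row-reducing A) form
# a linearly independent set whose span is the column space.
_, pivots = A.rref()
basis = [A.col(j) for j in pivots]

# sympy's columnspace() returns exactly these pivot columns.
assert basis == A.columnspace()
assert len(basis) == A.rank()
```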

### Theorem BRS.

Yet another theorem that provides a linearly independent set of vectors whose span equals some set of interest (a row space).
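Analogously, a sketch with a hypothetical matrix: the nonzero rows of the reduced row-echelon form are a basis of the row space.

```python
from sympy import Matrix

# Hypothetical 3x3 matrix; its second row is twice the first.
A = Matrix([[1, 2, 1], [2, 4, 2], [1, 2, 2]])

# Theorem BRS: the nonzero rows of the reduced row-echelon form of A
# are a linearly independent set spanning the row space of A.
rref, pivots = A.rref()
basis = [rref.row(i) for i in range(len(pivots))]  # nonzero rows first

# The basis has rank-many vectors, and stacking it under A does not
# increase the rank: the basis rows lie in the row space of A.
assert len(basis) == A.rank()
assert Matrix.vstack(A, *basis).rank() == A.rank()
```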

### Theorem CSRST.

Column spaces, row spaces, transposes, rows, columns. Many of the connections between these objects are based on the simple observation captured in this theorem. This is not a deep result. We state it as a theorem for convenience, so we can refer to it as needed.
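The observation can be spot-checked in sympy (hypothetical matrix whose third row is the sum of the first two): a basis of the column space of A and a basis of the row space of its transpose span the same subspace.

```python
from sympy import Matrix

A = Matrix([[1, 0, 2], [2, 1, 5], [3, 1, 7]])  # hypothetical entries

# Theorem CSRST: the column space of A is the row space of A transpose.
# Compare the two spanning sets: neither adds anything to the span of
# the other (equal ranks when the bases are stacked side by side).
cols = Matrix.hstack(*A.columnspace())                # basis of C(A)
rows = Matrix.hstack(*[r.T for r in A.T.rowspace()])  # basis of R(A^t)

assert cols.rank() == rows.rank() == Matrix.hstack(cols, rows).rank()
```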

### Theorem FS.

This theorem is inherently interesting, if not computationally satisfying. Null space, row space, column space, left null space: here they all are, simply by row-reducing the extended matrix and applying Theorem BNS and Theorem BCS twice (each). Nice.
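As a small taste, here is a sketch (sympy, hypothetical rank-deficient matrix) of the left null space emerging from the extended matrix: wherever row-reducing [A | I] zeroes out a row on the left, the matching row on the right multiplies A down to zero.

```python
from sympy import Matrix, eye, zeros

# A 3x4 matrix of rank 2 (hypothetical entries: the third row is the
# sum of the first two), so the left null space is nontrivial.
A = Matrix([[1, 2, 0, 1], [2, 4, 1, 3], [3, 6, 1, 4]])
m, n = A.shape

# Row-reduce the extended matrix [A | I_m] and split it back apart.
N, _ = A.row_join(eye(m)).rref()
C, L = N[:, :n], N[:, n:]

# Rows of N where the C-portion is zero: the matching rows of L are
# elements of the left null space of A.
zero_rows = [i for i in range(m) if C.row(i) == zeros(1, n)]
assert len(zero_rows) == m - A.rank()
for i in zero_rows:
    assert L.row(i) * A == zeros(1, n)
```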