Linear Algebra and Its Applications


How much do I want to read more? 8/10

If you want to understand matrices in the most basic and playful way, this book might be it.
I wish I had had a book like this when I was studying; my classes would have made so much more sense.
It reminds me of elementary school, when everything made sense. I also had a rigorous teacher back then.
Reading this book is like going back to that time, and extending the happiness beyond it.


Preface

I personally believe that many more people need linear algebra than calculus.
When working with curved lines and curved surfaces, the first step is always to linearize. Replace the curve by its tangent line, fit the surface by a plane, and the problem becomes linear.

This subject begins with two vectors v and w, pointing in different directions. The key step is to take their linear combinations. We multiply to get 3v and 4w, and we add to get the particular combination 3v + 4w.
That new vector is in the same plane as v and w. When we take all combinations, we are filling in the whole plane. If I draw v and w on this page, their combinations cv + dw fill the page (and beyond), but they don’t go up from the page.
In the language of linear equations, I can solve cv + dw = b exactly when the vector b lies in the same plane as v and w.
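A tiny numpy sketch of that picture (using the specific vectors v = (1, 2, 3) and w = (1, 3, 4) that appear in the next section; the code is mine, not the book's):

    import numpy as np

    v = np.array([1, 2, 3])
    w = np.array([1, 3, 4])

    # One particular combination: 3v + 4w stays in the plane of v and w.
    print(3 * v + 4 * w)    # [ 7 18 25]

Every combination c*v + d*w lands in that same plane through the origin.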

Matrices

If the vectors are v = (1, 2, 3) and w = (1, 3, 4), put them into the columns of a matrix:

        [1 1]
    A = [2 3]
        [3 4]

To find combinations of those columns, “multiply” the matrix by a vector (c, d). Linear combinations cv + dw:

    [1 1] [c]     [1]     [1]
    [2 3] [d] = c [2] + d [3]
    [3 4]         [3]     [4]

Those combinations fill a vector space. We call it the column space of the matrix. (For these two columns, that space is a plane.)
To decide if b = (2, 5, 7) is on that plane, we have three components to get right. So we have three equations to solve:

    [1 1] [c]   [2]
    [2 3] [d] = [5]
    [3 4]       [7]

means:

    c +  d = 2
    2c + 3d = 5
    3c + 4d = 7
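
These particular equations have the exact solution c = 1, d = 1, so b = v + w lies on the plane. A minimal numpy check (my own sketch, not from the book):

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [2.0, 3.0],
                  [3.0, 4.0]])   # columns are v and w
    b = np.array([2.0, 5.0, 7.0])

    # Three equations, two unknowns: solve in the least-squares sense,
    # then check whether the residual is zero (b lies in the column space).
    (c, d), residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
    print(c, d)                        # 1.0 1.0, so b = 1*v + 1*w
    print(np.allclose(A @ [c, d], b))  # True: b is in the plane of v and w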

Now I can describe the first part of the book, about linear equations Ax = b. The matrix A has n columns and m rows. Linear algebra moves steadily to n vectors in m-dimensional space. We still want combinations of the columns (in the column space). We still get m equations to produce b (one for each row). Those equations may or may not have a solution. They always have a least-squares solution.
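
That last sentence is worth a small demonstration. If we perturb b so it leaves the column space, the same numpy call returns the least-squares solution instead (a hedged sketch; the perturbed vector is my own choice):

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [2.0, 3.0],
                  [3.0, 4.0]])
    b = np.array([2.0, 5.0, 8.0])   # last component changed: b is no longer in the plane

    # No exact solution exists; lstsq returns the combination closest to b.
    x_hat, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
    print(x_hat)      # the least-squares (c, d)
    print(residual)   # positive: the leftover error ||A x_hat - b||^2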

The interplay of columns and rows is the heart of linear algebra. It’s not totally easy, but it’s not too hard. Here are four of the central ideas:

  1. The column space (all combinations of the columns).
  2. The row space (all combinations of the rows).
  3. The rank (the number of independent columns, which equals the number of independent rows).
  4. Elimination (the good way to find the rank of a matrix; a quick numerical check follows this list).
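
As a tiny check of ideas 3 and 4 (using numpy's built-in rank routine rather than hand elimination; the matrix is the 3-by-2 example above):

    import numpy as np

    A = np.array([[1, 1],
                  [2, 3],
                  [3, 4]])
    print(np.linalg.matrix_rank(A))   # 2: the two columns (and two of the rows) are independent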

Structure of the Course

The two fundamental problems are Ax = b and Ax = λ x for square matrices A. The first problem Ax = b has a solution when A has independent columns. The second problem Ax = λ x looks for independent eigenvectors. A crucial part of this course is to learn what “independence” means.

For example, this matrix A does not have independent columns:

        [1 1 2]
    A = [1 2 3]
        [1 3 4]

Column 1 plus column 2 equals column 3.
A wonderful theorem of linear algebra says that the three rows are not independent either.
The third row must lie in the same plane as the first two rows. Some combination of rows 1 and 2 will produce row 3.
You might find that combination quickly (I didn’t). In the end I had to use elimination to discover that the right combination uses 2 times row 2, minus row 1.
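
Both dependence claims are easy to confirm numerically (a small sketch of my own, using the matrix above):

    import numpy as np

    A = np.array([[1, 1, 2],
                  [1, 2, 3],
                  [1, 3, 4]])

    print(np.allclose(A[:, 0] + A[:, 1], A[:, 2]))   # True: column 1 + column 2 = column 3
    print(np.allclose(2 * A[1] - A[0], A[2]))        # True: 2*(row 2) - row 1 = row 3
    print(np.linalg.matrix_rank(A))                  # 2: only two independent rows (and columns)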

Elimination is the simple and natural way to understand a matrix by producing a lot of zero entries. So the course starts there. But don’t stay there too long! You have to get from combinations of the rows, to independence of the rows, to “dimension of the row space.” That is a key goal, to see whole spaces of vectors: the row space and the column space and the nullspace.

A further goal is to understand how the matrix acts. When A multiplies x it produces the new vector Ax. The whole space of vectors moves—it is “transformed” by A. Special transformations come from particular matrices, and those are the foundation stones of linear algebra: diagonal matrices, orthogonal matrices, triangular matrices, symmetric matrices.
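
One small illustration of a special transformation (my own example, not the book's): an orthogonal matrix Q rotates vectors without changing their lengths.

    import numpy as np

    theta = np.pi / 4
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])   # 2-by-2 rotation: an orthogonal matrix

    x = np.array([3.0, 4.0])
    print(np.linalg.norm(x))      # 5.0
    print(np.linalg.norm(Q @ x))  # 5.0: the length is preserved, only the direction turns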

The eigenvalues of those matrices are special too. I think 2 by 2 matrices provide terrific examples of the information that eigenvalues λ can give. Sections 5.1 and 5.2 are worth careful reading, to see how Ax = λ x is useful. Here is a case in which small matrices allow tremendous insight.
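
In that spirit, here is a 2 by 2 example of Ax = λx (the matrix is my own pick; the book's Sections 5.1 and 5.2 have richer ones):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])              # symmetric, so the eigenvalues are real

    lam, X = np.linalg.eig(A)
    print(lam)                              # eigenvalues 3 and 1 (in some order)
    x = X[:, 0]                             # an eigenvector (a column of X)
    print(np.allclose(A @ x, lam[0] * x))   # True: A x = lambda x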

Overall, the beauty of linear algebra is seen in so many different ways:

  1. Visualization. Combinations of vectors. Spaces of vectors. Rotation and reflection and projection of vectors. Perpendicular vectors. Four fundamental subspaces.
  2. Abstraction. Independence of vectors. Basis and dimension of a vector space. Linear transformations. Singular value decomposition and the best basis.
  3. Computation. Elimination to produce zero entries. Gram-Schmidt to produce orthogonal vectors. Eigenvalues to solve differential and difference equations.
  4. Applications. Least-squares solution when Ax = b has too many equations. Difference equations approximating differential equations. Markov probability matrices (the basis for Google!). Orthogonal eigenvectors as principal axes (and more…).

I try to explain rather than to deduce. This is a book about real mathematics, not endless drill. In class, I am constantly working with examples to teach what students need.


Chapter 1 - Matrices and Gaussian Elimination

1.1 Introduction

This book begins with the central problem of linear algebra: solving linear equations. The most important case, and the simplest, is when the number of unknowns equals the number of equations. We have n equations in n unknowns, starting with n = 2:

Two equations, two unknowns:
1x + 2y = 3
4x + 5y = 6

The unknowns are x and y. I want to describe two ways, elimination and determinants, to solve these equations. Certainly x and y are determined by the numbers 1, 2, 3, 4, 5, 6. The question is how to use those six numbers to solve the system.
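
Both methods are easy to preview on this 2 by 2 system (a sketch of my own; the book develops each method properly later):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [4.0, 5.0]])
    b = np.array([3.0, 6.0])

    # Elimination (essentially what np.linalg.solve does internally, with pivoting):
    # subtract 4 times equation 1 from equation 2 to get -3y = -6, so y = 2, x = -1.
    print(np.linalg.solve(A, b))    # [-1.  2.]

    # Determinants (Cramer's rule): replace a column of A by b, divide by det(A).
    det_A = np.linalg.det(A)                                        # 1*5 - 2*4 = -3
    x = np.linalg.det(np.array([[3.0, 2.0], [6.0, 5.0]])) / det_A   #  3 / -3 = -1
    y = np.linalg.det(np.array([[1.0, 3.0], [4.0, 6.0]])) / det_A   # -6 / -3 =  2
    print(x, y)                                                     # -1.0 2.0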