
Math 322 Matrix Algebra
Prof. Readdy
University of Kentucky
Fall 2024


Lectures: MWF 9:00 - 9:50 am Lafferty 201 C

Lecturer: Prof. Readdy, 825 POT, margaret.readdy@uky.edu

Math Dept. Staff: phone 859-257-3336


Course Information

Text: David C. Lay, Steven Lay, Judi McDonald, Linear Algebra and Its Applications, 6th Edition. You may also use any recent edition.

Syllabus

Course Schedule

Homework (6th edition) and (4th or 5th edition)
Student Solutions!

Use chapter 4 and chapter 5 from the fifth edition of the textbook for the chapter 4 and chapter 5 homework.


Announcements

Mathskeller is temporarily located on the Mezzanine level of POT.


Course Diary

Week 1

Monday, August 26th
No class.
Prof. Readdy is speaking at a conference.

Wednesday, August 28th
Today we began speaking about linear algebra and its applications to huge data sets, computer graphics and error-correcting codes (think scratched CD), along with the more general question of understanding space. We discussed §1.1, systems of linear equations. We did a 2x2 example to motivate elementary row operations on matrices. We ended with an example of a linear system in three dimensions with three intersecting hyperplanes. We are starting to see that pivots are important. We also discussed the geometry behind a linear system and its solution.
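
Here, for reference, is the kind of 2x2 computation we did, written out with made-up numbers:

    \begin{aligned} x + 2y &= 5 \\ 3x + 4y &= 11 \end{aligned}
    \;\longrightarrow\;
    \left[\begin{array}{cc|c} 1 & 2 & 5 \\ 3 & 4 & 11 \end{array}\right]
    \xrightarrow{\,R_2 \mapsto R_2 - 3R_1\,}
    \left[\begin{array}{cc|c} 1 & 2 & 5 \\ 0 & -2 & -4 \end{array}\right]

The second row now says -2y = -4, so y = 2, and back-substituting gives x = 1.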

Next time we will finish up §1.1 and continue with §1.2.

Homework: Read §1.1 and do the assigned problems. Due Wednesday, Sept. 4th. (See the course information above for the link to the homework list.)

Friday, August 30th
Quiz 0. Here are some resources for reviewing this material.

We began lecture today discussing location within a matrix and the size of a matrix. We then returned to §1.1, discussing the geometry behind a linear system and its solution. We also saw there are many ways to have no solution, including parallel hyperplanes, or a situation where the hyperplanes do not all intersect simultaneously, even though they may intersect pairwise.

We then moved on to §1.2. We began discussing row echelon form and row-reduced echelon form, and discussed what echelon means. We will see this will be an easier way to view the solution space.

Next time we will finish up §1.2 and continue with §1.3.

Homework: Read §1.2 and do the assigned problems. Due Wednesday, Sept. 11th. (See the course information above for the link to the homework list.)

Flying in echelon formation
These are Lockheed T2V "SeaStars" in echelon flight formation (photographer unknown -- this appears to be an official Lockheed-Martin photo; I would like a better reference).

In this YouTube video of the Blue Angels, pay attention starting at the 1 minute 30 second mark.

Week 2

Monday, September 2nd
NO CLASS - LABOR DAY

Wednesday, September 4th
We continued with §1.2, looking at echelon and row-reduced echelon forms, and discussed what the echelon condition means for 2x2 matrices. We stated the theorem that every matrix has a unique row-reduced echelon form. We did two examples of putting a matrix in row-reduced echelon form.
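
If you would like to check a row reduction by machine, here is a small Python sketch using the sympy library (assuming sympy is installed; the matrix is a made-up example, not one from lecture):

    # Check a row-reduced echelon form computation with sympy.
    from sympy import Matrix

    A = Matrix([[1, 2, 1],
                [2, 4, 0],
                [3, 6, 1]])

    R, pivot_cols = A.rref()  # R is the unique row-reduced echelon form of A
    print(R)                  # [[1, 2, 0], [0, 0, 1], [0, 0, 0]]
    print(pivot_cols)         # (0, 2): the pivot columns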

We then began §1.3, vector equations, beginning with how to sketch vectors, and how to add vectors algebraically and then geometrically. If you think of vectors as pushing chairs around, then there are no rules to memorize. We went through half of the axioms for a vector space. We'll continue with those next time.

Homework: Finish reading §1.2 and do the homework. Read §1.3 and start the homework.

Friday, September 6th
Today we finished reviewing the axioms for a vector space in R^n. We proved one of the axioms, commutativity, follows from the usual commutativity of addition of real numbers. We also discussed linear combinations of vectors and the span of a set of vectors, and looked at a few examples of each. I also defined linear independence and linear dependence. To be continued...

Last day to register to vote in Kentucky is October 7, 2024 at 4 pm. See govoteky

Homework: Read § 1.3 and start the homework.

Week 3

Monday, September 9th
Today we returned to the ideas of span, linear dependence and linear independence. We are also beginning to hint at the idea of a subspace: I hinted that span{[1 0 1]^T, [0 1 0]^T} is an example of a 2-dimensional subspace of 3-dimensional space.

I then began §1.4 the matrix equation Ax = b. I stated when we can multiply two matrices A and B (A must be mxk and B must be kxn). I also did one example of the dot product of two vectors. To be continued...

Homework: Finish §1.3 homework. Due Monday, September 16th! Start to read §1.4 in preparation for Wednesday's lecture.

Wednesday, September 11th
§1.2 homework due.

Today is section 1.4, the matrix equation Ax = b. We defined matrix multiplication by first defining the dot product of two vectors. In this way the (i,j) entry of the product AB is the dot product of the ith row of A with the jth column of B. We also discussed anticipating the size of a matrix product and did a few examples.
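
A small numpy sketch of this dot-product view of matrix multiplication (the matrices are made-up examples):

    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])      # 2x3
    B = np.array([[1, 0],
                  [0, 1],
                  [1, 1]])         # 3x2

    AB = A @ B                     # (2x3)(3x2) gives a 2x2 product
    # The (0,1) entry of AB is the dot product of row 0 of A with column 1 of B:
    print(AB[0, 1] == np.dot(A[0, :], B[:, 1]))   # True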

The main result in this section is that solving Ax = b can be viewed as finding the scalars (the entries of x) so that the vector b is a linear combination of the columns of the matrix A.

We ended with an example of determining for a given matrix A what possible vectors b can we have so that Ax = b has a solution. The example we computed showed that any vector b will work. Next time we will compute an example where the possible vectors are restricted.

Homework: Read §1.4 and do the homework. Due Wednesday, September 18th.

Friday, September 13th
Quiz 2 today!
Today we started with an example where Ax = b only has a solution for restricted vectors b. In contrast, we saw that Ax = b has a solution for any vector b if there is a pivot in every row of A.

We discussed the four equivalences for an mxn matrix A: Ax = b has a solution for every vector b <==> every vector b is a linear combination of the columns of A <==> the columns of A span m-dimensional Euclidean space <==> A has a pivot in every row.

We then began §1.5, looking at solutions to the homogeneous equation Ax = 0. Our example had solution set span{[0 1 -1]^T}, which is a line in R^3. To be continued...

Homework: Finish the §1.4 homework! Read §1.5 and start the homework.

Week 4

Monday, September 16th
§1.3 homework due
We continued with §1.5, looking for solutions of the homogeneous equation Ax = 0. We saw that we always have the trivial solution, namely when x is the zero vector. The example from last time had the solution span{[0 1 -1]^T}. Today's second example of solving Ax = 0 had only the trivial solution. The third example (one equation with 3 unknowns) had one pivot column and two non-pivot columns; its solution space was the set of linear combinations of two vectors.

Pivots are entering the scene now. We will see many facts and much information arise from knowing the pivots.

We ended by looking at a 1x3 matrix A. We saw that the solutions to Ax = b, where b is a *non-zero* vector, are the solutions to the homogeneous equation Ax = 0 (the number of non-pivot columns gives the dimension) shifted by a particular solution vector.
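
A sympy sketch of this picture, using a made-up one-equation system (not the one from lecture):

    from sympy import Matrix, symbols, linsolve

    x1, x2, x3 = symbols('x1 x2 x3')
    A = Matrix([[1, 2, -1]])   # one equation in three unknowns
    b = Matrix([4])

    # Two non-pivot columns, so the homogeneous solutions form a 2-dimensional span:
    print(A.nullspace())
    # Solutions to Ax = b form a 2-parameter family: a particular solution
    # plus that same homogeneous span.
    print(linsolve((A, b), x1, x2, x3))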

Homework: Finish reading §1.5 and do the homework. Read §1.7 for Wednesday.

Wednesday, September 18th
§ 1.4 homework due
We again discussed linear dependence and independence. I formally introduced the standard basis vectors. In 3 dimensions these are the vectors e1 = [1 0 0]^T, e2 = [0 1 0]^T and e3 = [0 0 1]^T. (Here T indicates the transpose of the row vector to make it a column vector.)

We then began talking about transformations (section 1.8) and went over many examples, most of them geometrically inspired. To be continued...

Homework: Do §1.7 homework. Read §1.8.

Friday, September 20th
Quiz 3 today.
We continued looking at many examples of transformations today. We found in an ad hoc fashion the matrix representation of all the examples in our lecture. I then gave the official definition of a linear transformation. We proved that if T is a linear transformation then T always maps the zero vector to the zero vector. We also proved that T(a u + b v) = a T(u) + b T(v) for any scalars a and b and vectors u and v.

Homework: Do §1.8 homework. Read §1.9 for Monday.

Week 5

Monday, September 23rd
§1.5 homework due. Today I began lecture by verifying that the shear transformation example satisfies the two axioms for being a linear transformation.

We then moved on to §1.9. We stated and gave a constructive argument for finding the standard matrix of a linear transformation: the standard matrix A of a linear transformation exists and is unique. We looked at a few examples, and also considered the matrix of the linear transformation rotating by an angle θ in the counterclockwise direction. We will return to this next time...
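
For reference, here is where that computation is headed: e1 is sent to (cos θ, sin θ) and e2 is sent to (-sin θ, cos θ), so the standard matrix is

    A = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}.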

Homework: Do §1.9 homework. Start reading §2.1.

Wednesday, September 25th
§ 1.7 homework due
We returned to the example of rotating R^2 by an angle θ counterclockwise. We mentally checked that this preserves vector addition and scalar multiplication. Using the theorem we proved last time from §1.9, we wrote down the matrix in the case of a π/2 counterclockwise rotation by just checking where the vectors e1 and e2 are sent.

We reviewed what it means for a transformation to be 1-1 (injective) and onto (surjective). For the case of a *linear* transformation, we stated that T is 1-1 if and only if T(v) = 0 implies v = 0. We then proved this in gory detail.

I also defined the kernel of a transformation. For the case of *linear* transformations, we proved equivalences involving a transformation being 1-1, being onto, and having kernel equal to the zero vector.

The Story behind the Avignon Bridge by Leslie Farnsworth.

Homework: Read § 1.9 and finish the homework! Start reading §2.1 for next time.

Friday, September 27th
Quiz 4 today on Week 4 lectures, via Canvas.
The quiz will be available at 9:35 am.

(Recorded lecture due to Tropical Storm Helene.)
After a brief preview of the determinant, we began discussing matrix operations and their properties from section 2.1. We saw that taking powers of diagonal matrices is easy! To be continued...
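
A quick numpy illustration of why powers of diagonal matrices are easy -- you just raise each diagonal entry to the power (made-up entries):

    import numpy as np

    D = np.diag([2.0, 3.0, 5.0])

    # Powering a diagonal matrix simply powers its diagonal entries:
    print(np.linalg.matrix_power(D, 4))        # diagonal 16, 81, 625
    print(np.diag([2.0**4, 3.0**4, 5.0**4]))   # the same matrix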

Today's lecture on §2.1 (part 1) (youtube)

Homework: Watch and take notes on the recorded lecture. Read §2.1 and start the homework.

Week 6

Monday, September 30th
§1.8 homework due.
Continued with properties of the algebra of matrices today (§2.1). We saw that AB and BA are in general not the same (except in lucky situations, such as when A = rB for some scalar r...). We proved (AB)^T = B^T A^T today, finishing up §2.1. Beginning §2.2, we described the algorithm for finding the inverse of a square matrix and computed an example. We also derived the inverse of a general 2x2 matrix, though there are two subcases to check. To be continued...
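
Here is the inverse algorithm from §2.2 as a sympy sketch: row reduce the block matrix [A | I] until the left block is I, and read off A^(-1) from the right block (the matrix is a made-up invertible example):

    from sympy import Matrix, eye

    A = Matrix([[2, 1],
                [5, 3]])            # det = 1, so A is invertible

    augmented = A.row_join(eye(2))  # the block matrix [A | I]
    R, _ = augmented.rref()         # row reduces to [I | A^(-1)]
    A_inv = R[:, 2:]                # the right-hand block
    print(A_inv)                    # [[3, -1], [-5, 2]]
    print(A * A_inv == eye(2))      # True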

Homework: Read section 2.1 and finish 2.1 homework. Read section 2.2 and start 2.2 homework.

Wednesday, October 2nd
Today we derived the inverse of a general 2x2 matrix. Emerging from this proof was the fact that the determinant must be nonzero in order for the matrix inverse to exist; when the determinant is zero, the columns of the original matrix are linearly dependent. We will see that for a matrix to be invertible, two of the many equivalences are that its determinant is nonzero and that its columns are linearly independent.
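
For reference, the formula we derived: for

    A = \begin{bmatrix} a & b \\ c & d \end{bmatrix},
    \qquad
    A^{-1} = \frac{1}{ad-bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}
    \quad\text{provided } ad - bc \neq 0,

and when ad - bc = 0 the inverse does not exist.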

We ended by moving on to section 2.3, where we stated 11 equivalences for a square matrix to be invertible.

Homework: Read section 2.2 and do the homework!

Friday, October 4th
Quiz 5 today via canvas.
Online lecture:

Computing inverse for a 3x3 matrix: Example

§2.3 part 1 Plan of action for the big theorem on matrix invertibility: Lecture

Start of the proofs. To be continued on Monday... Please watch the video below for all of the arguments. Most of these arguments are a review of material we have covered; some teach matrix techniques that will come in handy.

§2.3 part 1 Lecture
§1.9 homework due.

Week 7

Monday, October 7th
§2.1 homework due.
Last day to register to vote in Kentucky is TODAY, October 7, 2024, at 4 pm. See govoteky

Finished proofs of the various equivalences for a matrix to be invertible. Started block matrices from §2.4.

Homework: Finish § 2.3 homework. Start §2.4.

Wednesday, October 9th
§2.2 homework due.
Finished §2.4 on block matrices. We saw that even if the submatrices are not all square, the dimensions take care of themselves.
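
A quick numpy sketch of assembling a block matrix (np.block stitches together compatible blocks; the blocks here are made-up):

    import numpy as np

    A = np.ones((2, 2))
    B = np.zeros((2, 3))
    C = np.zeros((1, 2))
    D = np.ones((1, 3))

    # The blocks need not be square; the sizes only have to match up
    # along each block row and block column.
    M = np.block([[A, B],
                  [C, D]])
    print(M.shape)   # (3, 5)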

We then began §2.5, the A = LU decomposition, and how to use it to solve Ax = b via L(Ux) = b. (Solve Ly = b first...) Next time: how to find an LU decomposition.
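
A scipy sketch of this two-step solve (lu_factor computes the factorization, with row swaps, and lu_solve then does the two triangular solves; the system is a made-up example):

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[2.0, 1.0],
                  [4.0, 5.0]])
    b = np.array([3.0, 6.0])

    lu, piv = lu_factor(A)         # factor A once (with partial pivoting)
    x = lu_solve((lu, piv), b)     # each new b costs only two triangular solves
    print(np.allclose(A @ x, b))   # True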

Homework: Finish the section 2.4 homework. Read §2.5 for next time. Next Friday's exam will be on chapters 1 and 2.

Friday, October 11th
Quiz 6 today.
Continued §2.5 today on the A = LU decomposition. We saw that U is just the upper triangular matrix formed by row-reducing A, and L is built from the *inverses* of the row operations used to go from A to U.

We also discussed what happens if you switch rows while row-reducing A, giving PA = LU, where P is a permutation matrix, as well as the matrix form A = PDU.

Matrix decompositions in general are very useful in speeding up computations. The text has some numerical notes that you should read.

Review sheet for Exam I
Answers

Homework: Read §2.5 and finish the exercises. Study for Exam I.

Week 8

Monday, October 14th
§2.3 homework due.

Today we began speaking about the determinant via the cofactor definition (section 3.1). We then discussed various properties of the determinant. In general, the cofactor method of computing the determinant is time-consuming, so having all of these properties speeds things up considerably. I did a few examples of them in action. To be continued...

Homework: Be sure to finish the homework up through section 2.5 and do the review sheet so that you are ready for the exam on Friday.
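
Here, for reference, is the cofactor expansion along the first row of a made-up 3x3 determinant:

    \det\begin{bmatrix} 1 & 2 & 0 \\ 3 & 1 & 2 \\ 0 & 1 & 1 \end{bmatrix}
    = 1\begin{vmatrix} 1 & 2 \\ 1 & 1 \end{vmatrix}
    - 2\begin{vmatrix} 3 & 2 \\ 0 & 1 \end{vmatrix}
    + 0\begin{vmatrix} 3 & 1 \\ 0 & 1 \end{vmatrix}
    = 1(-1) - 2(3) + 0 = -7.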

Wednesday, October 16th
§2.4 homework due.
Review Day.

Friday, October 18th
Exam 1 in-class.

Week 9

Monday, October 21st
§2.5 homework due.
Reviewed properties of the determinant today and did a few examples of them in action. I then defined the symmetric group and gave the symmetric group definition of the determinant. This shows that the determinant of an nxn matrix has n! terms. (Thus the engineer's dream way of computing a determinant fails for matrices that are 4x4 and larger.)

I then gave the real and fancy definition of the determinant: the unique alternating multilinear function with det(I) = 1. This real definition compactly encodes the properties of the determinant -- alternating means the determinant switches sign if you exchange two rows, and multilinear means the determinant acts like a linear function in each row separately: keep all rows fixed except one, and you can slide a constant out of that row, etc.

Homework: Finish §3.1 homework and §3.2 homework. Quickly read §3.3 to prep for next time.

Wednesday, October 23rd
We went over an example of Cramer's rule today, and then proved it, starting with the easy case A = I. We showed that solving the general case of Ax = b by row reduction leaves the ratios in Cramer's rule unchanged, hence the proof reduces to the case A = I, which we had already done.
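
A quick worked instance of Cramer's rule on a made-up 2x2 system:

    A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix},\quad
    b = \begin{bmatrix} 5 \\ 6 \end{bmatrix}:
    \qquad
    x_1 = \frac{\begin{vmatrix} 5 & 2 \\ 6 & 4 \end{vmatrix}}
               {\begin{vmatrix} 1 & 2 \\ 3 & 4 \end{vmatrix}}
        = \frac{8}{-2} = -4,
    \qquad
    x_2 = \frac{\begin{vmatrix} 1 & 5 \\ 3 & 6 \end{vmatrix}}
               {\begin{vmatrix} 1 & 2 \\ 3 & 4 \end{vmatrix}}
        = \frac{-9}{-2} = \frac{9}{2}.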

Homework: Read §3.3 and do the homework.

Friday, October 25th
Quiz 7 today.
§3.1 homework due.
We began section 4.1 on vector spaces in one, two and three-dimensional Euclidean space. Many of the vector space axioms are slight generalizations of properties you learned about numbers when you were a toddler.

We also discussed subspaces. Most of the axioms we do not have to check since subspaces inherit these properties.

Currently physicists say we live in a 10-dimensional space. You can read about this at phys.org. These various dimensions have interesting names: 1st dimension: length
2nd dimension: height
3rd dimension: depth
4th dimension: time
10th dimension: the point of infinite possibilities!
For a cosmic article about this, see here

Homework: Read §4.1 and start the homework.

Week 10

Monday, October 28th
NO CLASS -- FALL BREAK

Wednesday, October 30th
§3.2 homework due.
Finished §4.1 today. We discussed some interesting examples of vector spaces, including the polynomials in x of degree at most 2 with real coefficients, and the polynomial ring R[x] (infinite-dimensional...). We also hinted at the idea of isomorphism (from the Greek iso = same, morphe = structure/shape). Roots of this word appear in isotope (as in Carbon-12, Carbon-13 and Carbon-14) and isosceles triangle (two sides are the same).

Friday, November 1st
§3.3 homework due. Quiz today!
Began §4.2, going over the nullspace, row space and column space of a matrix. These are three of the *four* subspaces associated with a matrix; the fourth one is the left nullspace. We will continue discussing the 2x3 example on Monday, linking together these four subspaces.

Homework: Read §4.2 and do the homework.

Week 11

Monday, November 4th
Continued with §4.2 today, going over the *four* subspaces associated with a matrix: the nullspace, row space and column space, plus the remaining one, the left nullspace, which is Nul(A^T).

I defined the inner product of two vectors (also known as the dot product or scalar product). In chapter 6 we will see that this gives a test for two vectors to be perpendicular. I hinted at the result that the row space and nullspace are perpendicular subspaces, and likewise that the column space and left nullspace are perpendicular subspaces.

I also hinted strongly at one of the important results in linear algebra, namely that dim(Col(A)) + dim(Nul(A)) = # columns of A. The other main results we will show are dim(Row(A)) = dim(Col(A)) = # pivots in A, and dim(Nul(A^T)) + dim(Col(A)) = # rows of A. We will soon return to these more formally.

Note that when I started to discuss §4.2, I proved that the span of any vectors forms a subspace. This makes it much easier to conclude the column space and row space are subspaces.

We then began §4.3 Linear independent sets and bases. After recalling the definition of a linearly independent set of vectors, I defined what a basis for a vector space is. We did the example of the standard basis for Rn. To be continued...

Homework: Read 4.2 and finish the homework. Read 4.3.

Wednesday, November 6th
Continued with bases today, looking at the examples of a polynomial basis for Pn and a basis for the 2x3 matrices. I then stated the main theorem for obtaining a basis from a spanning set of vectors. This theorem is very constructive -- you keep adding one vector at a time that is not in the span of the previously added vectors.

We then moved on to §4.4 on coordinates. We showed that, given a basis for a vector space, the coordinates of a vector with respect to that basis are unique. We also discussed the matrix which converts from one basis to another. It is easy to find the matrix converting vectors in a new basis B to the standard basis; the reverse operation, going from the standard basis to the new basis, is done by taking the inverse of this matrix. I also gave the formal definition of two vector spaces being isomorphic: you need a linear map between them that is both 1-1 and onto.

Homework: Read §4.3 and do the exercises. Read §4.4 and do the exercises.

Friday, November 8th
§4.5 on dimension today. Note I do things a little differently than the text: first I prove that any two bases for a vector space have the same number of elements; then we can define dimension as the number of elements in any basis for the vector space. We then counted the dimensions of various vector spaces, including R^n, the mxn matrices, triangular matrices and symmetric matrices. As a corollary of the fact that any two bases have the same number of elements, any set with more than n vectors in an n-dim'l vector space is automatically linearly *dependent*.

We also defined the rank of a matrix as the dimension of the column space. More next time...

4.5 Dimension (video starts at 4:28 due to issues with iPad)

Homework: Read §4.5 and do the exercises.

Week 12

Monday, November 11th
Quiz today!! Material: §4.1 on vector spaces and subspaces. (I could ask you to verify a collection of vectors is a subspace...) Also know the basic definitions of row space, column space and nullspace.

We then began §4.6 by defining the rank of a matrix as the dimension of the column space of the matrix. This invariant will imply many fundamental facts about a matrix. We showed that if A ~ B, then Row(A) = Row(B). We proved that for an mxn matrix A, the dimensions of Row(A) and Col(A) are equal, and both equal rank(A). We also proved rank(A) + dim(Nul(A)) = n, after realizing this is just a fancy way to count the columns of a matrix according to whether or not each contains a pivot. NOTE: All of the theorems involving rank are *central* to linear algebra. Please learn them!
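
A sympy sketch verifying rank(A) + dim(Nul(A)) = n on a made-up 3x4 matrix:

    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1],
                [0, 0, 1, 1],
                [1, 2, 1, 2]])        # 3x4, so n = 4 columns

    rank = A.rank()                   # number of pivots, here 2
    nullity = len(A.nullspace())      # dimension of Nul(A), here 2
    print(rank + nullity == A.cols)   # True: pivot columns + free columns = n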

Homework: Read §4.6 and do the homework.

Wednesday, November 13th
While I returned graded papers, the class worked on finding dimensions and bases associated with a given matrix. I then began speaking about changing bases (§4.7). We will continue this on Friday.

Friday, November 15th
Quiz 10 today.
Today we discussed changing bases (section 4.7). If you can remember how to change from the basis B to the standard basis, and from the basis C to the standard basis, then going from the basis B to the basis C is easy. (Just use the matrix C^(-1)B.) An easy way to compute this is to row reduce [C|B] to [I|C^(-1)B]. Amazing and fabulous. Please watch the order.
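
A sympy sketch of the [C|B] → [I|C^(-1)B] recipe (B and C below are made-up bases of R^2, stored as columns):

    from sympy import Matrix

    B = Matrix([[1, 1],
                [0, 1]])          # columns are the basis B
    C = Matrix([[1, 0],
                [1, 1]])          # columns are the basis C

    R, _ = C.row_join(B).rref()   # row reduce the block matrix [C | B]
    P = R[:, 2:]                  # right block: the change of basis matrix
    print(P == C.inv() * B)       # True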

Review sheet for Exam II
Answers

Homework: Read §4.7 and do the exercises. Study for Exam II.

Week 13

Monday, November 18th
§5.1 Eigenvalues and eigenvectors.

Homework: Read §5.1 and do the exercises.

Wednesday, November 20th
Review for exam.

Friday, November 22nd
Exam 2 in-class.

Week 14

Monday, November 25th
CLASS CANCELLED DUE TO GRADING

Wednesday, November 27th
NO CLASSES - THANKSGIVING HOLIDAY

Friday, November 29th
NO CLASSES - THANKSGIVING HOLIDAY

Week 15

Monday, December 2nd
§5.1 and 5.2 today.

Wednesday, December 4th
§5.3 Diagonalization.

Friday, December 6th
§5.5 Complex eigenvalues and eigenvectors.

Week 16

Monday, December 9th
Continued §5.5 on complex eigenvalues and eigenvectors. We then discussed diagonalizing a matrix which has complex eigenvalues: you just proceed as before, except we allow the matrices in the A = PDP^(-1) decomposition to have complex entries. (This will be useful when you take Diff'l Equations.) We also showed that the text's way of handling the complex eigenvalue case, A = PCP^(-1), where λ = a - ib is an eigenvalue giving the matrix C with first column [a b]^T and second column [-b a]^T, corresponds to the geometry of the complex eigenvalue λ -- giving a rotation and a dilation/expansion.
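
In matrix form, the geometry is a rotation-scaling factorization: with r = |λ| = sqrt(a^2 + b^2) and φ the angle of rotation,

    C = \begin{bmatrix} a & -b \\ b & a \end{bmatrix}
      = r\begin{bmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{bmatrix}.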

Described the Gram-Schmidt algorithm (chapter 6).
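
For the curious, a minimal Python sketch of the classical Gram-Schmidt algorithm (assuming the input vectors are linearly independent):

    import numpy as np

    def gram_schmidt(vectors):
        """Orthonormalize linearly independent vectors (classical Gram-Schmidt)."""
        basis = []
        for v in vectors:
            # subtract off the projections onto the vectors found so far
            w = v - sum(np.dot(v, q) * q for q in basis)
            basis.append(w / np.linalg.norm(w))
        return basis

    q1, q2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
    print(np.dot(q1, q2))   # ~0: the output vectors are orthogonal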

Wednesday, December 11th
Last Day of Classes. Finished Gram-Schmidt today.
Discussed the tennis ball problem.

Rest of semester review sheet

Gram-Schmidt cheat sheet (Will be provided at the final exam.)

Finals Week

Wednesday, December 11th
Final Exam
8:00 am - 10:00 am
The final exam is cumulative.


Calendar (Things to do when you are not studying Matrix Algebra)

Math Club


Last updated: Thursday, December 12, 2024.
© 2000 Margaret A. Readdy.