Linear Algebra for Math Majors

Welcome to **Linear Algebra for Math Majors**! This is a rigorous, proof-based linear algebra class. The difference between this class and *Linear Algebra for Non-Majors* is that we will cover many topics in greater depth, and from a more abstract perspective. There will be a correspondingly smaller emphasis on computation in this class, and greater expectations for proof-writing and abstraction.

This course has high expectations. You should plan to spend **9 hours per week** on this class, *not including lecture*. It will also be necessary to supply **independent motivation** as *not all of the work you need to do for this class will be collected, or even assigned*. It is also essential to **recognize early** when you are struggling with a concept and **discuss it with me**.

Above all, you must **engage actively** with the material as we learn it. If you are studying actively, you will have questions. Use this principle to measure whether you are actively engaged.

The first thing to do when joining this course is to make sure these expectations align with your goals.

I encourage you to consult outside sources, use the internet, and collaborate with your peers. However, there are important rules to ensure that you use these opportunities in an academically honest way.

- Anything with your name on it must be your work and accurately reflect your understanding.
- Plagiarism will be dealt with harshly. Deliberate plagiarism will be reported to the Honor Code Office.

To avoid plagiarism, you should always **cite all resources you consult**, whether they are textbooks, tutors, websites, classmates, or any other form of assistance. Using others' words verbatim, without attribution, is absolutely forbidden, but so is using others' words with small modifications. *The ideal way to use a source is to study it, understand it, put it away, use your own words to express your newfound understanding, and then cite the source as an inspiration for your work.*

Do you have a question or comment about the course? The answer might be in the course policies, on this page. If your question isn't answered in the course policies, please send me an email. Or, if you prefer, you may send me a comment anonymously.

Instructor: Jonathan Wise

e-mail: jonathan.wise@colorado.edu

Office: Math 204

Office hours: calendar

Phone: 303 492 3018

My office is **Room 204** in the Math Department. My office hours sometimes change, so I maintain a calendar showing the times I will be available. I am often in my office outside of those hours, and I'll be happy to answer questions if you drop by outside of office hours, provided I am not busy with something else. I am also happy to make an appointment if my office hours are not convenient for you.

We will use several texts in this class. Most of them are available online, except for the following one, which is out of print:

Hans Samelson. *An introduction to linear algebra*.

We will use the following text more in the first third to half of the course than we will towards the end. It is available for free via the University of Colorado's subscription to SpringerLink.

Paul R. Halmos. *Finite-dimensional vector spaces*

The next two texts are both available for free online. We may or may not make explicit use of them in class, but you could find them useful regardless:

Jim Hefferon. *Linear algebra*, 3e.

Sergei Treil. *Linear Algebra done Wrong*.

Here are a few other textbooks I have used in the past, but that I don't plan to rely on explicitly in this course. You may find them useful if you are looking for another perspective.

Stephen H. Friedberg, Arnold J. Insel, and Lawrence E. Spence. *Linear algebra*, 4e.

Sheldon Axler. *Linear Algebra done Right*, 3e.

- Reproduce the definition or theorem. Execute the algorithm.
- Recognize examples and nonexamples of the definition. Recognize settings where the theorem applies or doesn't apply. Understand how the algorithm works.
- Write short proofs using the definition. Identify opportunities to use the algorithm. Use multiple ideas in the same problem.
- Write long proofs using multiple definitions, theorems, and algorithms. Know and understand the proof of the theorem. Prove that the algorithm does what it is supposed to do. Apply the algorithm in novel situations. Adapt the algorithm to solve different problems.

Your grade will be the average of your scores on approximately 4 in-class quizzes, approximately 4 problem sets, and the final exam. Quizzes and problem sets will count equally, and the final exam will count as the equivalent of 2 quizzes.

The scoring on these assessments will be based on the goals listed above. Notably, your score will not be a simple sum of point values from each problem, but will instead be my overall assessment of the degree to which you have achieved the course's goals on the relevant topics.

At the end of the semester, I would like your final grade to reflect your mastery of the course material. Exams and problem sets do not always measure this optimally, so you will be allowed to revise your scores by the following process:

1. Decide which score you wish to revise.
2. Identify the topics that were assessed (for example, from the course outline) and put these in a list to be handed in with your revision. You may want to clear your list with me before going on to the next step.
3. Find or devise a list of problems that you can use to demonstrate your mastery of those topics. Again, you may want to discuss these with me.
4. Solve those problems and submit your solutions to me.

I will assign a replacement grade based on your submission.

As a practical matter, I insist that your revisions be submitted within two weeks of the due date of the original assignment. This is meant to prevent an influx of revisions at the end of the semester, when I will not have time to look at all of them.

The revision policy will also be used to address missed assignments and exams. No grades will be dropped, apart from those replaced by revisions.

The following is an abbreviated list of topics covered in this course (the definitive list is in the assignments below) and may be a good guide for topics that will be addressed on the exam.

- Fields (Halmos, §1; Hefferon, pp. 145–146)
- The vector spaces \( F^n \) of column vectors and \( F_n \) of row vectors
- Spans of vectors in vector spaces
- Determining membership in a span
- Solving linear equations by row operations (Gaussian elimination) (Samelson, §2.4; Hefferon, Ch. 1, §III)
- Calculating minimal spanning subsets by column operations
- Definition of a vector space (Samelson, §1.1–1.2; Halmos, §2–3)
- Abstract examples of vector spaces (polynomials, function spaces)
- Axioms of a subspace (Samelson, §2.1)
- Linear independence and dependence (Samelson, §2.3)
- Definition of a basis (Samelson, §3.1)
- Coordinates of a vector in a basis (Samelson, §2.4)
- Dimension (Samelson, §3.1, 3.2)
- Definition of the dual vector space (Samelson, §5.1)
- Dual basis (Samelson, §5.1, 5.2)
- Definition and examples of linear transformations (Samelson, §8.1; Hefferon, Ch. 3, §II.1)
- Isomorphism of vector spaces (Halmos, §9; Hefferon, Ch. 3, §I)
- Matrix representation of a linear transformation in a basis (Halmos, §37–38; Hefferon, Ch. 3, §III; Samelson, §8.2)
- Change of basis matrices
- Matrix multiplication (Samelson, §1)
- Kernel and image of linear transformations (Samelson, §8.3)
- Rank-nullity theorem (Samelson, §8.4)
- Criteria for injectivity and surjectivity of a linear transformation (Samelson, §8.6)
- Definition of determinants (Samelson, §§7.1, 7.2)
- Calculating determinants using permutations, row or column expansion, or by row and column operations
- Relationships between determinants of real matrices and volume and orientation (Samelson, §§7.1, 7.4)
- Relationships between determinants, invertibility, and independence of rows and columns
- Determinants of products of matrices
- Definition of eigenvalues and eigenvectors (Samelson, §9.1)
- Eigenvalues and the characteristic polynomial (Samelson, §9.2)
- Finding eigenvectors and diagonalizing matrices (Samelson, §9.3)
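Several of the computational topics above (solving linear systems by row operations, determining membership in a span) can also be explored on a computer. As an illustration only — Python is not part of this course, and the helper `solve` below is my own sketch, not course material — here is Gaussian elimination with partial pivoting and back substitution:

```python
def solve(A, b):
    """Solve A x = b by row operations (partial pivoting), then back substitution."""
    n = len(A)
    M = [row[:] + [entry] for row, entry in zip(A, b)]  # augmented matrix [A | b]
    for col in range(n):
        # Swap in the row with the largest entry in this column (partial pivoting).
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Row operations: clear the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    # Back substitution on the triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# x + y + z = 6, 2y + 5z = -4, 2x + 5y - z = 27 has the unique solution (5, 3, -2).
print(solve([[1, 1, 1], [0, 2, 5], [2, 5, -1]], [6, -4, 27]))
```

The same elimination, applied to the matrix whose columns are a spanning set augmented by a target vector, answers a span-membership question: the system is consistent exactly when the vector lies in the span.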

- The **final exam** is on **Sunday, December 16** from 1:30pm until 4pm in our usual classroom, MUEN E064.
- The final problem set is due on Wednesday, December 12.
- The third problem set was due on Wednesday, November 28. Revisions will be accepted until the end of the semester.
- The second problem set was due on Friday, November 2. You may submit it in class or on Canvas. If you are writing your solutions using LaTeX, you may want to start with my tex file.
- The second quiz was in class on Friday, October 12. Grades were assigned according to the goals, which I interpreted as follows:
- I expected you to be able to solve Problem 1 to receive a D using row or column operations or a combination of the two. I was looking here to see if you knew what sorts of operations could be used to test for linear independence.
- To receive a C, you needed to show that you understood what sorts of things are preserved by row and column operations. There were opportunities to do this on Problems 2 and 3.
- To receive a B, you should have been able to articulate why your answers in Problem 2 were correct. I also wanted to see a basic setup of a proof in Problem 4, even if it was not correct in all particulars.
- An A required essentially correct answers to all questions. I was willing to overlook arithmetic errors or minor omissions provided you demonstrated a good understanding otherwise.

- The first graded problem set was due on Monday, October 1.
  **Revisions of the first problem set are due on Monday, October 22.** Grades were assigned according to the goals, which I interpreted as follows:
- The first quiz was in class on Friday, September 14. Revisions were due on Friday, September 28. Grades were assigned according to the goals, which I interpreted as follows:
- Problem 4 was very similar to problems considered in class, and we set up an algorithm to solve such problems. I expected you to be able to solve this problem to receive a D, although I was willing to overlook arithmetic errors and other minor mistakes if you were able to demonstrate your understanding in other problems.
- I assigned a final grade of C to papers that demonstrated an ability to use the ideas we discussed in class in more novel situations. Opportunities to do this appeared in Problem 1, which required you to divide complex numbers, and in Problem 5, which could be approached in many different ways using linear systems of equations or spans.
- You needed to be able to write a correct proof in Problem 3 to receive a B, as this was the only problem that involved proof-writing. I would have considered a good solution to Problem 2 as a substitute, but this did not come up.
- To get an A required good solutions or near-solutions to all of the problems.

- Read Samelson, Chapter 9, §§1–3 (pp. 147–155). Prove that every \( n \times n \) real matrix can be connected by a path to either \( \begin{pmatrix} 1 \\ & 1 \\ & & \ddots\\ & && 1 \\ &&&& 1 \end{pmatrix} \) or to \( \begin{pmatrix} 1 \\&1\\&&\ddots\\&&&1\\&&&&-1 \end{pmatrix} \). (If this is challenging, try doing it first for \( 2 \times 2 \) and then for \( 3 \times 3 \) matrices.) Prove that the determinant is multilinear and alternating: \( \det \begin{pmatrix} \vec u^1 & \cdots & \vec u^{i-1} & \vec v + \vec w & \vec u^{i+1} & \cdots & \vec u^n \end{pmatrix} = \det \begin{pmatrix} \vec u^1 & \cdots & \vec u^{i-1} & \vec v & \vec u^{i+1} & \cdots & \vec u^n \end{pmatrix} + \det \begin{pmatrix} \vec u^1 & \cdots & \vec u^{i-1} & \vec w & \vec u^{i+1} & \cdots & \vec u^n \end{pmatrix} \) and \( \det \begin{pmatrix} \vec u^1 & \cdots & \vec u^{i-1} & c.\vec u^i & \vec u^{i+1} & \cdots & \vec u^n \end{pmatrix} = c \det \begin{pmatrix} \vec u^1 & \cdots & \vec u^n \end{pmatrix} \) and \( \det \begin{pmatrix} \vec u^{\tau(1)} & \cdots & \vec u^{\tau(n)} \end{pmatrix} = \operatorname{sgn}(\tau) \det \begin{pmatrix} \vec u^1 & \cdots & \vec u^n \end{pmatrix} \).
- Read Samelson, Chapter 7, §§2–3. Do Samelson, Chapter 7, §2, #2, 3, 4b, 5.
**Problem Set #4 is due on Wednesday, December 12.**
- Do Samelson, Chapter 7, §1, #4 (p. 104) and Chapter 7, §3, #2 (you have many methods to do this one, but you should do it using the determinant). In class, we connected the matrix \( \begin{pmatrix} 5 & 2 \\ 1 & 1 \end{pmatrix} \) to the matrix \( \begin{pmatrix} 3 & 2 \\ 0 & 1 \end{pmatrix} \) by the continuous path of matrices \( \begin{pmatrix} 5 - 2t & 2 \\ 1 - t & 1 \end{pmatrix} \). Connect \( \begin{pmatrix} 3 & 2 \\ 0 & 1 \end{pmatrix} \) to \( \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix} \) by a continuous path of matrices and then connect \( \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix} \) to \( \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \) by a continuous path of matrices. Explain why every matrix in each of these paths is invertible.
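For the path of matrices we used in class, the determinant can be checked directly: \( \det \begin{pmatrix} 5 - 2t & 2 \\ 1 - t & 1 \end{pmatrix} = (5-2t) - 2(1-t) = 3 \) for every \( t \). A small numerical check of this (my own illustration, not part of the assignment):

```python
def det2(M):
    """Determinant of a 2x2 matrix given as a list of rows."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def path(t):
    """The continuous path of matrices from class, at time t."""
    return [[5 - 2 * t, 2], [1 - t, 1]]

# The determinant is constantly 3 along the path, so no matrix on it is singular.
for t in [0, 0.25, 0.5, 0.75, 1]:
    assert abs(det2(path(t)) - 3) < 1e-12
```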

  The transpose of a matrix is obtained by flipping the matrix across its diagonal. If the entry of \( M \) in the \( i \)-th row and \( j \)-th column is \( a_{ij} \), then the entry of the transpose \( M^t \) in the \( i \)-th row and \( j \)-th column is \( a_{ji} \). Prove that \( |\det M| = |\det M^t| \).
- Do Hefferon, Chapter 4, §I.2, #2.8 (just compute the absolute value of the determinant; use column operations, as discussed in class — what Hefferon calls "Gauss's method" is to use row operations), 2.20. Show that the determinant of \( \begin{pmatrix} y + z & x & 1 \\ x + z & y & 1 \\ x + y & z & 1 \end{pmatrix} \) is zero for all values of \( x, y, z \) (this matrix is the transpose of the one in Hefferon, Chapter 4, §I.2, #2.15). Read Samelson, Chapter 7, §4 (pp. 116–118).
- Read Samelson, Chapter 7, §§1, 4 (pp. 101–105, 116–118). Do Samelson, Chapter 8, §6, #1, 2, 8.
- For Wednesday, November 28:
  **Problem Set #3 is due!** Read Samelson, Chapter 8, §§7–8. Do Samelson, Chapter 8, §4, #1–3 (pp. 132–133).
- For Monday, November 26: Read Samelson, Chapter 8, §§3–4 (pp. 127–133). Complete the calculation of the formula for the \( n \)-th Fibonacci number from class. The complete calculation is in the lecture notes. Enjoy your vacation!
**Problem Set #3 will be due on Wednesday, November 28.**
- For Friday, November 16: Do Samelson, Chapter 6, §3, #3 (p. 96) using the method described in class on Wednesday. (The inverse of a matrix \( M \) is a matrix \( N \) such that \( NM = MN = I_n \) is the identity matrix.) Do Samelson, Chapter 8, §3, #4, 7 (p. 131). (An invariant subspace for \( T \) is a subspace \( V \) of \( U \) such that \( T(\vec v) \in V \) for every \( \vec v \in V \).)
- For Wednesday, November 14: As discussed in class, each column operation corresponds to a change of basis. Find the change of basis matrix \( [\operatorname{id}]^T_S \) for each of these operations. Make sure to do Monday's reading assignment.
- For Monday, November 12: Read Samelson, Chapter 6, §4 (pp. 97–100). Warning: Samelson uses the notation \( T^\beta_\gamma \) for the matrix we would have called \( [\operatorname{id}]^\gamma_\beta \) in class. Do Samelson, Chapter 6, §4, #2a, 2c, 3. Read Samelson, Chapter 8, §3 (pp. 127–131).
- For Friday, November 9: Prove that matrix multiplication is associative. It is *much easier* to do this by proving that composition of linear transformations is associative and using \( [\psi]^S_T [\varphi]^R_S = [\psi \circ \varphi]^R_T \) a few times than it is to do this using the formula for matrix multiplication!
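A numerical check of associativity (of course not a proof, and the `matmul` helper is my own, not course notation) takes only a few lines:

```python
def matmul(A, B):
    """Product of matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, -1], [5, 3]]

# Integer entries, so the two triple products agree exactly.
assert matmul(matmul(A, B), C) == matmul(A, matmul(B, C))
```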

  Let \( \varphi : \mathbb R^3 \to \mathbb R^3 \) be a linear transformation such that \( \varphi(\vec v) = \vec v \) for all \( \vec v \in \operatorname{span} \left\{ \begin{pmatrix} 2 \\ -1 \\ 3 \end{pmatrix}, \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix} \right\} \) and \( \varphi \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = \vec 0 \). Find a basis \( S \) of your choice and compute \( [\varphi]_S^S \).
- For Wednesday, November 7: Read Samelson, Chapter 6, §1 (pp. 86–88). Do Samelson, Chapter 6, §1, #1, 5, 6 (p. 89). If you did not get a chance to do Monday's problems, you can hand them in on Wednesday. Let \( R_\theta : \mathbb R^2 \to \mathbb R^2 \) be the linear transformation that rotates vectors counterclockwise around the origin by the angle \( \theta \). Let \( E \) be the standard basis of \( \mathbb R^2 \). Compute \( [ R_\theta \circ R_\phi ]_E^E \) as \( [ R_\theta ]_E^E [ R_\phi ]_E^E \) and use this to prove the angle sum formulas for \( \sin \) and \( \cos \).
- For Monday, November 5: Do Samelson, Chapter 8, §1, #1–2 and §2, #1–2 (p. 126). Let \( P_n = \{ a_0 + a_1 t + \cdots + a_n t^n \: \big| \: a_i \in \mathbb R \} \) be the vector space of polynomials with real coefficients and degree at most \( n \). Let \( S = \{ 1, t, t^2, \ldots, t^n \} \) and \( T = \{ 1, (t-1), (t-1)^2, \ldots, (t-1)^n \} \). Let \( D : P_n \to P_n \) be the linear transformation \( D(\vec f) = \frac{d \vec f}{dt} \). Compute the matrices \( [ D ]^S_S \), \( [ D ]^S_T \), \( [ D ]^T_S \), and \( [ D ]^T_T \).
- For Friday, November 2: **Problem Set #2 is due**, either in class or on Canvas. Read Samelson, Chapter 8, §2 (pp. 123–126).
- For Wednesday, October 31: **Problem Set #2 is due on Friday, November 2.** Read Samelson, Chapter 8, §1 (pp. 119–123). Let \( V = \{ \vec f : \mathbb R \to \mathbb R \} \) be the \( \mathbb R \)-vector space of all functions from \( \mathbb R \) to \( \mathbb R \). Define linear functionals \( \varphi_n : V \to \mathbb R \) and \( \psi_m : V \to \mathbb R \) by the formulas \( \varphi_n(\vec f) = \int_0^{2\pi} \cos(nx) \vec f(x) dx \) and \( \psi_m(\vec f) = \int_0^{2\pi} \sin(mx) \vec f(x) dx \). Complete the proof that we started in class, using \( \varphi_n \) and \( \psi_m \) to prove that the functions \( \{ 1, \cos(x), \cos(2x), \ldots, \sin(x), \sin(2x), \ldots \} \) are linearly independent.
- For Monday, October 29: **Problem Set #2 is due on Friday, November 2.** Do Samelson, Chapter 5, §2, #1–2.
- For Friday, October 26: Read Samelson, Chapter 5, §§2–3 (pp. 82–85). Do Samelson, Chapter 5, §3, #1 (p. 85). If you were unable to do the problems due on Wednesday, please hand them in on Friday.
  **Revisions for Quiz 2 are due**, either in class or on Canvas.
- For Wednesday, October 24: Find the basis of \( \operatorname{span} \left\{ \begin{pmatrix} 1 \\ 1 \\ -1 \end{pmatrix}, \begin{pmatrix} 2 \\ -1 \\ 3 \end{pmatrix} \right\} \) that is dual to the basis \( \{ \vec v_1, \vec v_3 \} \) of \( V^\ast \), where \( \vec v_i \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = x_i \). How do you expect your answer to compare to the basis dual to \( \{ \vec v_1, \vec v_2 \} \) that we computed in class? How does your answer compare to your expectation?

  Let \( P_n = \{ a_0 + a_1 t + \cdots + a_n t^n \: \big| \: a_i \in \mathbb R \} \), regarded as a vector space over \( \mathbb R \). For each \( i \), let \( \vec \varphi_i : P_n \to \mathbb R \) be the linear functional \( \vec \varphi_i(\vec f) = \vec f^{(i)}(7) \) (note, \( \vec f^{(i)} \) means the \( i \)-th derivative of \( \vec f \)). Prove that the linear functionals \( \{ \vec \varphi_0, \ldots, \vec \varphi_n \} \) form a basis of \( P_n^\ast \) by finding the basis of \( P_n \) that they are dual to.
- For Monday, October 22: Do Samelson, Chapter 3, §1, #1, 5 (p. 40). Let \( V = \{ a_0 + a_1 t + a_2 t^2 + a_3 t^3 \: \big| \: a_i \in \mathbb R \} \) be the vector space of polynomials with real coefficients and degree at most 3. Find the dual basis of the basis \( \{ 1, (t-1), (t-1)^2, (t-1)^3 \} \) of \( V \). Read Samelson, Chapter 5, §1 (pp. 79–82).
- For Friday, October 19: Read Halmos, §§8–9 (pp. 13–16). Read Hefferon, Chapter 2, §III.2 (pp. 121–125). Do Hefferon, Chapter 2, Section III, Exercises 2.16 and 2.17. Let \( \{ \vec u^1, \ldots, \vec u^n \} \) be a basis of an \( F \)-vector space \( V \). Define \( \varphi : F^n \to V \) by the formula \( \varphi \begin{pmatrix} a_1 \\ \vdots \\ a_n \end{pmatrix} = a_1 \vec u^1 + \cdots + a_n \vec u^n \). Prove that \( \varphi \) is linear.
- For Wednesday, October 17: Do Samelson, Chapter 3, §4, #1–2 (p. 46). Let \( V \) be the vector space of all functions from \( \mathbb R \) to \( \mathbb R \). Prove that the set \( \{ \cos(x), \sin(x), \cos(2x), \sin(2x) \} \) is linearly independent. Hint: for any \( t \in \mathbb R \), there is a linear functional \( \varphi_t : V \to \mathbb R \) given by the formula \( \varphi_t(f) = f(t) \). Can you prove more generally that \( \{ \cos(x), \sin(x), \cos(2x), \sin(2x), \cos(3x), \sin(3x), \ldots \} \) is linearly independent?
- For Wednesday, October 10: Suppose that \( V \) is a vector space. Can a subspace of \( V \) be a linearly independent subset? Do Hefferon, Chapter II, §II.1, #1.21, 1.23.
  **There is a quiz on Friday, October 12.**
- For Monday, October 8: do Samelson, Chapter 2, §5 (pp. 34–35), #3–6, 8. Read Samelson, Chapter 3, §4 (pp. 45–46). Don't forget that **there is a quiz on Friday, October 12**.
- For Friday, October 5: **The second in-class quiz will be on Friday, October 12.** Do Samelson, Chapter 2, §5 (pp. 34–35), #1, 2, 6. Prove that a minimal spanning collection of vectors in a vector space \( V \) is a basis of \( V \). (Recall that \( T \) is a minimal spanning collection for \( V \) if, for every \( \vec x \in T \), the set \( T - \{ \vec x \} \) does not span \( V \).) Read Samelson, Chapter 3, §1.
- For Wednesday, October 3: do Samelson, Chapter 2, §3 (p. 27), #1–5. Read Samelson, Chapter 2, §5 (pp. 31–34). If you misunderstood the directions for the quiz revisions (see the grading section), you may resubmit them on Wednesday.
- For Monday, October 1:
  **the first graded problem set is due!** Submit in class or on Canvas (PDF format only!). Read Halmos, §§5–7 (pp. 7–11).
- For Friday, September 28: read Samelson, Chapter 2, §3 (pp. 25–27) and Chapter 3, §1 (pp. 36–40). There is no homework assignment due on Friday, but **revisions of the first quiz are due**, and the first graded problem set is due on Monday, October 1.
- For Wednesday, September 26: **The first graded problem set is due on Monday, October 1. Revisions of the first quiz are due on Friday, September 28.** Do Halmos, §14, #2–3 (p. 22); prove that, for every linear functional \( \varphi \) on \( F^n \), there is exactly one row vector \( \vec a \in F_n \) such that \( \varphi = \varphi_{\vec a} \) (this proof was started in class).
- For Monday, September 24: read Samelson, Chapter 4, §1, up to Proposition 1.1 (pp. 47–49) and Samelson, Chapter 2, §2 (pp. 21–24); do Samelson, Chapter 4, §1, #1–3 (note Example 4 of Problem #1 subsumes the others, so you can use the first parts to develop intuition for Example 4, or you can do Example 4 alone and skip the others); prove that the kernel of a linear functional \( A : V \to F \) is a subspace of \( V \); do Samelson, Chapter 4, §2, #2; find a linear functional on \( \mathbb R^3 \) whose kernel is \( \operatorname{span} \left\{ \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix} \right\} \) (this problem is really Samelson, Chapter 4, §2, #4, rephrased); find a vector in \( \mathbb R^3 \) that spans \( \operatorname{ker}(A) \cap \operatorname{ker}(B) \), where \( A \begin{pmatrix} x \\ y \\ z \end{pmatrix} = x + y \) and \( B \begin{pmatrix} x \\ y \\ z \end{pmatrix} = x + z \). Do these last two questions seem similar?
- For Friday, September 21: prove that \( \{ f : \mathbb R \to \mathbb R \: \big| \: f(7) = 0 \} \) is a subspace of \( \{ f : \mathbb R \to \mathbb R \} \) (this proof was started in class); is \( \{ f : \mathbb R \to \mathbb R \: \big| \: f(7) = 1 \} \) a subspace of \( \{ f : \mathbb R \to \mathbb R \} \)? Do Samelson, Chapter 2, §1, #8 (p. 21); read Halmos, §13 (pp. 20–21).
- For Wednesday, September 19: Do Samelson, Chapter 2, §1, #5. Let \( S \) be any set and let \( F \) be any field. Let \( V \) be the set of functions from \( S \) to \( F \). Define \( \vec{0}(x) = 0 \) for all \( x \in S \); define \( (\vec{f} + \vec{g})(x) = \vec{f}(x) + \vec{g}(x) \) for all \( \vec{f}, \vec{g} \in V \) and all \( x \in S \); define \( (c.\vec{f})(x) = c(\vec{f}(x)) \) for all \( c \in F \), all \( \vec{f} \in V \), and all \( x \in S \). Prove that these definitions satisfy any two of the properties \( \mathrm{VS}_1 \)–\( \mathrm{VS}_9 \) of a vector space from p. 8 of Samelson (note that we proved \( \mathrm{VS}_4 \) in class). If you have not already done Chapter 1, §2, #1–2, please do them as well for Wednesday. In class, we will discuss Chapter 2, §1.
- For Monday, September 17: **the first quiz will be on Friday!** Read Samelson, Chapter 2, §1; you may also want to reread Chapter 1, §2; do Samelson, Chapter 1, §2, #1–2 (be careful on #2: what is the zero vector?); do Samelson, Chapter 1, §1, Exercises #1–2.
- For Wednesday, September 12: read Samelson, Chapter 1, §§1–2 (pp. 5–15) and/or Hefferon, Chapter 2, §I.1 (pp. 78–86); convert the system of linear equations from Hefferon, Chapter 1, §III.1, Exercise 1.8(d), into a question about whether a vector is in the span of another collection of vectors; solve this problem twice, once using row operations (for the equations version) and once using column operations (for the span version).
- For Monday, September 10: read Hefferon, Chapter 1, §I.1 (pp. 2–9); do Samelson, Chapter 2, §4, #1, parts (i) and (iv) on p. 31; do Hefferon, Chapter 1, §I.3, Exercise 3.21 on p. 34; for which values of \( a \) and \( b \) is the vector \( \left( \begin{array}{c} 1 \\ a \\ b \\ 1 \end{array} \right) \) contained in \( \operatorname{span} \left\{ \left( \begin{array}{c} 1 \\ -1 \\0 \\0 \end{array} \right), \left( \begin{array}{c} 0 \\ 0 \\ 1 \\ 1 \end{array} \right), \left( \begin{array}{c} 1 \\0 \\1 \\2 \end{array} \right) \right\} \)?
- For Friday, September 7: do Samelson, Chapter 2, §4, #1, parts (ii) and (iii) on p. 31; prove the theorem that we stated in class (see the lecture notes).
- For Wednesday, September 5: read Samelson, Chapter 2, §§3–4 (pp. 25–30); do Chapter 2, §1, #1–4, 7; you may wish to do #1 and #4 first, as these should be the easiest ones; the concept of *linear combination* has not been defined yet in class, so please refer back to p. 18 for the definition; the notation in Problem #7 might be unclear: Samelson means \( \{ X : x_2 = x_3 \} = \left\{ \left( \begin{array}{c} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{array} \right) \in \mathbb R^5 \: \Bigg| \: x_2 = x_3 \right\} \).
- For Friday, August 31: do Exercises 1.5 and 1.7(a) of Halmos; read §II.1 of Chapter 1 of Hefferon.
- For Wednesday, August 29: if you are not yet familiar with the complex numbers, or would like a review, please read the first 6 pages of this article by Balázs Szendrői and Richard Earl (written by Frances Kirwan); watch this video about complex numbers and the fundamental theorem of algebra; do Exercise 1.1 of Halmos.
- For Monday, August 27: read the course policies (this page); read Halmos, §1; complete the before class survey.
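The rotation-matrix exercise above (November 7) lends itself to a quick numerical sanity check: multiplying \( [R_\theta]_E^E \) and \( [R_\phi]_E^E \) should give \( [R_{\theta+\phi}]_E^E \), which is exactly the content of the angle-sum formulas. A sketch of mine, not a substitute for the proof:

```python
import math

def rot(theta):
    """Matrix of counterclockwise rotation by theta in the standard basis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matmul(A, B):
    """Product of 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta, phi = 0.7, 1.9
P = matmul(rot(theta), rot(phi))
Q = rot(theta + phi)
# Entry-by-entry agreement, up to floating-point roundoff.
assert all(abs(P[i][j] - Q[i][j]) < 1e-12 for i in range(2) for j in range(2))
```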

- Lecture 41 (December 12): repeated eigenvalues and non-real eigenvalues
- Lecture 40 (December 10): eigenvalues and eigenvectors
- Lecture 39 (December 7): summary of determinants
- Lecture 38 (December 5): permutations and determinants
- Lecture 37 (December 3): determinants of products
- Lecture 36 (November 30): determinants and volume
- Lecture 35 (November 28): kernels, injectivity, isomorphisms
- Lecture 34 (November 26): rank-nullity theorem
- Lecture 33 (November 16): the Fibonacci sequence
- Lecture 32 (November 14): row and column operations as change of basis; kernel and image
- Lecture 31 (November 12): column operations as change of basis
- Lecture 30 (November 9): change of basis
- Lecture 29 (November 7): finding a convenient basis
- Lecture 28 (November 5): composition of linear transformations
- Lecture 27 (November 2): coordinates of linear transformations
- Lecture 26 (October 31): linear transformations
- Lecture 25 (October 29): independence of the Fourier basis
- Lecture 24 (October 26): dual bases and row/column operations
- Lecture 23 (October 24): dual of the dual
- Lecture 22 (October 22): more dual bases
- Lecture 21 (October 19): dual bases
- Lecture 20 (October 17): dimension
- Lecture 19 (October 15): coordinates from bases
- Lecture 18 (October 10): quiz review
- Lecture 17 (October 8): quiz problems
- Lecture 16 (October 5): testing for dependence with row and column operations
- Lecture 15 (October 3): more about existence of bases and checking for linear dependence
- Lecture 14 (October 1): existence of bases
- Lecture 13 (September 28): bases
- Lecture 12 (September 26): some examples of linear functionals
- Lecture 11 (September 24): more about linear functionals
- Lecture 10 (September 21): linear functionals
- Lecture 9 (September 19): subspaces
- Lecture 8 (September 17): examples of vector spaces
- Lecture 7 (September 12): span membership and systems of equations; vector spaces
- Lecture 6 (September 10): row operations
- Lecture 5 (September 7): column echelon forms
- Lecture 4 (September 5): column operations
- Lecture 3 (August 31): vectors and spans
- Lecture 2 (August 29): complex numbers
- Lecture 1 (August 27): fields

A reference discussing most of the important aspects of LaTeX you will use.

A very quick introduction to LaTeX.

If you need to know the command for some symbol, try using Detexify.

Agnès Beaudry and Kate Stange list more LaTeX resources...

The Mathematics Academic Resource Center (MARC) is staffed with learning assistants and undergraduate and graduate students who can help you with concepts in this class. This is an excellent resource, and I encourage you to use it, but **remember to use this resource responsibly!**

**Do** ask for help from MARC with the daily, uncollected homework assignments in this class.

**Do** ask for clarification of ideas from the textbook and from discussion in class.

**Do not** ask for help from MARC with specific problems on the larger, collected (graded) assignments.

**Do not under any circumstances** submit a solution that was given to you by MARC as your own work.

The Office of Academic Affairs officially recommends a number of statements for course syllabi, all of which are fully supported in this class.

If you need special accommodation of any kind in this class, or are uncomfortable in the class for any reason, please contact me and I will do my best to remedy the situation. You may contact me in person or send me a comment anonymously using the form below.