
Math 3130: Introduction to Linear Algebra, Fall 2016


Lecture Topics


Date
What we discussed/How we spent our time
Aug 22
Syllabus. Text.

Section 1.1: Systems of linear equations. Augmented matrix of a linear system. Elementary row operations.

Aug 24
Section 1.2: Row reduction. (Reduced) row echelon form. Pivots, pivot positions and pivot columns. Free and pivot variables. Parametric form of solution set.
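For later reference, here is a minimal sketch of row reduction in Python using SymPy (an outside tool, not part of the course materials); rref returns the reduced row echelon form together with the indices of the pivot columns, from which the parametric form of the solution set can be read off.

    from sympy import Matrix

    # Augmented matrix of the system  x1 + 2*x2 + x3 = 3,  2*x1 + 4*x2 + 3*x3 = 7
    M = Matrix([[1, 2, 1, 3],
                [2, 4, 3, 7]])
    R, pivot_cols = M.rref()   # reduced row echelon form, pivot column indices
    # R = [[1, 2, 0, 2], [0, 0, 1, 1]], pivot_cols = (0, 2)
    # So x1, x3 are pivot variables and x2 is free: x1 = 2 - 2*x2, x3 = 1.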
Aug 26
We reviewed prior reading and lectures by working on these questions. Then we defined addition and multiplication of matrices and identified some laws these operations satisfy. ($M_{m\times n}(\mathbb R)$ is a commutative group under $+, -, 0$. Matrix multiplication is associative and distributes over addition when the matrices have the appropriate dimensions.) We saw that matrix multiplication is not commutative.
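A quick numerical illustration of the last point, as a Python/NumPy sketch (not part of the course materials):

    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])
    B = np.array([[0, 1],
                  [1, 0]])
    print(A @ B)   # [[2 1] [4 3]]  (columns of A swapped)
    print(B @ A)   # [[3 4] [1 2]]  (rows of A swapped)
    # A @ B and B @ A differ, so matrix multiplication is not commutative.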
Aug 29
We discussed how to visualize vector arithmetic in $\mathbb R^n$. We defined linear combination and span. We discussed the possibilities for $\textrm{Span}\; X$ when $X$ is a set of 0, 1, 2 or 3 vectors. Quiz 1.
Aug 31
We reviewed the previous lecture and worked out an exercise of the form: Determine if ${\bf b}$ belongs to $\textrm{Span} \{{\bf v}_1,\ldots,{\bf v}_k\}$. This led to the realization that ${\bf b}\in\textrm{Span} \{{\bf v}_1,\ldots,{\bf v}_k\}$ if and only if $A{\bf x}={\bf b}$ is consistent when $A = \left[{\bf v}_1\;\cdots\;{\bf v}_k\right]$. We discussed the geometric interpretation of this. Then we discussed $A{\bf x}={\bf b}$ as a mapping problem. Finally we stated and explained a theorem characterizing those matrices $A$ with the property that $A{\bf x}={\bf b}$ is consistent for all ${\bf b}$.
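For illustration (a small example, not one worked in class): to decide whether ${\bf b}=\left[\begin{array}{c}1\\2\end{array}\right]$ lies in $\textrm{Span}\left\{\left[\begin{array}{c}1\\1\end{array}\right],\left[\begin{array}{c}2\\2\end{array}\right]\right\}$, row reduce the augmented matrix: $\left[\begin{array}{cc|c}1&2&1\\1&2&2\end{array}\right]\longrightarrow\left[\begin{array}{cc|c}1&2&1\\0&0&1\end{array}\right]$. There is a pivot in the augmented column, so $A{\bf x}={\bf b}$ is inconsistent and ${\bf b}$ is not in the span.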
Sep 2
We started by working on practice problems. Then we discussed the structure of the preimages of vectors under the mapping ${\bf x}\mapsto A{\bf x}$. This led to the concept of a homogeneous system. Note: next Monday's quiz has been moved to Wednesday.
Sep 7
We discussed the decomposition of a function. Then we took this quiz.
Sep 9
We worked out the answers to this worksheet. Then we discussed the general solution of a homogeneous/nonhomogeneous linear system.
Sep 12
We discussed (trivial versus nontrivial) dependence relations, and linear dependence/independence. We gave an algorithm for determining if a set $Y$ of vectors is linearly independent. Quiz.
Sep 14
We defined linear transformation, and showed that matrix mappings $T\colon \mathbb R^n\to \mathbb R^m\colon {\bf v}\mapsto A{\bf v}$ are linear. We gave an example of a nonlinear transformation. We defined the standard basis $({\bf e}_1,\ldots,{\bf e}_n)$, and explained how to find the standard matrix for a linear transformation. We found the standard matrix for `differentiation of quadratic polynomials', expressed as the transformation $D\left(\left[\begin{array}{c}a\\b\\c\end{array}\right]\right) = \left[\begin{array}{c}2a\\b\end{array}\right]$.
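In general the $j$-th column of the standard matrix is $T({\bf e}_j)$. For the differentiation example above this gives $[D]=\left[\begin{array}{ccc}2&0&0\\0&1&0\end{array}\right]$, and indeed $\left[\begin{array}{ccc}2&0&0\\0&1&0\end{array}\right]\left[\begin{array}{c}a\\b\\c\end{array}\right]=\left[\begin{array}{c}2a\\b\end{array}\right]$.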

I announced that a guest, Professor Agnes Szendrei, will lecture next week. Everything will go as usual (e.g., quiz on Monday, HW due Wednesday).

Sep 16
We worked out the answers to this worksheet. Then we discussed how to find matrices for rotations about the origin, reflections across a line through the origin, and dilations centered at the origin in $2$-dimensional space.
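For reference, the standard formulas (not copied from the worksheet) are: rotation by angle $\theta$ has matrix $\left[\begin{array}{cc}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{array}\right]$, reflection across the line through the origin at angle $\theta$ has matrix $\left[\begin{array}{cc}\cos 2\theta&\sin 2\theta\\\sin 2\theta&-\cos 2\theta\end{array}\right]$, and dilation by a factor $r$ has matrix $rI_2$.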

Sep 19
Applications: network flow and nutritional diet. Quiz.
Sep 21
Handout. Inverse of a matrix: definition, basic properties and algorithm for finding the inverse.
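A small worked instance of the algorithm (not taken from the handout): row reduce $[A\mid I]$ to $[I\mid A^{-1}]$. For $A=\left[\begin{array}{cc}1&2\\3&4\end{array}\right]$, $\left[\begin{array}{cc|cc}1&2&1&0\\3&4&0&1\end{array}\right]\longrightarrow\left[\begin{array}{cc|cc}1&2&1&0\\0&-2&-3&1\end{array}\right]\longrightarrow\left[\begin{array}{cc|cc}1&0&-2&1\\0&1&3/2&-1/2\end{array}\right]$, so $A^{-1}=\left[\begin{array}{cc}-2&1\\3/2&-1/2\end{array}\right]$.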
Sep 23
Various characterizations of invertible matrices. Inverse of a linear transformation.
Sep 26
Review of left, right and 2-sided invertibility. We started discussing subspaces, bases and dimension. Quiz.
Sep 28
We discussed the column space algorithm and the null space algorithm.
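Both algorithms are easy to check against SymPy (a Python sketch using an outside tool, not part of the course):

    from sympy import Matrix

    A = Matrix([[1, 2, 3],
                [2, 4, 6],
                [1, 1, 1]])
    # Column space algorithm: the pivot columns of A form a basis for Col(A).
    col_basis = A.columnspace()   # basis vectors (1,2,1) and (2,4,1)
    # Null space algorithm: write the solutions of A x = 0 in parametric vector form.
    nul_basis = A.nullspace()     # basis vector (1,-2,1)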
Sep 30
We discussed homogeneous coordinates and affine transformations, said a few words about partitioned matrices, and explained how to find the $3\times 3$-matrix representing (in homogeneous coordinates) a given affine rotation of the plane. Practice problems!
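In block form (a summary, with notation that may differ from the lecture's): the affine map ${\bf x}\mapsto A{\bf x}+{\bf b}$ of the plane is represented in homogeneous coordinates by the partitioned matrix $\left[\begin{array}{cc}A&{\bf b}\\{\bf 0}^T&1\end{array}\right]$, since $\left[\begin{array}{cc}A&{\bf b}\\{\bf 0}^T&1\end{array}\right]\left[\begin{array}{c}{\bf x}\\1\end{array}\right]=\left[\begin{array}{c}A{\bf x}+{\bf b}\\1\end{array}\right]$; an affine rotation about a point ${\bf p}$ is then the product of a translation by ${\bf p}$, a rotation about the origin, and a translation by $-{\bf p}$.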
Oct 3
We worked out an example showing how to find the matrix that reflects a $2$-dimensional vector in homogeneous coordinates through the line of slope $-1$ that passes through the point $(1,1)$.
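One way to organize that computation (possibly not the factorization used in class) is to conjugate the reflection through the line $y=-x$ (the parallel line through the origin) by the translation taking the origin to $(1,1)$: $\left[\begin{array}{ccc}1&0&1\\0&1&1\\0&0&1\end{array}\right]\left[\begin{array}{ccc}0&-1&0\\-1&0&0\\0&0&1\end{array}\right]\left[\begin{array}{ccc}1&0&-1\\0&1&-1\\0&0&1\end{array}\right]=\left[\begin{array}{ccc}0&-1&2\\-1&0&2\\0&0&1\end{array}\right]$. As a check, $(1,1)$ and $(2,0)$ (which lie on the line $x+y=2$) are fixed, while $(0,0)\mapsto(2,2)$.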

We next began to discuss length, area and volume. We derived the area formula
$A\left( \left[\begin{array}{c}a\\c\end{array}\right], \left[\begin{array}{c}b\\d\end{array}\right]\right)=ad-bc$. Quiz.

Oct 5
We worked on practice problems. As we discussed the solutions, we observed that any two bases of a subspace have the same size.

We then discussed length, area, and volume in $\mathbb R^1$, $\mathbb R^2$, and $\mathbb R^n$, $n\geq 3$. We developed a definition of signed volume in $\mathbb R^n$: a multilinear alternating function normalized to $1$ on the unit hypercube.

Oct 7
We discussed the determinant.
Oct 10
We continued discussing the determinant. I circulated a review sheet.
Oct 12
We reviewed for the midterm. I circulated the quiz that I forgot to give on Monday.
Oct 14
Midterm.
Oct 17
Cramer's rule. The Vandermonde determinant.
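For reference, the standard statements: if $A$ is invertible, the solution of $A{\bf x}={\bf b}$ has entries $x_i=\det A_i({\bf b})/\det A$, where $A_i({\bf b})$ is $A$ with its $i$-th column replaced by ${\bf b}$; and the Vandermonde determinant is $\det\left[\begin{array}{cccc}1&x_1&\cdots&x_1^{n-1}\\1&x_2&\cdots&x_2^{n-1}\\\vdots&\vdots&&\vdots\\1&x_n&\cdots&x_n^{n-1}\end{array}\right]=\prod_{1\leq i<j\leq n}(x_j-x_i)$.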
Oct 19
Abstract vector spaces: definition and examples.
Oct 21
We worked on practice problems. We briefly discussed last week's midterm. Then we proved that if $S$ is a finite spanning set for a vector space $\mathbb V$, then $S$ contains a basis for $\mathbb V$. We defined the $\mathcal B$-coordinate vector, $\left[{\mathbf u}\right]_{\mathcal B}$, for a vector ${\mathbf u}\in\mathbb V$ with respect to an ordered basis $\mathcal B$.
Oct 24
We reviewed the meaning of isomorphism (= invertible linear transformation whose inverse is also linear). We explained why a linear transformation that is 1-1 and onto is an isomorphism. We proved that a finitely generated real vector space is isomorphic to $\mathbb R^n$ for some $n$. Quiz.
Oct 26
We discussed matrices for linear transformations, in particular change of basis matrices ${}_{\mathcal C}[I]_{\mathcal B}$.
Oct 28
Practice problems about coordinates. We discussed why $[{\bf v}]_{\mathcal B}$ may be calculated by solving $[{\mathcal B}][{\bf x}] = [{\bf v}]$, or may be computed as $[{\mathcal B}]^{-1}[{\bf v}]$, and why ${}_{\mathcal C}[I]_{\mathcal B}$ may be computed as $[{\mathcal C}]^{-1}[{\mathcal B}]$. We defined $\mathbb V+\mathbb W$ and $\mathbb V\cap\mathbb W$, and explained how to compute bases for these subspaces given bases for $\mathbb V$ and $\mathbb W$.
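A numerical check of these formulas (a Python/NumPy sketch; the bases below are made up for illustration):

    import numpy as np

    # Columns of B and C are two ordered bases of R^2.
    B = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
    C = np.array([[2.0, 0.0],
                  [0.0, 1.0]])
    v = np.array([3.0, 2.0])

    coords_B = np.linalg.solve(B, v)   # [v]_B, by solving [B][x] = [v]
    P = np.linalg.inv(C) @ B           # change of basis matrix C[I]B = [C]^{-1}[B]
    coords_C = P @ coords_B            # equals [v]_C = np.linalg.solve(C, v)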
Oct 31
We worked on practice problems and took a quiz.
Nov 2
We discussed e-values and e-vectors. (Read pages 265-276.)
Nov 4
We worked on practice problems, then discussed how to compute a particular coefficient of $\chi_A(\lambda) = \det(\lambda I-A)$. Namely, the coefficient of $\lambda^{n-i}$ is $(-1)^i s_i$, where $s_i$ is the sum of the $i\times i$ principal minors of $A$. The most important special cases are $s_1=\textrm{tr}(A)$ and $s_n=\det(A)$.
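For example, in the $2\times 2$ and $3\times 3$ cases this reads $\chi_A(\lambda)=\lambda^2-\textrm{tr}(A)\lambda+\det(A)$ and $\chi_A(\lambda)=\lambda^3-\textrm{tr}(A)\lambda^2+s_2\lambda-\det(A)$, where $s_2$ is the sum of the three $2\times 2$ principal minors.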
Nov 7
We discussed the definitions of algebraic multiplicity and geometric multiplicity of an e-value, and computed those numbers in $2$ examples. Quiz.
Nov 9
We discussed the direct sum ${\mathbb U}\oplus{\mathbb W}$ of two subspaces of $\mathbb V$. We proved that a set of e-vectors for distinct e-values is independent. We derived that if $A\in M_{n\times n}(\mathbb R)$ has $n$ distinct e-values, then $\mathbb R^n$ is a direct sum of the 1-dimensional e-spaces of $A$. We proved that if $\mathcal B$ is a basis of e-vectors for $A$, then ${}_{\mathcal B}[A]_{\mathcal B}$ is a diagonal form for $A$ which has the e-values of $A$ on the diagonal.
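A numerical illustration (a Python/NumPy sketch with a made-up matrix):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])      # distinct e-values 2 and 3
    evals, P = np.linalg.eig(A)     # columns of P are e-vectors of A
    D = np.linalg.inv(P) @ A @ P    # matrix of A with respect to the e-vector basis
    # D is (up to rounding) diagonal, with the e-values 2 and 3 on the diagonal.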
Nov 11
We worked on practice problems. Then we discussed why the characteristic polynomial of a matrix is unaffected by a change of basis, and why the algebraic multiplicity of an e-value is greater than or equal to its geometric multiplicity.
Nov 14
We discussed complex numbers, complex e-values, and complex e-vectors. We discussed diagonalization over complex numbers and block diagonalization over the real numbers. Quiz.
Nov 16
We worked out Extra Problem 3 on the HW. Then we discussed the Cayley-Hamilton Theorem, defined minimal polynomial, and asserted that a matrix $A$ is diagonalizable iff $\textrm{min}_A(t)$ factors into distinct linear factors.
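A small illustration of the criterion (not an example from class): both $I_2$ and $J=\left[\begin{array}{cc}1&1\\0&1\end{array}\right]$ have characteristic polynomial $(t-1)^2$, but $\textrm{min}_{I_2}(t)=t-1$ (distinct linear factors, so diagonalizable) while $\textrm{min}_{J}(t)=(t-1)^2$ (a repeated factor, so not diagonalizable).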
Nov 18
We worked on practice problems about linear transformations of order $2$. Then we discussed an application of diagonalization to the solution of ordinary differential equations. Namely, we showed how to reduce a single $n$-th order, homogeneous, ODE with constant coefficients to a system of $n$ first-order equations, then (when diagonalizable) to a system of $n$ equations of the form $z'=\lambda z$. We showed that an equation like this has solution $z=C e^{\lambda t}$.
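A small example in the same spirit (not necessarily the one done in class): $y''-3y'+2y=0$ becomes, with ${\bf x}=\left[\begin{array}{c}y\\y'\end{array}\right]$, the system ${\bf x}'=\left[\begin{array}{cc}0&1\\-2&3\end{array}\right]{\bf x}$, whose characteristic polynomial is $\lambda^2-3\lambda+2=(\lambda-1)(\lambda-2)$; after diagonalizing, the decoupled equations $z'=z$ and $z'=2z$ give the general solution $y=C_1e^{t}+C_2e^{2t}$.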
Nov 28
We worked on practice problems and took a quiz.
Nov 30
We discussed dot product, length, angle, orthogonality, and orthogonal complements. We explained why ${\bf u}\bullet{\bf v}=\|{\bf u}\|\cdot\|{\bf v}\|\cdot\cos(\theta)$. We proved that if $X\subseteq \mathbb R^n$, then $X^{\perp}$ is a subspace.
Dec 2
We described an algorithm to compute $X^{\perp}$ for any subset $X\subseteq \mathbb R^n$.
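In brief: if $A$ is the matrix whose rows are the vectors of $X$ (or of a finite spanning set for $\textrm{Span}\,X$), then $X^{\perp}=\textrm{Nul}(A)$, since ${\bf v}$ is orthogonal to every row of $A$ exactly when $A{\bf v}={\bf 0}$; so the null space algorithm produces a basis for $X^{\perp}$.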
Dec 5
We proved that if $U\leq \mathbb R^n$ is a subspace, then $\mathbb R^n = U\oplus U^{\perp}$ and $U^{\perp\perp} = U$. We introduced the normal equations, $A^TA{\bf x}=A^T{\bf b}$, and worked on this worksheet.
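A numerical sketch of the normal equations (Python/NumPy, with made-up data; lstsq is used only as a cross-check):

    import numpy as np

    # Overdetermined system: fit a line y = c0 + c1*t to three data points.
    t = np.array([0.0, 1.0, 2.0])
    b = np.array([1.0, 2.0, 2.0])
    A = np.column_stack([np.ones_like(t), t])        # columns: 1, t

    x_normal = np.linalg.solve(A.T @ A, A.T @ b)     # solve A^T A x = A^T b
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)  # least-squares solution
    # Both give approximately [1.167, 0.5].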
Dec 7
We discussed the method of least squares (Section 6.5). We worked on this worksheet. I circulated this review sheet.
Dec 9
We reviewed for the final exam.