Topics

In the first few weeks of the course, we will cover basic material that will form the foundation for the rest of the semester. It is important to keep up and not fall behind during these first few weeks.

**Vectors, Matrices & Linear Equations**

We will begin with the notion of a vector, and describe algebraic and geometric properties of spaces of vectors. This material will be the foundation of the course, and will lead us to develop linear equations. After seeing what linear equations can do for us, we will see how matrices are used to solve systems of linear equations. We'll see that solutions are usually not unique, but rather are members of an infinite family of solutions. This infinite family inherits interesting geometric properties from the larger space in which it lives, and we'll study how to describe and build these so-called *subspaces*. Finally, we'll learn how this abstract nonsense is the perfect way to describe how traffic flows around the Marquette Round-About.
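To give a taste of the computation involved, a tiny two-equation system can be solved numerically. This sketch uses Python with NumPy (a choice made purely for illustration; the course may use different tools, and the particular system here is made up):

```python
import numpy as np

# The system  x + 2y = 1,  3x + 5y = 2  written as A @ x = b.
A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
b = np.array([1.0, 2.0])

# The solution is unique here because det(A) = -1 is nonzero.
x = np.linalg.solve(A, b)
print(x)  # [-1.  1.]
```

When the coefficient matrix is singular, the system instead has no solution or an infinite family of them, which is exactly the situation the subspace language describes.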

**Matrices & Linear Transformations**

After learning the basic arithmetic of matrices, we'll see how some special matrices are used in cool applications such as archaeology and cryptography. The fun will continue when we learn how to make shapes move, grow and deform using matrix transformations. We'll tie all of this together with applications to economics, population dynamics, sociology, fractals and computer graphics.
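As one small example of making shapes move, rotating a point in the plane is just multiplication by a matrix. A sketch in Python with NumPy (the angle and point are arbitrary, chosen for illustration):

```python
import numpy as np

theta = np.pi / 2  # rotate 90 degrees counterclockwise

# The standard 2-D rotation matrix.
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

point = np.array([1.0, 0.0])
rotated = R @ point  # approximately [0, 1]
```

Applying the same matrix to every vertex of a shape rotates the whole shape at once, which is the idea behind matrix transformations in computer graphics.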

**Determinants & Eigenvectors**

We'll associate a special number, called the determinant, to a matrix and then be amazed at how much information is contained in this one number. Concepts such as area and invertibility are just a few of the nuggets of knowledge hidden inside a matrix's determinant. We'll learn how another special number, an eigenvalue, determines the amount of stretching involved in a transformation. A special vector associated with this eigenvalue is called an eigenvector, and we'll see how Google uses an eigenvector to sort web pages in a search.
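For a concrete glimpse, here is a small sketch in Python with NumPy showing the determinant and eigenvalues of a made-up matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# One of the nuggets: a nonzero determinant means A is invertible.
det = np.linalg.det(A)  # close to 3.0

# Eigenvalues are the stretch factors; eigenvectors are the
# directions that get stretched without turning.
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))  # [1. 3.]
```

The direction `[1, 1]` is stretched by a factor of 3, while `[1, -1]` is left at its original length.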

**Vector Spaces**

We'll learn to see images and videos as vectors and discover some more general types of vector spaces, and explore ideas such as linear combinations, independence, bases and rank. We'll use special *orthonormal* bases to project images onto one another and learn how to transform one basis into another.
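As an illustration of projecting with an orthonormal basis, here is a small sketch in Python with NumPy (the basis and vector are invented for the example):

```python
import numpy as np

# An orthonormal basis for a plane inside R^3.
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
u2 = np.array([0.0, 0.0, 1.0])

# Projecting v onto the plane: add up its components
# along each basis vector.
v = np.array([1.0, 2.0, 3.0])
proj = (v @ u1) * u1 + (v @ u2) * u2
print(proj)  # [1.5 1.5 3. ]
```

Because the basis is orthonormal, each coefficient is just a dot product; this same recipe is what projects one image onto another once images are treated as vectors.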

**Coordinate Representations**

We'll learn how to think of a vector as a point in space, like a GPS coordinate. We can create objects in 3-D, and use matrix transformations to move them, i.e., to change their coordinates. We can do so without changing the shape of the object (an *isometry*), or we can deform it in a variety of ways. A discussion of diagonalization of matrices will allow us to separate transformations into complementary mini-transformations. In this way, we can view any transformation as a combination of rotations, translations and dilations.
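The idea of splitting a transformation into simpler pieces can be sketched numerically. Assuming NumPy and a made-up diagonalizable matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Diagonalization: A = P D P^(-1), where D holds pure stretches
# (the mini-transformations) and P changes coordinates.
vals, P = np.linalg.eig(A)
D = np.diag(vals)

reconstructed = P @ D @ np.linalg.inv(P)  # equals A
print(np.sort(vals))  # [2. 5.]
```

In the coordinates given by `P`, the transformation is nothing but stretching by 2 in one direction and 5 in the other.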

**The Singular Value Decomposition\***

We'll build on the ideas of diagonalization to construct an important decomposition widely used in many industries. This will allow us to peer into and through a matrix to learn things such as its fundamental subspaces, the directions of maximal and minimal energy of a point cloud, and more.
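A rough sketch of what the decomposition exposes, in Python with NumPy (the point cloud is fabricated for the example):

```python
import numpy as np

# Each column is a 2-D point in a tiny "point cloud".
X = np.array([[3.0, 0.0, 1.0],
              [4.0, 5.0, 1.0]])

U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Singular values measure energy along orthogonal directions;
# the columns of U are those directions of maximal/minimal energy.
print(S)

# The three factors multiply back to the original matrix.
reconstructed = U @ np.diag(S) @ Vt
```

Dropping the smallest singular values gives the best low-rank approximation of the data, which is the workhorse behind many industrial applications.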

**Applications to Video Processing\***

We'll apply the SVD and most of what we've learned to tackle cool image processing problems like image segmentation and data clustering.

**Coverage of topics marked with an \* will depend on time.**