Portland Community College | Portland, Oregon

CCOG for MTH 261 archive revision 201403

You are viewing an old version of the CCOG.

Effective Term:
Summer 2014 through Winter 2015
Course Number:
MTH 261
Course Title:
Applied Linear Algebra I
Credit Hours:
5
Lecture Hours:
50
Lecture/Lab Hours:
0
Lab Hours:
0

Course Description

Surveys linear algebra with some applications. Includes linear systems, vectors, and vector spaces, including eigenspaces. Graphing calculator required. TI-89 Titanium or Casio Classpad 330 recommended. Audit available.

Addendum to Course Description

This course is designed to familiarize students with the elementary concepts of linear algebra. The emphasis of the course is applications of linear algebra; abstract theory is kept to a minimum. Upon completion of the course, students will be familiar with the vocabulary of linear algebra and will have been exposed to numerous applications.

Intended Outcomes for the course

Upon completion of this course, the learner should be able to do the following:

• Analyze real-world scenarios to recognize when vectors, matrices, or linear systems are appropriate, formulate problems about the scenarios, creatively model these scenarios (using technology, if appropriate) in order to solve the problems using multiple approaches, judge if the results are reasonable, and then interpret and clearly communicate the results.
• Appreciate linear algebra concepts that are encountered in the real world, understand the underlying mathematics involved, and be able to communicate that mathematics to help another person gain insight into the situation.
• Work with vectors, matrices, or linear systems symbolically and geometrically in various situations, and use correct mathematical terminology, notation, and symbolic processes in order to engage in work, study, and conversation on topics involving vectors, matrices, or systems of linear equations with colleagues in the fields of mathematics, science, or engineering.

Quantitative Reasoning

Students completing an associate degree at Portland Community College will be able to analyze questions or problems that impact the community and/or environment using quantitative information.

Course Activities and Design

In-class activities are primarily lecture/discussion and problem-solving sessions. The students may use appropriate technology to investigate and reinforce concepts presented in class. A graphing calculator is required.

Outcome Assessment Strategies

1. Demonstrate an understanding of various types of linear systems and their applications to real-world problems, as assessed by:

• At least two in-class proctored exams of one or more hours, one of which is a comprehensive final exam.

• Proctored exams should be worth at least 50% of the overall grade.

In addition, at least two of the following measures:

• Take-home exam(s)

• Quizzes

• Computer lab assignments

• Homework

2. Demonstrate the ability to communicate with colleagues on the topics of linear algebra through:

• At least one group or individual project with a written report and/or an oral in-class presentation, and at least one of the following:

• Participation in class discussions

• In-class group activities

• Attendance

Course Content (Themes, Concepts, Issues and Skills)

SKILLS

1. Context Specific Skills

• Students will learn to describe linear structures verbally, from the geometric, symbolic, and numeric points of view.

• Students will learn to recognize underlying vector space structures in a variety of abstract and applied contexts.

• Students will learn to apply the terminology and notation of linear algebra correctly and appropriately in a variety of abstract and applied contexts.

• Students will learn to reduce matrices to echelon and reduced row echelon (rref) form, compute the inverse, and perform a variety of (arithmetic) matrix operations by hand for at least 3x3 matrices.

• Students will learn to construct linear models for a variety of applied problems.

• Students will acquire proficiency in the use of linear transformations and matrix algebra in solving a variety of abstract and applied problems.

2.   Learning Process Skills

• Classroom activities will include lecture/discussion and group work.

• Students will communicate their results in oral and written form.

• Students will apply concepts to real-world problems.

• Calculators and/or computers will be used by the students for tasks such as row reduction, diagonalization, and special factorization of matrices, as well as for solving systems of linear equations.

THEMES, CONCEPTS, and ISSUES

1.0 MATRIX ALGEBRA

1.1 Systems of Linear Equations.

1.1.1      Write the augmented matrix of a system of m linear equations in n variables.

1.1.2      Identify matrices that are in row echelon form and reduced row echelon form.

1.1.3      Transform an mxn matrix to reduced row echelon form using elementary row operations and Gauss-Jordan elimination.

1.1.4      Solve a system of m linear equations in n variables by transforming its augmented matrix to reduced row echelon form (see the computational sketch at the end of this subsection).

1.1.5      Identify when a linear system is consistent and when it is inconsistent and interpret the solution geometrically.

1.1.6      For consistent systems, identify when the system has one unique solution and when it has infinitely many solutions.

1.1.7      When a system has infinitely many solutions, express the solution set by solving for the pivot variables in terms of the free variables. Write the solution set in parametric form and in vector form.

1.1.8      Define a homogeneous linear system and recognize that homogeneous systems are always consistent. In particular, a non-trivial solution exists if there are more variables than equations.

1.1.9      Construct (back-engineer) specific linear systems of m equations and n unknowns to yield predetermined solutions in terms of geometries. For example, the solution to a 5 x 4 system should take on the form of a line off of the origin in R4.

1.1.10   When arriving at a parametric (vector) form of a solution for a linear system involving both free and pivot variables, demonstrate the ability to express this solution form using different combinations of free and pivot variables, other than those immediately returned from an augmented matrix.
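
A minimal computational sketch of 1.1.3, 1.1.4, and 1.1.7 (not part of the official outline), assuming Python with SymPy is available; the augmented matrix below is an arbitrary illustration of a consistent system with one free variable:

    import sympy as sp

    # Augmented matrix of an illustrative system with one free variable
    A = sp.Matrix([[1, 2, -1, 3],
                   [2, 4,  0, 8],
                   [1, 2,  1, 5]])

    R, pivots = A.rref()                  # reduced row echelon form and pivot columns (1.1.3)
    print(R, pivots)                      # pivot columns identify the pivot variables

    # Parametric (vector) form of the solution set (1.1.7)
    x1, x2, x3 = sp.symbols('x1 x2 x3')
    print(sp.linsolve(A, (x1, x2, x3)))   # x2 appears as the free parameter

The same row reduction can be carried out on the required graphing calculator; the library is used here only because it reports the pivot columns explicitly.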

1.2 Matrix Operations.

1.2.1      Define the operations of addition and scalar multiplication of mxn matrices. Study the algebraic properties of mxn matrices under these operations.

1.2.2      Define the transpose of an mxn matrix and demonstrate the ability to use transpose properties (sums, scalar products, products, transpose of a transpose, etc).

1.2.3      Define the product of an mxn matrix and an nxp matrix. Study the algebraic properties of matrix multiplication.

1.2.4      Define the nxn identity matrix I. Study the identity property of matrix multiplication.

1.2.5      Define and study the properties of invertible matrices. Identify when a square matrix is not invertible and compute the inverse of an invertible matrix.

1.2.6      Write a system of linear equations as a matrix equation. If the coefficient matrix of the system is invertible, solve the system using the inverse of the coefficient matrix.

1.2.7      Define the determinant of a square matrix and compute determinants by cofactor expansion across any row or down any column of a square matrix. Use determinants to determine the invertibility of a square matrix (illustrated in the sketch at the end of this subsection).
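
A minimal sketch of 1.2.6 and 1.2.7 (not part of the official outline), assuming Python with NumPy; the matrix A and vector b are arbitrary examples:

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    b = np.array([3.0, 5.0, 3.0])

    print(np.linalg.det(A))      # nonzero determinant, so A is invertible (1.2.7)
    x = np.linalg.inv(A) @ b     # solve the matrix equation A x = b using the inverse (1.2.6)
    print(x, A @ x - b)          # residual is (numerically) zero

In numerical practice np.linalg.solve(A, b) is preferred to forming the inverse explicitly, but the inverse makes the connection to 1.2.5 and 1.2.6 visible.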

2.0 VECTOR SPACES

 

2.1 n-Dimensional Euclidean Space, Rn.

2.1.1      Define addition and scalar multiplication of vectors in Rn. Study the algebraic properties of vectors under vector addition and scalar multiplication.

2.1.2      Define a linear combination of vectors in Rn. Solve a linear system to determine if a particular vector is a linear combination of a given set of vectors (see the sketch at the end of this subsection).

2.1.3      Define the linear dependence or independence of a subset of vectors in Rn. Determine the linear dependence or independence of a given set of vectors by solving a homogeneous linear system.

2.1.4      Recognize that a square matrix is invertible if and only if its column vectors are linearly independent.
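
A minimal sketch of 2.1.2 and 2.1.3 (not part of the official outline), assuming Python with NumPy; the vectors are arbitrary examples:

    import numpy as np

    v1 = np.array([1.0, 0.0, 2.0])
    v2 = np.array([0.0, 1.0, 1.0])
    w  = np.array([2.0, 3.0, 7.0])

    # 2.1.2: w is a linear combination of v1, v2 iff the system [v1 v2] c = w is consistent
    V = np.column_stack([v1, v2])
    c, residual, rank, _ = np.linalg.lstsq(V, w, rcond=None)
    print(c, residual)                    # residual ~ 0, so w = 2 v1 + 3 v2

    # 2.1.3: a set of vectors is independent iff the homogeneous system has only the
    # trivial solution, i.e. the rank equals the number of vectors
    v3 = np.array([1.0, 1.0, 3.0])        # equals v1 + v2, so the set is dependent
    print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))   # prints 2, not 3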

2.2 General Vector Spaces.

2.2.1      Define a vector space in terms of an arbitrary set with defined operations of vector addition and scalar multiplication that satisfy the vector space axioms.

2.2.2      Introduce different examples of vector spaces including matrix spaces and function spaces, under different definitions of vector addition and scalar multiplication.

2.2.3      Define a subspace of a vector space. Determine whether a given subset of a vector space is a subspace.

2.2.4      Define the span of a set of vectors from a vector space and recognize that the span of a subset of vectors is a subspace of the parent vector space.

2.2.5      Define the nullspace and the columnspace of an mxn matrix A and study properties and the significance of these subspaces (see the sketch at the end of this subsection).

2.2.6      Define a basis of a vector space. Determine whether a given subset of vectors forms a basis for a vector space.

2.2.7      Define the dimension of a vector space. Study the relationships between linearly independent sets, spanning sets, bases, and dimension. Extract a basis from a set of vectors that spans a vector space. Extend a linearly independent subset of vectors to a basis for the vector space.

2.2.8      Define an ordered basis for a vector space. Define the coordinates of a particular vector with respect to a given ordered basis. Compute the coordinates of a vector with respect to a given ordered basis. Use a transition matrix to convert the coordinates for a vector with respect to one ordered basis to its coordinates with respect to a second ordered basis.
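
A minimal sketch of 2.2.5 and 2.2.8 (not part of the official outline), assuming Python with SymPy; the matrix, basis, and vector are arbitrary examples:

    import sympy as sp

    A = sp.Matrix([[1, 2, 3],
                   [2, 4, 6],
                   [1, 0, 1]])
    print(A.nullspace())         # basis for the nullspace of A (2.2.5)
    print(A.columnspace())       # basis for the columnspace of A (2.2.5)

    # Coordinates of v relative to an ordered basis B = {b1, b2} of a plane in R3 (2.2.8)
    b1 = sp.Matrix([1, 0, 1])
    b2 = sp.Matrix([0, 1, 1])
    v  = 2*b1 - 3*b2
    c1, c2 = sp.symbols('c1 c2')
    M = sp.Matrix.hstack(b1, b2)
    print(sp.linsolve((M, v), (c1, c2)))   # {(2, -3)}: the coordinate vector of v relative to B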

2.3 Orthogonality.

2.3.1      Define the dot product on Rn. Study algebraic properties of the dot product.

2.3.2       Define distance and magnitude in terms of the dot product. Study algebraic properties of distance and magnitude.

2.3.3      Define orthogonal vectors. Define orthogonal and orthonormal subsets of Rn.

2.3.4      Compute the coefficients of vectors in Rn with respect to orthogonal and orthonormal bases using the dot product.

2.3.5      Decompose vectors in Rn into components lying in, and orthogonal to a given subspace of Rn. Construct the projection matrix relative to a given subspace.

2.3.6      Produce an orthogonal basis for a given subspace of Rn using the Gram-Schmidt process.

2.3.7      Decompose an invertible square matrix A into a product QR, for Q an orthogonal matrix, and Q'R', for Q' an orthonormal matrix. In this case, R' = (Q')^T A (see the sketch at the end of this subsection).
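
A minimal sketch of 2.3.6 and 2.3.7 (not part of the official outline), assuming Python with NumPy; A is an arbitrary invertible example:

    import numpy as np

    A = np.array([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])           # invertible (det = -2)

    # Gram-Schmidt process on the columns of A (2.3.6)
    Q = np.zeros_like(A)
    for j in range(A.shape[1]):
        v = A[:, j].copy()
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # remove the projection onto earlier q's
        Q[:, j] = v / np.linalg.norm(v)

    # Built-in factorization A = Q'R' with orthonormal columns in Q' (2.3.7); R' = (Q')^T A
    Q2, R = np.linalg.qr(A)
    print(np.allclose(Q2 @ R, A), np.allclose(Q2.T @ A, R))   # True True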

3.0 LINEAR TRANSFORMATIONS

3.1 Properties of Linear Transformations.

3.1.1      Define a linear transformation from a vector space V to a vector space W. Distinguish between linear and non-linear transformations. Study algebraic properties of linear transformations.

3.1.2      Define the null space (kernel) and the range (image) of a linear transformation T: V -> W. If N(T) is the null space of T, then N(T) is a subspace of V. If R(T) is the range of T, then R(T) is a subspace of W.

3.1.3      Define the nullity and the rank of a linear transformation. Study the rank-nullity theorem; i.e., if T: V -> W is a linear transformation, then dim(V) = dim(R(T)) + dim(N(T)) (see the sketch at the end of this subsection).

3.1.4      Define a vector space isomorphism. Characterize vector space isomorphisms in terms of the rank-nullity theorem.

3.1.5      Define the inverse of a vector space isomorphism.
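
A minimal sketch of 3.1.2 and 3.1.3 for a matrix transformation T(x) = Ax (not part of the official outline), assuming Python with SymPy; A is an arbitrary example:

    import sympy as sp

    A = sp.Matrix([[1, 2, 0, 1],
                   [0, 1, 1, 1],
                   [1, 3, 1, 2]])          # T : R4 -> R3

    rank    = A.rank()                     # dimension of the range R(T)
    nullity = len(A.nullspace())           # dimension of the null space N(T)
    print(rank, nullity, rank + nullity)   # 2 + 2 = 4 = dim(R4), the rank-nullity theorem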

3.2 Matrix Representations of Linear Transformations.

3.2.1      Define the matrix representation of a linear transformation T: Rn -> Rm relative to the standard bases for Rn and Rm. Compute the matrix representation of such a linear transformation.

3.2.2      Define the matrix representation of a linear transformation T: Rn -> Rm relative to arbitrary ordered bases, B1 and B2, for Rn and Rm respectively. Compute the matrix representation of such a linear transformation relative to B1 and B2 (see the sketch at the end of this subsection).

3.2.3      Define the matrix representation of a linear transformation T: V -> W, where V and W are arbitrary vector spaces, relative to ordered bases, B1 and B2, for V and W respectively. Compute the matrix representation of such a linear transformation relative to B1 and B2.

3.2.4      Define similar matrices. Study similar matrices from the perspective of the matrix representation of a linear operator relative to an ordered basis, and a given change of basis.
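
A minimal sketch of 3.2.2 and 3.2.4 (not part of the official outline), assuming Python with NumPy; the operator and the ordered basis are arbitrary examples:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])        # matrix of T relative to the standard basis
    P = np.array([[1.0, 1.0],
                  [1.0, 2.0]])        # columns are the new basis vectors b1, b2

    B = np.linalg.inv(P) @ A @ P      # matrix of T relative to {b1, b2}; A and B are similar

    x = np.array([1.0, -1.0])         # check that coordinates transform consistently
    lhs = np.linalg.inv(P) @ (A @ x)  # coordinates of T(x) relative to {b1, b2}
    rhs = B @ (np.linalg.inv(P) @ x)  # B applied to the coordinates of x
    print(np.allclose(lhs, rhs))      # True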

3.3 Eigenvalues and Eigenvectors.

3.3.1      Define eigenvectors and eigenvalues of an nxn matrix, A. Calculate eigenvalues by solving the characteristic equation of the matrix, det(A- λI)=0. Calculate eigenvectors by solving the linear system (A- λI)x=0.

3.3.2      Define the eigenspace corresponding to an eigenvalue, λ. Prove that an eigenspace is a subspace of Rn. Define eigenvectors and eigenvalues of a linear operator, T on a vector space V.

3.3.3      Determine geometric and algebraic multiplicities for all eigenvalues of a given n x n matrix A. The geometric multiplicity of λ equals the nullity of (A - λI).

3.3.4      Emphasize the significance of the eigenvalue λ = 0, as it relates to the invertibility of A.

3.3.5      Define a diagonalizable nxn matrix, A. Define an eigenbasis for A. Prove an nxn matrix A is diagonalizable if and only if Rn has an eigenbasis for A. Study diagonalizable matrices in terms of the matrix representation of a linear transformation relative to an eigenbasis. Diagonalize a diagonalizable nxn matrix.

3.3.6      Compute powers of a diagonalizable matrix; i.e., if A = PDP^(-1), then A^n = PD^nP^(-1) (see the sketch at the end of this subsection).

3.3.7      Define a diagonalizable linear operator.
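
A minimal sketch of 3.3.1, 3.3.5, and 3.3.6 (not part of the official outline), assuming Python with NumPy; A is an arbitrary diagonalizable example:

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    evals, P = np.linalg.eig(A)       # eigenvalues 5 and 2; eigenvectors are the columns of P
    D = np.diag(evals)

    print(np.allclose(P @ D @ np.linalg.inv(P), A))       # A = PDP^(-1)
    A5 = P @ np.diag(evals**5) @ np.linalg.inv(P)          # A^5 = PD^5P^(-1)
    print(np.allclose(A5, np.linalg.matrix_power(A, 5)))   # True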

4.0 APPLICATIONS

4.1 Students will demonstrate mastery of three or more linear algebra applications similar in depth to those listed below.

4.1.1      Encryption and coding of messages and other data.

4.1.2      Linearization of non-linear systems of equations.

4.1.3      Markov chains (powers of transition matrices); see the sketch at the end of this list.

4.1.4      Path components of digraphs (via powers of the adjacency matrix).

4.1.5      Least squares approximation (more general context than statistics).

4.1.6      Fourier series approximations of periodic functions.

4.1.7      Transforms (z, Fast Fourier, Walsh) and filters, for digital signal processing.

4.1.8      Principal Axes for Quadric Surfaces.

4.1.9      Principal Axes of rotation (from moment of inertia matrix).

4.1.10   Principal component analysis (from the covariance matrix).

4.1.11   Second derivative test in 2 dimensions or higher (Hessian matrix).

4.1.12   Normal modes of vibration.

4.1.13   Computer graphics.

4.1.14   Applications of matrix powers in linear recursion relations.

4.1.15   Exponentiation of matrices (base e) in differential equations.
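
A minimal sketch of application 4.1.3 (not part of the official outline), assuming Python with NumPy; the two-state transition matrix below is a made-up example:

    import numpy as np

    T = np.array([[0.9, 0.2],         # column-stochastic: each column sums to 1
                  [0.1, 0.8]])
    x0 = np.array([0.5, 0.5])         # initial state distribution

    print(np.linalg.matrix_power(T, 50) @ x0)    # approaches the steady state [2/3, 1/3]

    # The steady state is the eigenvector for eigenvalue 1, rescaled to sum to 1
    evals, evecs = np.linalg.eig(T)
    v = evecs[:, np.argmax(np.isclose(evals, 1.0))]
    print(v / v.sum())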