
Page 1:

Copyright © 2011 Pearson Education, Inc.

Solving Linear Systems Using Matrices

Section 6.1

Matrices and Determinants

Page 3: Matrices

A matrix is a rectangular array of real numbers. The rows of a matrix run horizontally, and the columns run vertically. A matrix with only one row is a row matrix or row vector. A matrix with only one column is a column matrix or column vector. A matrix with m rows and n columns has size m × n (read "m by n"). The number of rows is always given first.
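For instance (an illustrative matrix, not one from the text), the matrix
\[
\begin{bmatrix} 1 & 0 & -2 \\ 4 & 5 & 3 \end{bmatrix}
\]
has 2 rows and 3 columns, so its size is 2 × 3, not 3 × 2.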

A square matrix has an equal number of rows and columns. Each number in a matrix is called an entry or an element. The matrix of coefficients for a system of equations in standard form is the coefficient matrix for the system. When the constants from the right-hand side of the system are attached to the coefficient matrix, the result is the augmented matrix of the system.
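As a small illustration (the system below is made up for this example, not taken from the text), consider
\[
\begin{aligned}
x + 2y &= 5 \\
3x - y &= 1
\end{aligned}
\]
Its coefficient matrix and augmented matrix are
\[
\begin{bmatrix} 1 & 2 \\ 3 & -1 \end{bmatrix}
\qquad\text{and}\qquad
\left[\begin{array}{rr|r} 1 & 2 & 5 \\ 3 & -1 & 1 \end{array}\right].
\]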


Page 4: Summary: Row Operations

Any of the following row operations on an augmented matrix gives an equivalent augmented matrix (a worked illustration follows the list):

1. Interchanging two rows of the matrix. Abbreviated Ri ↔ Rj (interchange rows i and j).

2. Multiplying every entry in a row by the same nonzero real number. Abbreviated aRi → Ri (a times row i replaces row i).

3. Adding to a row a nonzero multiple of another row. Abbreviated aRi + Rj → Rj (aRi + Rj replaces Rj).
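Here is a quick illustration of all three operations on a made-up augmented matrix (not one from the text):
\[
\left[\begin{array}{rr|r} 3 & -1 & 1 \\ 1 & 2 & 5 \end{array}\right]
\xrightarrow{R_1 \leftrightarrow R_2}
\left[\begin{array}{rr|r} 1 & 2 & 5 \\ 3 & -1 & 1 \end{array}\right]
\xrightarrow{-3R_1 + R_2 \to R_2}
\left[\begin{array}{rr|r} 1 & 2 & 5 \\ 0 & -7 & -14 \end{array}\right]
\xrightarrow{-\frac{1}{7}R_2 \to R_2}
\left[\begin{array}{rr|r} 1 & 2 & 5 \\ 0 & 1 & 2 \end{array}\right]
\]
Each arrow is labeled with the row operation used, and each matrix is equivalent to the one before it.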


Page 5: Definitions

The goal of the Gaussian elimination method is to use row operations to convert the coefficient-matrix portion of the augmented matrix into an identity matrix.

The diagonal of a matrix consists of the entries in the first row and first column, the second row and second column, the third row and third column, and so on.

A square matrix with ones on the diagonal and zeros elsewhere is an identity matrix.

If the system has a unique solution, then it will appear in the rightmost column of the final augmented matrix.
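Continuing the illustrative system used earlier (x + 2y = 5 and 3x - y = 1), Gaussian elimination ends with
\[
\left[\begin{array}{rr|r} 1 & 0 & 1 \\ 0 & 1 & 2 \end{array}\right],
\]
where the coefficient matrix has become the 2 × 2 identity matrix and the rightmost column gives the unique solution x = 1, y = 2.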


Page 6: Procedure: The Gaussian Elimination Method for an Independent System of Two Equations

To solve a system of two linear equations in two variables using Gaussian elimination, perform the following row operations on the augmented matrix (a complete worked example follows the steps).

1. If necessary, interchange R1 and R2 so that R1 begins with a nonzero entry.

2. Get a 1 in the first position on the diagonal by multiplying R1 by the reciprocal of the first entry in R1.

3. Add an appropriate multiple of R1 to R2 to get 0 below the first 1.

4. Get a 1 in the second position on the diagonal by multiplying R2 by the reciprocal of the second entry in R2.

5. Add an appropriate multiple of R2 to R1 to get 0 above the second 1.

6. Read the unique solution from the last column of the final augmented matrix.
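Here is the full procedure on one illustrative system (again made up for this example): 2x + 6y = 8 and 3x + 5y = 4. No interchange is needed in step 1, so the work begins at step 2.
\[
\left[\begin{array}{rr|r} 2 & 6 & 8 \\ 3 & 5 & 4 \end{array}\right]
\xrightarrow{\frac{1}{2}R_1 \to R_1}
\left[\begin{array}{rr|r} 1 & 3 & 4 \\ 3 & 5 & 4 \end{array}\right]
\xrightarrow{-3R_1 + R_2 \to R_2}
\left[\begin{array}{rr|r} 1 & 3 & 4 \\ 0 & -4 & -8 \end{array}\right]
\]
\[
\xrightarrow{-\frac{1}{4}R_2 \to R_2}
\left[\begin{array}{rr|r} 1 & 3 & 4 \\ 0 & 1 & 2 \end{array}\right]
\xrightarrow{-3R_2 + R_1 \to R_1}
\left[\begin{array}{rr|r} 1 & 0 & -2 \\ 0 & 1 & 2 \end{array}\right]
\]
The last column gives the unique solution x = -2, y = 2, which checks in both original equations.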


Page 7: Inconsistent and Dependent Equations

A system is independent if it has a single solution. The coefficient matrix of an independent system is equivalent to an identity matrix. A system is inconsistent if it has no solution and is dependent if it has infinitely many solutions.

Applying Gaussian elimination to an inconsistent system causes a row to appear with 0 as the entry for each coefficient but a nonzero entry for the constant. For a dependent system of two equations in two variables, a 0 will appear in every entry of some row. Likewise, for a system of three equations in three variables, the same results hold for inconsistent and dependent systems.
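For instance (both systems are made up for this illustration), elimination on an inconsistent system and on a dependent system of two equations in two variables might end with
\[
\left[\begin{array}{rr|r} 1 & 2 & 5 \\ 0 & 0 & 3 \end{array}\right]
\qquad\text{and}\qquad
\left[\begin{array}{rr|r} 1 & 2 & 5 \\ 0 & 0 & 0 \end{array}\right],
\]
respectively. The first has a row stating 0x + 0y = 3, which is impossible, so there is no solution; the second has a row of all zeros, so every solution of x + 2y = 5 satisfies the system.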
