
Important Definitions of Linear Algebra

Linear combination: Let V be a vector space and S a nonempty subset of V. A vector $v \in V$ is called a linear combination of vectors of S if there exist a finite number of vectors $u_1, u_2, \dots, u_n \in S$ and scalars $a_1, a_2, \dots, a_n \in F$ such that
$$v = a_1 u_1 + a_2 u_2 + \cdots + a_n u_n.$$

Span: Let S be a nonempty subset of a vector space V. The span of S, denoted span(S), is the set consisting of all linear combinations of the vectors in S. For convenience, we define span($\emptyset$) = $\{0\}$.

Linear Transformation: Let V and W be vector spaces over F. We call a function $T: V \to W$ a linear transformation from V to W if, for all $x, y \in V$ and $c \in F$, we have (closure under linear combination)

1. $T(x + y) = T(x) + T(y)$

2. $T(cx) = cT(x)$

We will often simply state that T is linear if the above conditions are satisfied.

Null space (kernel) and Nullity: Let V and W be vector spaces, and let $T: V \to W$ be linear. We define the null space (or kernel) N(T) to be the set of all vectors x in V s.t. T(x) = 0; that is,
$$N(T) = \{x \in V : T(x) = 0\}.$$
The nullity of T is $\dim(N(T))$.

Image (Range) and Rank: We define the image R(T) of T to be the subset of W consisting of all images (under the linear transformation T) of vectors in V; that is,
$$R(T) = \{T(x) : x \in V\}.$$
The rank of T is $\dim(R(T))$.

Onto: If $f: A \to B$ is a function whose range is B, that is, $f(A) = B$, then f is called onto. So f is onto if and only if the range of f equals the codomain of f.

One-to-One: $f: A \to B$ is one-to-one if $f(x) = f(y)$ implies $x = y$, or equivalently, if $x \neq y$ implies $f(x) \neq f(y)$.


Subspace: Let V be a vector space and W a subset of V. Then W is a subspace of V if and only if the following three conditions hold for the operations defined in V.

1. $0 \in W$

2. $x + y \in W$ whenever $x \in W$ and $y \in W$

3. $cx \in W$ whenever $c \in F$ and $x \in W$

Linearly Independent / Linearly Dependent: A subset S of a vector space V is called linearly dependent if there exist a finite number of distinct vectors $u_1, u_2, \dots, u_n \in S$ and scalars $a_1, a_2, \dots, a_n \in F$, not all zero, such that
$$a_1 u_1 + a_2 u_2 + \cdots + a_n u_n = 0.$$
In this case we also say that the vectors of S are linearly dependent; if no such vectors and scalars exist, the subset S is linearly independent.

Basis: A basis β for a vector space V is a linearly independent subset of V that generates (spans) V. If β is a basis for V, we also say that the vectors of β form a basis for V.

Dimension: A vector space is called finite-dimensional if it has a basis consisting of a finite

number of vectors. The unique number of vectors in each basis for V is called the dimension of

V and is denoted by dim(V). A vector space that is not finite-dimensional is called infinite-

dimensional.

Coordinates: If $\beta = \{u_1, u_2, \dots, u_n\}$ is an ordered basis for V and $x \in V$, then the coordinate vector of x with respect to β is the vector $[x]_\beta \in F^n$ given by
$$[x]_\beta = \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix}, \qquad \text{where } x = \sum_{i=1}^{n} a_i u_i.$$

Determinant: If
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$
is a 2x2 matrix with entries from a field F, then we define the determinant of A, denoted det(A) or |A|, to be the scalar $ad - bc$. Thus the determinant is a transformation $\det: M_{n \times n}(F) \to F$.

Useful Properties of Determinants:

- det(AB) = det(A)det(B)

- Adding a multiple of a row to another row of the matrix does not change the determinant.

- If A is an nxn matrix, it is invertible if and only if det(A) $\neq$ 0.

- The determinant of an upper triangular matrix is the product of its diagonal entries.


- If a row of the matrix is the sum of two row vectors, then the determinant is the sum of two determinants: the determinant of the matrix with that row replaced by one summand (the rest of the matrix untouched) plus the determinant of the matrix with that row replaced by the other summand (the rest of the matrix untouched). In other words, the determinant is linear in each row.
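A quick numerical sanity check of these properties (a sketch using NumPy; the matrices are arbitrary examples, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# Adding a multiple of one row to another leaves the determinant unchanged.
C = A.copy()
C[1] += 2.5 * C[0]
assert np.isclose(np.linalg.det(C), np.linalg.det(A))

# The determinant of an upper triangular matrix is the product of its diagonal.
U = np.triu(A)
assert np.isclose(np.linalg.det(U), np.prod(np.diag(U)))
```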

Eigenvector and Eigenvalue:

Let T be a linear operator on a vector space V. A nonzero vector $v \in V$ is called an eigenvector of T if there exists a scalar λ s.t. $T(v) = \lambda v$. The scalar λ is called the eigenvalue corresponding to the eigenvector v.

Let $A \in M_{n \times n}(F)$. A nonzero vector $v \in F^n$ is called an eigenvector of A if v is an eigenvector of $L_A$; that is, if $Av = \lambda v$ for some scalar λ. The scalar λ is then called the eigenvalue of A corresponding to the eigenvector v.
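For concreteness, a small sketch computing eigenpairs numerically (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns are the eigenvectors v

# Check A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)  # the eigenvalues 3 and 1
```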

Isomorphism: Let V and W be vector spaces. We say that V is isomorphic to W if there exists a linear transformation $T: V \to W$ that is invertible. Such a linear transformation is called an isomorphism from V onto W. Isomorphisms are one-to-one and onto (bijective).

Transpose: If $A \in M_{m \times n}(F)$, then the transpose of A is the matrix $A^t \in M_{n \times m}(F)$ defined by $(A^t)_{ij} = A_{ji}$. Thus the transpose takes each entry $A_{ij}$ and makes it the entry $A_{ji}$, for all entries.

Trace: If $A \in M_{n \times n}(F)$, then the trace is the linear transformation defined by
$$\operatorname{tr}(A) = \sum_{i=1}^{n} A_{ii}.$$
Thus $\operatorname{tr}: M_{n \times n}(F) \to F$.

Rank-Nullity Theorem: If $T: V \to W$ is linear and dim(V) < $\infty$, then the following equality holds:
$$\dim(V) = \operatorname{rank}(T) + \operatorname{nullity}(T) = \dim(R(T)) + \dim(N(T)).$$
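A numerical illustration of rank-nullity for the matrix map $L_A$ (a sketch; the matrix is an arbitrary example chosen to have a nontrivial null space):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1, mapping R^3 -> R^2

rank = np.linalg.matrix_rank(A)

# Nullity = number of (near-)zero singular values among dim(domain) directions.
singular_values = np.linalg.svd(A, compute_uv=False)
nullity = A.shape[1] - np.sum(singular_values > 1e-10)

assert rank + nullity == A.shape[1]  # rank + nullity = dim of the domain
```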

Inverse: Let $T: V \to W$ be linear. A function $U: W \to V$ is an inverse of T if $TU = I_W$ and $UT = I_V$. If these conditions are satisfied, then we say that T is invertible.

Algebraic Multiplicity: If λ is an eigenvalue of $T: V \to V$, the algebraic multiplicity of λ is the largest positive integer m such that $(t - \lambda)^m$ is a factor of the characteristic polynomial.


Geometric Multiplicity: The geometric multiplicity of an eigenvalue is the dimension of its eigenspace.

Invariant Subspace: If V is a vector space and $T: V \to V$ is a linear operator, then a subspace $W \subseteq V$ is a T-invariant subspace if
$$T(W) \subseteq W;$$
that is, for each $w \in W$, $T(w) \in W$.

T-Invariant Cyclic Subspace: Fix a vector $v \in V$ and let W be the T-cyclic subspace generated by this vector. The subspace is
$$W = \operatorname{span}\{v, T(v), T^2(v), \dots\}.$$
If W is finite-dimensional, say $\dim(W) = k$, then $\{v, T(v), T^2(v), \dots, T^{k-1}(v)\}$ is a basis for W: the vector v together with its first k - 1 transforms under T.

Invariant Subspace Characteristic Polynomial: If $W \subseteq V$ is a T-invariant subspace and W is finite-dimensional, then the characteristic polynomial of $T_W$ (the transformation restricted to the subspace) divides the characteristic polynomial of $T: V \to V$.

De Moivre's Formula:
$$(\cos\phi + i\sin\phi)^k = \cos k\phi + i\sin k\phi$$

Theorem 7.5.3: If $A \in M_{2 \times 2}(\mathbb{R})$ has complex eigenvalues $\lambda = a \pm bi$ (with $a, b \in \mathbb{R}$, $b \neq 0$) and corresponding eigenvectors $x \pm iy$, then, with $P = [\,x \;\; y\,]$,
$$A = P \begin{pmatrix} a & -b \\ b & a \end{pmatrix} P^{-1},$$
and also, writing $a + bi = r(\cos\phi + i\sin\phi)$,
$$\begin{pmatrix} a & -b \\ b & a \end{pmatrix} = r\begin{pmatrix} \cos\phi & -\sin\phi \\ \sin\phi & \cos\phi \end{pmatrix}.$$
And by De Moivre's formula
$$\begin{pmatrix} a & -b \\ b & a \end{pmatrix}^{\!k} = r^k\begin{pmatrix} \cos k\phi & -\sin k\phi \\ \sin k\phi & \cos k\phi \end{pmatrix}.$$
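A sketch verifying this rotation-scaling form numerically (assuming the reconstruction above; the matrix is an arbitrary example with complex eigenvalues):

```python
import numpy as np

A = np.array([[0.5, -0.6],
              [0.75, 1.1]])

lam = np.linalg.eigvals(A)[0]          # a + bi
a, b = lam.real, lam.imag
r, phi = abs(lam), np.angle(lam)

C = np.array([[a, -b], [b, a]])        # the rotation-scaling block
R = r * np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi),  np.cos(phi)]])
assert np.allclose(C, R)               # C = r * (rotation by phi)

# De Moivre: C^k is rotation by k*phi scaled by r^k.
k = 5
Rk = r**k * np.array([[np.cos(k*phi), -np.sin(k*phi)],
                      [np.sin(k*phi),  np.cos(k*phi)]])
assert np.allclose(np.linalg.matrix_power(C, k), Rk)
```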

Theorem 5.22: Let $T: V \to V$ be a linear operator on a finite-dimensional vector space V, let $v \in V$ be nonzero, let W be the T-cyclic subspace generated by v, and let $k = \dim(W)$. Then:


1. $\{v, T(v), T^2(v), \dots, T^{k-1}(v)\}$ is a basis for W.

2. If $a_0 v + a_1 T(v) + \cdots + a_{k-1}T^{k-1}(v) + T^k(v) = 0$, then the characteristic polynomial of $T_W$ is $f(t) = (-1)^k\,(a_0 + a_1 t + \cdots + a_{k-1}t^{k-1} + t^k)$.

Cayley-Hamilton Theorem: Let $T: V \to V$ be a linear transformation with V finite-dimensional, and let $f(t)$ be the characteristic polynomial of T. Then $f(T) = T_0$ (the zero transformation); that is, the transformation satisfies its own characteristic polynomial.
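A quick numerical check of Cayley-Hamilton (a sketch; `np.poly` returns the coefficients of the characteristic polynomial of a square matrix, highest degree first):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

coeffs = np.poly(A)  # characteristic polynomial coefficients of A

# Evaluate the polynomial at the matrix A itself (Horner's method).
f_of_A = np.zeros_like(A)
for c in coeffs:
    f_of_A = f_of_A @ A + c * np.eye(2)

assert np.allclose(f_of_A, 0)  # A satisfies its own characteristic polynomial
```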

Change of Coordinate Basis: Let V be a vector space generated by either of the ordered bases
$$\beta = \{u_1, u_2, \dots, u_n\}, \qquad \beta' = \{u'_1, u'_2, \dots, u'_n\}.$$
Then the change of coordinate matrix is the matrix that corresponds to the identity transformation relative to the bases above:
$$Q = [I_V]_{\beta'}^{\beta}.$$
The coordinate transform in this way changes $\beta'$-coordinates into $\beta$-coordinates:
$$[v]_\beta = Q\,[v]_{\beta'}.$$
Formulaically, the jth column of Q is $[u'_j]_\beta$; that is, for $Q_{ij}$ the entries of the change of coordinate matrix and $u_i$ the ith vector in β,
$$u'_j = \sum_{i=1}^{n} Q_{ij}\,u_i.$$
(The original notes also give a diagram method, not reproduced here.)
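A small sketch in $\mathbb{R}^2$ (the bases are arbitrary examples): the columns of Q are the β'-vectors expressed in β-coordinates, and Q converts coordinate vectors accordingly.

```python
import numpy as np

# Two bases for R^2, written as columns relative to the standard basis.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])       # beta  = {(1,0), (1,1)}
Bp = np.array([[1.0, 2.0],
               [1.0, 3.0]])      # beta' = {(1,1), (2,3)}

# Q's columns solve B @ Q[:, j] = Bp[:, j], i.e. Q = B^{-1} Bp.
Q = np.linalg.solve(B, Bp)

v_bp = np.array([2.0, -1.0])     # coordinates of some v relative to beta'
v_b = Q @ v_bp                   # the same v in beta-coordinates

# Both coordinate vectors reconstruct the same element of R^2.
assert np.allclose(B @ v_b, Bp @ v_bp)
```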


General Transforms: Let
$$\beta = \{u_1, u_2, \dots, u_n\}, \qquad \beta' = \{u'_1, u'_2, \dots, u'_n\}$$
be ordered bases for V. A transform $T: V \to V$ may be represented as two different matrices doing the same transformation relative to the different bases above. The two representations are related through the change of coordinate matrix Q:
$$[T]_{\beta'} = Q^{-1}\,[T]_{\beta}\,Q,$$
where, as before, the jth column of Q is $[u'_j]_\beta$.

(Diagram method: the original notes include a commutative diagram relating $[T]_\beta$, $[T]_{\beta'}$, and Q; it is not reproduced here.)


Direct Sum: If $W_1$ and $W_2$ are both subspaces of V, then the direct sum is the ordinary sum together with the condition that the intersection is trivial:
$$V = W_1 \oplus W_2$$
when $V = W_1 + W_2 = \{w_1 + w_2 : w_1 \in W_1,\ w_2 \in W_2\}$
and
$$W_1 \cap W_2 = \{0\}.$$

Direct Sum of Matrices: It is convenient to write the direct sum of matrices. THE MATRICES MUST BE SQUARE, and if the matrices in question are $B_1, B_2, \dots, B_k$, then
$$B_1 \oplus B_2 \oplus \cdots \oplus B_k = \begin{pmatrix} B_1 & 0 & \cdots & 0 \\ 0 & B_2 & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & B_k \end{pmatrix}$$
(a block diagonal matrix).

Inner Product: Let V be a vector space over F. An inner product on V is a function that assigns, to every ordered pair of vectors x and y in V, a scalar in F, denoted $\langle x, y \rangle$, such that for all x, y, and z in V and for all a in F the following hold.

1. $\langle x + z, y \rangle = \langle x, y \rangle + \langle z, y \rangle$

2. $\langle ax, y \rangle = a\langle x, y \rangle$

3. $\langle x, y \rangle = \overline{\langle y, x \rangle}$, where the bar denotes the complex conjugate

4. $\langle x, x \rangle > 0$ if $x \neq 0$

Conjugate Transpose: Let $A \in M_{m \times n}(F)$. We define the conjugate transpose or adjoint of A to be the nxm matrix $A^*$ such that $(A^*)_{ij} = \overline{A_{ji}}$ for all i, j.

Theorem 6.1: Let V be an inner product space (a space where an inner product is assigned). Then for $x, y, z \in V$ and $c \in F$, the following statements are true (they follow from the definition of the inner product).

1. $\langle x, y + z \rangle = \langle x, y \rangle + \langle x, z \rangle$

2. $\langle x, cy \rangle = \bar{c}\,\langle x, y \rangle$

3. $\langle x, 0 \rangle = \langle 0, x \rangle = 0$

4. $\langle x, x \rangle = 0$ if and only if $x = 0$

5. If $\langle x, y \rangle = \langle x, z \rangle$ for all $x \in V$, then $y = z$.


Norm or Length: Let V be an inner product space. For $x \in V$, we define the norm or length of x by
$$\|x\| = \sqrt{\langle x, x \rangle}.$$

Theorem 6.2: Let V be an inner product space over F. Then for all $x, y \in V$ and $c \in F$, the following statements are true.

1. $\|cx\| = |c|\,\|x\|$

2. $\|x\| = 0$ if and only if $x = 0$; in any case, $\|x\| \geq 0$

3. (Cauchy-Schwarz Inequality) $|\langle x, y \rangle| \leq \|x\| \cdot \|y\|$

4. (Triangle Inequality) $\|x + y\| \leq \|x\| + \|y\|$

Perpendicular or Orthogonal: If V is an inner product space with vectors $x, y \in V$, then the vectors x, y are orthogonal if $\langle x, y \rangle = 0$. A subset S of V is orthogonal if any two distinct vectors in S are orthogonal. A vector x in V is said to be a unit vector if $\|x\| = 1$. Finally, a subset S of V is orthonormal if S is orthogonal and consists entirely of unit vectors.

Orthonormal Basis: Let V be an inner product space. A subset of V is an orthonormal basis for

V if it is an ordered basis that is orthonormal.

Theorem 6.4 (The Gram-Schmidt Process): Let V be an inner product space and $S = \{w_1, w_2, \dots, w_n\}$ a linearly independent subset of V. Define $S' = \{v_1, v_2, \dots, v_n\}$, where $v_1 = w_1$ and
$$v_k = w_k - \sum_{j=1}^{k-1} \frac{\langle w_k, v_j \rangle}{\|v_j\|^2}\,v_j \qquad \text{for } 2 \leq k \leq n.$$
Then S' is an orthogonal set of nonzero vectors such that span(S') = span(S).
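A direct implementation of this recursion (a sketch for $\mathbb{R}^n$ with the standard dot product; the helper name `gram_schmidt` and the input vectors are my own):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a linearly independent list of vectors (Theorem 6.4)."""
    orthogonal = []
    for w in vectors:
        v = w.astype(float)
        for u in orthogonal:
            v -= (np.dot(w, u) / np.dot(u, u)) * u  # subtract projection onto u
        orthogonal.append(v)
    return orthogonal

S = [np.array([1.0, 1.0, 0.0]),
     np.array([1.0, 0.0, 1.0]),
     np.array([0.0, 1.0, 1.0])]
Sp = gram_schmidt(S)

# Any two distinct vectors in S' are orthogonal.
assert all(abs(np.dot(Sp[i], Sp[j])) < 1e-10
           for i in range(3) for j in range(i + 1, 3))
```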

Theorem 6.6: Let W be a finite-dimensional subspace of an inner product space V, and let $y \in V$. Then there exist unique vectors $u \in W$ and $z \in W^{\perp}$ such that $y = u + z$. Furthermore, if $\{v_1, v_2, \dots, v_k\}$ is an orthonormal basis for W, then
$$u = \sum_{i=1}^{k} \langle y, v_i \rangle\,v_i.$$


The vector u is the unique vector in W that is closest to y; we call u the orthogonal projection of y onto W.
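A sketch of this projection formula, using an orthonormal basis for a plane in $\mathbb{R}^3$ (obtained here via QR factorization; the subspace is an arbitrary example):

```python
import numpy as np

# W = column space of A, an arbitrary 2-dimensional subspace of R^3.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
Q, _ = np.linalg.qr(A)          # columns of Q: an orthonormal basis for W

y = np.array([3.0, -1.0, 2.0])
u = Q @ (Q.T @ y)               # u = sum_i <y, v_i> v_i, the projection onto W
z = y - u                       # the component in W-perp

assert np.allclose(Q.T @ z, 0)  # z is orthogonal to every basis vector of W
assert np.allclose(u + z, y)    # y = u + z
```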

Fourier Coefficients: Let β be an orthonormal subset (possibly infinite) of an inner product space V, and let $x \in V$. We define the Fourier coefficients of x relative to β to be the scalars $\langle x, y \rangle$, where $y \in \beta$.

Orthogonal Complement: Let S be a nonempty subset of an inner product space V. We define $S^{\perp}$ (read as "S perp") to be the set of all vectors in V that are orthogonal to every vector in S; that is,
$$S^{\perp} = \{x \in V : \langle x, y \rangle = 0 \ \forall\, y \in S\}.$$

Theorem 6.7: Suppose that $S = \{v_1, v_2, \dots, v_k\}$ is an orthonormal set in an n-dimensional inner product space V. Then the following are true:

1. S can be extended to an orthonormal basis $\{v_1, v_2, \dots, v_k, v_{k+1}, \dots, v_n\}$ for V.

2. If $W = \operatorname{span}(S)$, then $\{v_{k+1}, v_{k+2}, \dots, v_n\}$ is an orthonormal basis for $W^{\perp}$.

3. If W is any subspace of V, then $\dim(V) = \dim(W) + \dim(W^{\perp})$.

Theorem 6.8: Let V be a finite-dimensional inner product space over F, and let $g: V \to F$ be a linear transformation. Then there exists a unique vector $y \in V$ such that $g(x) = \langle x, y \rangle$ for all $x \in V$.

We may also calculate y by the following: if $\{v_1, v_2, \dots, v_n\}$ is an orthonormal basis for V, then
$$y = \sum_{i=1}^{n} \overline{g(v_i)}\,v_i.$$


Theorem 6.9: Let V be a finite-dimensional inner product space, and let T be a linear operator on V. Then there exists a unique function $T^*: V \to V$ such that $\langle T(x), y \rangle = \langle x, T^*(y) \rangle$ for all $x, y \in V$. Furthermore, $T^*$ is linear. (This $T^*$ is called the adjoint of T.)

Theorem 6.10: Let V be a finite-dimensional inner product space with an orthonormal basis β. If T is a linear operator on V, then $[T^*]_\beta = ([T]_\beta)^*$.

Theorem 6.11: Let V be an inner product space, and let T and U be linear operators on V. Then the following are true:

1. $(T + U)^* = T^* + U^*$

2. $(cT)^* = \bar{c}\,T^*$ for any c in the scalar field

3. $(TU)^* = U^*T^*$

4. $T^{**} = T$

5. $I^* = I$

The same properties hold for nxn matrices, as they represent linear transformations.

Theorem 6.12: Let $A \in M_{m \times n}(F)$ and $y \in F^m$. Then there exists $x_0 \in F^n$ such that $(A^*A)x_0 = A^*y$ and $\|Ax_0 - y\| \leq \|Ax - y\|$ for all $x \in F^n$. Furthermore, if $\operatorname{rank}(A) = n$, then $x_0 = (A^*A)^{-1}A^*y$.

The Process of "Least Squares":

For an abstract space: use Theorem 6.6 to find the closest vector (the orthogonal projection) onto the subspace of interest.

For Theorem 6.12: in the x-y plane with m observations $(t_1, y_1), \dots, (t_m, y_m)$, fit a line $y = ct + d$ by letting
$$A = \begin{pmatrix} t_1 & 1 \\ \vdots & \vdots \\ t_m & 1 \end{pmatrix}, \qquad x = \begin{pmatrix} c \\ d \end{pmatrix}, \qquad y = \begin{pmatrix} y_1 \\ \vdots \\ y_m \end{pmatrix},$$
and solving the normal equations $(A^*A)x_0 = A^*y$.
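A sketch of the line fit via the normal equations (the data points are an arbitrary example):

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])   # roughly y = 2t + 1

A = np.column_stack([t, np.ones_like(t)])

# Normal equations: (A* A) x0 = A* y.
x0 = np.linalg.solve(A.T @ A, A.T @ y)
c, d = x0
print(c, d)  # slope and intercept of the best-fit line

# Same answer as the library least-squares routine.
assert np.allclose(x0, np.linalg.lstsq(A, y, rcond=None)[0])
```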

Theorem 6.13: Let $A \in M_{m \times n}(F)$ and $b \in F^m$. Suppose that $Ax = b$ is consistent. Then the following statements are true.

a) There exists exactly one minimal solution, s, of $Ax = b$, and $s \in R(L_{A^*})$.

b) The vector s is the only solution to $Ax = b$ that lies in $R(L_{A^*})$; that is, if u satisfies $(AA^*)u = b$, then $s = A^*u$.


Lemma: Let T be a linear operator on a finite dimensional inner product space V. If T has an

eigenvector, then so does T*.

Theorem 6.14 (Schur): Let T be a linear operator on a finite-dimensional inner product space V. Suppose that the characteristic polynomial of T splits. Then there exists an orthonormal basis β for V such that the matrix $[T]_\beta$ is upper triangular.

Normal: Let V be an inner product space, and let T be a linear operator on V. We say that T is normal if $TT^* = T^*T$. An n x n real or complex matrix A is normal if $AA^* = A^*A$.

Theorem 6.15: Let V be an inner product space, and let T be a normal operator on V. Then the following statements are true.

a) $\|T(x)\| = \|T^*(x)\|$ for all $x \in V$.

b) $T - cI$ is normal for all c in the scalar field.

c) If x is an eigenvector of T, then x is also an eigenvector of T*. In fact, if $T(x) = \lambda x$, then $T^*(x) = \bar{\lambda}x$.

d) If $\lambda_1$ and $\lambda_2$ are distinct eigenvalues of T with corresponding eigenvectors $x_1$ and $x_2$, then the eigenvectors are orthogonal.

Theorem 6.16: Let T be a linear operator on a finite dimensional complex inner product space

V. Then T is normal if and only if there exists an orthonormal basis for V consisting of

eigenvectors of T.
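A numerical illustration with a real matrix that is normal but not symmetric (a rotation; a sketch, with the angle chosen arbitrarily):

```python
import numpy as np

theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])    # rotation: normal, not symmetric

assert np.allclose(A @ A.conj().T, A.conj().T @ A)  # A A* = A* A

# Over C, a normal matrix has an orthonormal basis of eigenvectors.
eigenvalues, V = np.linalg.eig(A)
assert np.allclose(V.conj().T @ V, np.eye(2))        # columns are orthonormal
assert np.allclose(A @ V, V @ np.diag(eigenvalues))  # and they are eigenvectors
```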

Self-Adjoint: Let T be a linear operator on an inner product space V. We say that T is self-adjoint if $T = T^*$. An n x n real or complex matrix A is self-adjoint if $A = A^*$.

Lemma: Let T be a self-adjoint operator on a finite-dimensional inner product space V. Then

a) Every eigenvalue of T is real.

b) Suppose that V is a REAL inner product space. Then the characteristic polynomial of T splits. (Over a complex inner product space the characteristic polynomial always splits, because of the fundamental theorem of algebra.)

Theorem 6.17: Let T be a linear operator on a finite-dimensional real inner product space V.

Then T is self-adjoint if and only if there exists an orthonormal basis β for V consisting of

eigenvectors of T.


Thus self-adjoint implies there exists an orthonormal basis of eigenvectors, and (on a real inner product space) an orthonormal basis of eigenvectors of T ensures both that T is self-adjoint and normal (as being self-adjoint is sufficient for being normal).

Positive Definite / Positive Semidefinite: A linear operator T on a finite-dimensional inner product space is called positive definite (resp. positive semidefinite) if T is self-adjoint and $\langle T(x), x \rangle > 0$ (resp. $\geq 0$) for all $x \neq 0$.

An nxn matrix A with entries from R or C (the real or complex field) is called positive definite (resp. positive semidefinite) if $L_A$ (the linear transformation represented by the matrix) is positive definite (resp. positive semidefinite).

Problem 17, Consequence of P.D.: T is positive definite if and only if T is self-adjoint and ALL the eigenvalues of T are positive. This also means that all the eigenvalues are positive real numbers.

Unitary/Orthogonal Operator: Let T be a linear operator on a finite-dimensional inner product space V (over F) such that $\|T(x)\| = \|x\|$ for all $x \in V$. We call T a unitary operator if F = C (the field of complex numbers) and an orthogonal operator if F = R.

Theorem 6.18: Let T be a linear operator on a finite-dimensional inner product space V. Then the following statements are equivalent.

a) $TT^* = T^*T = I$

b) $\langle T(x), T(y) \rangle = \langle x, y \rangle$ for all $x, y \in V$

c) If β is an orthonormal basis for V, then T(β) is an orthonormal basis for V.

d) There exists an orthonormal basis β for V such that T(β) is an orthonormal basis for V.

e) $\|T(x)\| = \|x\|$ for all $x \in V$

Since a)-e) all imply each other, it is sufficient to show only one property above to prove that T is in fact a unitary or orthogonal operator.

Lemma: Let U be a self-adjoint operator on a finite-dimensional inner product space V. If $\langle U(x), x \rangle = 0$ for all $x \in V$, then $U = T_0$ (the zero transformation).


Corollary 1: Let T be a linear operator on a finite-dimensional real inner product space V. Then V has an orthonormal basis of eigenvectors of T with corresponding eigenvalues of absolute value 1 if and only if T is both self-adjoint and orthogonal.

Corollary 2: Let T be a linear operator on a finite dimensional complex inner product space V.

Then V has an orthonormal basis of eigenvectors of T with corresponding eigenvalues of

absolute value 1 if and only if T is unitary.

Reflection: Let L be a one-dimensional subspace of $\mathbb{R}^2$. We may view L as a line in the plane through the origin. A linear operator T on $\mathbb{R}^2$ is called a reflection of $\mathbb{R}^2$ about L if T(x) = x for all $x \in L$ and T(x) = -x for all $x \in L^{\perp}$.

Unitary/Orthogonal Matrix: A square matrix A is called an orthogonal matrix if $A^tA = AA^t = I$ and unitary if $A^*A = AA^* = I$.

Theorem 6.19: Let A be a complex nxn matrix. Then A is normal if and only if A is unitarily equivalent to a diagonal matrix. By unitarily equivalent we mean
$$A = Q^*DQ$$
for some unitary matrix Q and diagonal matrix D. For example, if Q is a matrix whose columns are orthonormal eigenvectors of A (so that Q is unitary), then A is unitarily equivalent to the diagonal matrix D of eigenvalues, and so A is normal. (Think about the rotation matrix: it is not diagonalizable over R, but over C it works.)

Theorem 6.20: Let A be a real nxn matrix. Then A is symmetric if and only if A is orthogonally

equivalent to a real diagonal matrix.

Theorem 6.21 (Schur): Let $A \in M_{n \times n}(F)$ be a matrix whose characteristic polynomial splits over F.

a) If F = C, then A is unitarily equivalent to a complex upper triangular matrix.

b) If F = R, then A is orthogonally equivalent to a real upper triangular matrix.


Rigid Motion: Let V be a real inner product space. A function $f: V \to V$ is called a rigid motion if
$$\|f(x) - f(y)\| = \|x - y\| \quad \text{for all } x, y \in V.$$
Basically, distance is preserved between all pairs of vectors under the map. Rigid motions include translations, rotations, and all orthogonal operators.

Theorem 6.22: Let $f: V \to V$ be a rigid motion on a finite-dimensional real inner product space V. Then there exist a unique orthogonal operator T on V and a unique translation g on V such that
$$f = g \circ T.$$

Theorem 6.23: Let T be an orthogonal operator on $\mathbb{R}^2$, and let $A = [T]_\beta$, where β is the standard ordered basis for $\mathbb{R}^2$. Then exactly ONE of the following conditions is satisfied.

a) T is a rotation, and det(A) = 1.

b) T is a reflection about a line through the origin, and det(A) = -1.

Corollary: Any rigid motion in $\mathbb{R}^2$ is either a rotation followed by a translation or a reflection about a line through the origin followed by a translation.

Conic Sections: We are interested in quadratics of the form
$$ax^2 + 2bxy + cy^2.$$
We may rewrite this in the following matrix notation when we are interested in figuring out what the quadratic form represents in $\mathbb{R}^2$:
$$X^tAX, \qquad \text{where } A = \begin{pmatrix} a & b \\ b & c \end{pmatrix} \text{ and } X = \begin{pmatrix} x \\ y \end{pmatrix}.$$


We may rewrite the system as a change of coordinates with respect to the eigenvalues and eigenvectors of the matrix A. Because the matrix A is symmetric, we can diagonalize it with orthonormal eigenvectors. The diagonal matrix is
$$D = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix},$$
and in the rotated coordinates $X' = Q^tX$ (where Q has the orthonormal eigenvectors as columns) the form of the quadratic becomes
$$\lambda_1 (x')^2 + \lambda_2 (y')^2.$$
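A sketch of this diagonalization (`np.linalg.eigh` handles symmetric matrices and returns orthonormal eigenvectors; the coefficients are an arbitrary example):

```python
import numpy as np

# Quadratic form 3x^2 + 2xy + 3y^2, i.e. a = 3, b = 1, c = 3.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

eigenvalues, Q = np.linalg.eigh(A)   # orthonormal eigenvectors as columns
assert np.allclose(Q.T @ A @ Q, np.diag(eigenvalues))

# The form at any point equals lam1*x'^2 + lam2*y'^2 in rotated coordinates.
X = np.array([1.0, 2.0])
Xp = Q.T @ X
assert np.isclose(X @ A @ X, np.sum(eigenvalues * Xp**2))
```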

Orthogonal Projection: Let V be an inner product space and let $T: V \to V$ be a projection. We say that T is an orthogonal projection if the following two conditions hold:
$$R(T)^{\perp} = N(T)$$
$$N(T)^{\perp} = R(T)$$

Theorem 6.24: Let V be an inner product space, and let T be a linear operator on V. Then T is an orthogonal projection if and only if T has an adjoint T* and $T^2 = T = T^*$.

Theorem 6.25 (The Spectral Theorem): Suppose that T is a linear operator on a finite-dimensional inner product space V over F with the distinct eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_k$. Assume that T is normal if F = C and that T is self-adjoint if F = R. For each $1 \leq i \leq k$, let $W_i$ be the eigenspace of T corresponding to the eigenvalue $\lambda_i$, and let $T_i$ be the orthogonal projection of V onto $W_i$. Then the following statements are true:

a) $V = W_1 \oplus W_2 \oplus \cdots \oplus W_k$

b) If $W_i'$ denotes the direct sum of the subspaces $W_j$ for $j \neq i$, then $W_i^{\perp} = W_i'$.

c) $T_iT_j = \delta_{ij}T_i$ for $1 \leq i, j \leq k$

d) $I = T_1 + T_2 + \cdots + T_k$

e) $T = \lambda_1 T_1 + \lambda_2 T_2 + \cdots + \lambda_k T_k$
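A sketch of the spectral decomposition for a real symmetric matrix (an arbitrary example with distinct eigenvalues): the projections onto the eigenspaces sum to the identity, and weighted by the eigenvalues they reassemble A.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])    # symmetric; eigenvalues 1, 3, 5

eigenvalues, Q = np.linalg.eigh(A) # orthonormal eigenvector columns

# Orthogonal projection onto each (here one-dimensional) eigenspace.
projections = [np.outer(q, q) for q in Q.T]

# d) I = T_1 + ... + T_k
assert np.allclose(sum(projections), np.eye(3))

# e) A = lam_1 T_1 + ... + lam_k T_k
assert np.allclose(sum(lam * P for lam, P in zip(eigenvalues, projections)), A)
```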

Corollary 1: If F = C, then T is normal if and only if $T^* = g(T)$ for some polynomial g.

Corollary 2: If F = C, then T is unitary if and only if T is normal and $|\lambda| = 1$ for every eigenvalue λ of T.
