A First Course in Linear Algebra
Section IVLT: Invertible Linear Transformations
In this section we will conclude our introduction to linear transformations by bringing together the twin properties of injectivity and surjectivity and consider linear transformations with both of these properties.

Subsection IVLT: Invertible Linear Transformations

One preliminary definition, and then we will have our main definition for this section.

Definition IDLT: Identity Linear Transformation. The identity linear transformation on the vector space $W$ is defined as
$$I_W: W \to W, \quad I_W(w) = w$$

Informally, $I_W$ is the "do-nothing" function. You should check that $I_W$ is really a linear transformation, as claimed, and then compute its kernel and range to see that it is both injective and surjective. All of these facts should be straightforward to verify (Exercise IVLT.T05). With this in hand we can make our main definition.

Definition IVLT: Invertible Linear Transformations. Suppose that $T: U \to V$ is a linear transformation. If there is a function $S: V \to U$ such that
$$S \circ T = I_U \qquad\qquad T \circ S = I_V$$
then $T$ is invertible. In this case, we call $S$ the inverse of $T$ and write $S = T^{-1}$.

Informally, a linear transformation $T$ is invertible if there is a companion linear transformation, $S$, which undoes the action of $T$. When the two linear transformations are applied consecutively (composition), in either order, the result is to have no real effect. It is entirely analogous to squaring a positive number and then taking its (positive) square root.

Here is an example of a linear transformation that is invertible. As usual at the beginning of a section, do not be concerned with where $S$ came from, just understand how it illustrates Definition IVLT.

Example AIVLT: An invertible linear transformation. Archetype V is the linear transformation
http://linear.ups.edu/html/section-IVLT.html
1 de 11 05/05/15 01:55
$$T: P_3 \to M_{22}, \quad T\left(a + bx + cx^2 + dx^3\right) = \begin{bmatrix} a+b & a-2c \\ d & b-d \end{bmatrix}$$

Define the function $S: M_{22} \to P_3$ by
$$S\left(\begin{bmatrix} a & b \\ c & d \end{bmatrix}\right) = (a-c-d) + (c+d)x + \tfrac{1}{2}(a-b-c-d)x^2 + c\,x^3$$

Then
$$\begin{aligned}
(T \circ S)\left(\begin{bmatrix} a & b \\ c & d \end{bmatrix}\right)
&= T\left(S\left(\begin{bmatrix} a & b \\ c & d \end{bmatrix}\right)\right)\\
&= T\left((a-c-d) + (c+d)x + \tfrac{1}{2}(a-b-c-d)x^2 + c\,x^3\right)\\
&= \begin{bmatrix} (a-c-d)+(c+d) & (a-c-d) - 2\left(\tfrac{1}{2}(a-b-c-d)\right) \\ c & (c+d)-c \end{bmatrix}\\
&= \begin{bmatrix} a & b \\ c & d \end{bmatrix}
= I_{M_{22}}\left(\begin{bmatrix} a & b \\ c & d \end{bmatrix}\right)
\end{aligned}$$

and
$$\begin{aligned}
(S \circ T)\left(a + bx + cx^2 + dx^3\right)
&= S\left(T\left(a + bx + cx^2 + dx^3\right)\right)\\
&= S\left(\begin{bmatrix} a+b & a-2c \\ d & b-d \end{bmatrix}\right)\\
&= \left((a+b) - d - (b-d)\right) + \left(d + (b-d)\right)x + \tfrac{1}{2}\left((a+b) - (a-2c) - d - (b-d)\right)x^2 + d\,x^3\\
&= a + bx + cx^2 + dx^3
= I_{P_3}\left(a + bx + cx^2 + dx^3\right)
\end{aligned}$$

For now, understand why these computations show that $T$ is invertible, and that $S = T^{-1}$. Maybe even be amazed by how $S$ works so perfectly in concert with $T$! We will see later just how to arrive at the correct form of $S$ (when it is possible).

It can be as instructive to study a linear transformation that is not invertible.

Example ANILT: A non-invertible linear transformation.
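The two verifications above are routine but fiddly by hand, and they can also be checked numerically. Since $P_3$ and $M_{22}$ are both four-dimensional, $T$ and $S$ act on coordinate vectors as $4 \times 4$ matrices. This sketch is not from the text; it assumes NumPy and a row-major flattening of $M_{22}$, both my choices for illustration.

```python
import numpy as np

# Represent p = a + bx + cx^2 + dx^3 by (a, b, c, d), and a 2x2 matrix
# [[p, q], [r, s]] by (p, q, r, s).  Then T and S become 4x4 matrices.

# T(a + bx + cx^2 + dx^3) = [[a+b, a-2c], [d, b-d]]
T = np.array([[1, 1,  0,  0],   # p = a + b
              [1, 0, -2,  0],   # q = a - 2c
              [0, 0,  0,  1],   # r = d
              [0, 1,  0, -1]])  # s = b - d

# S([[p, q], [r, s]]) = (p-r-s) + (r+s)x + (1/2)(p-q-r-s)x^2 + r x^3
S = np.array([[1.0,  0.0, -1.0, -1.0],
              [0.0,  0.0,  1.0,  1.0],
              [0.5, -0.5, -0.5, -0.5],
              [0.0,  0.0,  1.0,  0.0]])

# Both compositions are the identity, so S = T^{-1}.
assert np.allclose(S @ T, np.eye(4))
assert np.allclose(T @ S, np.eye(4))
```

This is the same bookkeeping the example does symbolically: each matrix row records one coefficient of the output in terms of the input coordinates.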
Consider the linear transformation $T: \mathbb{C}^3 \to M_{22}$ defined by
$$T\left(\begin{bmatrix} a \\ b \\ c \end{bmatrix}\right) = \begin{bmatrix} a-b & 2a+2b+c \\ 3a+b+c & -2a-6b-2c \end{bmatrix}$$

Suppose we were to search for an inverse function $S: M_{22} \to \mathbb{C}^3$. First verify that the $2 \times 2$ matrix $A = \begin{bmatrix} 5 & 3 \\ 8 & 2 \end{bmatrix}$ is not in the range of $T$. This will amount to finding an input to $T$, $\begin{bmatrix} a \\ b \\ c \end{bmatrix}$, such that
$$\begin{aligned}
a - b &= 5\\
2a + 2b + c &= 3\\
3a + b + c &= 8\\
-2a - 6b - 2c &= 2
\end{aligned}$$

As this system of equations is inconsistent, there is no input column vector, and $A \notin R(T)$. How should we define $S(A)$? Note that
$$T(S(A)) = (T \circ S)(A) = I_{M_{22}}(A) = A$$
So any definition we would provide for $S(A)$ must then be a column vector that $T$ sends to $A$, and we would have $A \in R(T)$, contrary to the fact that $A \notin R(T)$. This is enough to see that there is no function $S$ that will allow us to conclude that $T$ is invertible, since we cannot provide a consistent definition for $S(A)$ if we assume $T$ is invertible.

Even though we now know that $T$ is not invertible, let us not leave this example just yet. Check that
$$T\left(\begin{bmatrix} 1 \\ -2 \\ 4 \end{bmatrix}\right) = \begin{bmatrix} 3 & 2 \\ 5 & 2 \end{bmatrix} = B
\qquad\qquad
T\left(\begin{bmatrix} 0 \\ -3 \\ 8 \end{bmatrix}\right) = \begin{bmatrix} 3 & 2 \\ 5 & 2 \end{bmatrix} = B$$

How would we define $S(B)$?
$$S(B) = S\left(T\left(\begin{bmatrix} 1 \\ -2 \\ 4 \end{bmatrix}\right)\right) = (S \circ T)\left(\begin{bmatrix} 1 \\ -2 \\ 4 \end{bmatrix}\right) = I_{\mathbb{C}^3}\left(\begin{bmatrix} 1 \\ -2 \\ 4 \end{bmatrix}\right) = \begin{bmatrix} 1 \\ -2 \\ 4 \end{bmatrix}$$
or
$$S(B) = S\left(T\left(\begin{bmatrix} 0 \\ -3 \\ 8 \end{bmatrix}\right)\right) = (S \circ T)\left(\begin{bmatrix} 0 \\ -3 \\ 8 \end{bmatrix}\right) = I_{\mathbb{C}^3}\left(\begin{bmatrix} 0 \\ -3 \\ 8 \end{bmatrix}\right) = \begin{bmatrix} 0 \\ -3 \\ 8 \end{bmatrix}$$

Which definition should we provide for $S(B)$? Both are necessary. But then $S$ is not a function. So we have a second reason to know that there is no function $S$ that will allow us to conclude that $T$ is invertible. It happens that there are infinitely many column vectors that $S$ would have to take $B$ to. Construct the kernel of $T$,
$$K(T) = \left\langle\left\{\begin{bmatrix} -1 \\ -1 \\ 4 \end{bmatrix}\right\}\right\rangle$$

Now choose either of the two inputs used above for $T$ and add to it a scalar multiple of the basis vector for the kernel of $T$. For example,
$$x = \begin{bmatrix} 1 \\ -2 \\ 4 \end{bmatrix} + (-2)\begin{bmatrix} -1 \\ -1 \\ 4 \end{bmatrix} = \begin{bmatrix} 3 \\ 0 \\ -4 \end{bmatrix}$$
then verify that $T(x) = B$. Practice creating a few more inputs for $T$ that would be sent to $B$, and see why it is hopeless to think that we could ever provide a reasonable definition for $S(B)$! There is a whole subspace's worth of values that $S(B)$ would have to take on.

In Example ANILT you may have noticed that $T$ is not surjective, since the matrix $A$ was not in the range of $T$. And $T$ is not injective since there are two different input column vectors that $T$ sends to the matrix $B$. Linear transformations that are not surjective lead to putative inverse functions $S$ that are undefined on inputs outside of the range of $T$. Linear transformations that are not injective lead to putative inverse functions $S$ that are multiply-defined on each of their inputs. We will formalize these ideas in Theorem ILTIS.

But first notice in Definition IVLT that we only require the inverse (when it exists) to be a function. When it does exist, it too is a linear transformation.

Theorem ILTLT: Inverse of a Linear Transformation is a Linear Transformation. Suppose that $T: U \to V$ is an invertible linear transformation. Then the function $T^{-1}: V \to U$ is a linear transformation.

So when $T$ has an inverse, $T^{-1}$ is also a linear transformation. Furthermore, $T^{-1}$ is an invertible linear transformation and its inverse is what you might expect.

Theorem IILT: Inverse of an Invertible Linear Transformation. Suppose that $T: U \to V$ is an invertible linear transformation. Then $T^{-1}$ is an invertible linear transformation and $\left(T^{-1}\right)^{-1} = T$.

Sage IVLT: Invertible Linear Transformations

Subsection IV: Invertibility

We now know what an inverse linear transformation is, but just which linear transformations have inverses? Here is a theorem we have been preparing for all chapter long.

Theorem ILTIS: Invertible Linear Transformations are Injective and Surjective. Suppose $T: U \to V$ is a linear transformation. Then $T$ is invertible if and only if $T$ is injective and surjective.
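Theorem ILTIS can be checked against Example ANILT numerically. Flattening $M_{22}$ to $\mathbb{C}^4$ turns $T$ into a $4 \times 3$ matrix, and rank computations expose the failure of both surjectivity and injectivity at once. A sketch assuming NumPy; the row-major flattening convention is my choice, not the text's.

```python
import numpy as np

# Flatten M22 to C^4 (row-major), so T from Example ANILT becomes a
# 4x3 matrix acting on the column vector (a, b, c).
M = np.array([[ 1, -1,  0],   # a - b
              [ 2,  2,  1],   # 2a + 2b + c
              [ 3,  1,  1],   # 3a + b + c
              [-2, -6, -2]])  # -2a - 6b - 2c

# rank 2 < 4 = dim of codomain: T is not surjective.
# nullity 3 - 2 = 1 > 0: T is not injective.
# By Theorem ILTIS, T cannot be invertible.
assert np.linalg.matrix_rank(M) == 2

# A = [[5, 3], [8, 2]] is not in the range: the best least-squares
# candidate still leaves a nonzero residual, so T(v) = A has no solution.
A = np.array([5, 3, 8, 2])
x = np.linalg.lstsq(M, A, rcond=None)[0]
assert np.linalg.norm(M @ x - A) > 1e-8

# Two different inputs reach B = [[3, 2], [5, 2]] ...
v1 = np.array([1, -2, 4])
v2 = np.array([0, -3, 8])
assert np.array_equal(M @ v1, M @ v2)

# ... because their difference lies in the kernel, spanned by (-1, -1, 4).
assert np.array_equal(M @ np.array([-1, -1, 4]), np.zeros(4))
```

The assertions mirror the example: an empty preimage for $A$ (not surjective) and a multi-element preimage for $B$ (not injective).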
When a linear transformation is both injective and surjective, the pre-image of any element of the codomain is a set of size one (a "singleton"). This fact allowed us to construct the inverse linear transformation in one half of the proof of Theorem ILTIS (see Proof Technique C) and is illustrated in the following cartoon. This should remind you of the very general Diagram KPI which was used to illustrate Theorem KPI about pre-images, only now we have an invertible linear transformation, which is therefore surjective and injective (Theorem ILTIS). As a surjective linear transformation, there are no vectors depicted in the codomain, $V$, that have empty pre-images. More importantly, as an injective linear transformation, the kernel is trivial (Theorem KILT), so each pre-image is a single vector. This makes it possible to "turn around" all the arrows to create the inverse linear transformation $T^{-1}$.

Diagram IVLT: Invertible Linear Transformation

Many will call an injective and surjective function a bijective function, or just a bijection. Theorem ILTIS tells us that this is just a synonym for the term invertible (which we will use exclusively).

We can follow the constructive approach of the proof of Theorem ILTIS to construct the inverse of a specific linear transformation, as the next example shows.

Example CIVLT: Computing the Inverse of a Linear Transformation

We will make frequent use of the characterization of invertible linear transformations provided by Theorem ILTIS. The next theorem is a good example of this, and we will use it often, too.

Theorem CIVLT: Composition of Invertible Linear Transformations. Suppose that $T: U \to V$ and $S: V \to W$ are invertible linear transformations. Then the composition, $(S \circ T): U \to W$, is an invertible linear transformation.

When a composition is invertible, the inverse is easy to construct.

Theorem ICLT: Inverse of a Composition of Linear Transformations. Suppose that $T: U \to V$ and $S: V \to W$ are invertible linear transformations. Then $S \circ T$ is invertible and $(S \circ T)^{-1} = T^{-1} \circ S^{-1}$.

Notice that this theorem not only establishes what the inverse of $S \circ T$ is, it
also duplicates the conclusion of Theorem CIVLT and also establishes the invertibility of $S \circ T$. But somehow, the proof of Theorem CIVLT is a nicer way to get this property.

Does Theorem ICLT remind you of the flavor of any theorem we have seen about matrices? (Hint: Think about getting dressed.) Hmmmm.

Sage CIVLT: Computing the Inverse of a Linear Transformation

Subsection SI: Structure and Isomorphism

A vector space is defined (Definition VS) as a set of objects ("vectors") endowed with a definition of vector addition ($+$) and a definition of scalar multiplication (written with juxtaposition). Many of our definitions about vector spaces involve linear combinations (Definition LC), such as the span of a set (Definition SS) and linear independence (Definition LI). Other definitions are built up from these ideas, such as bases (Definition B) and dimension (Definition D). The defining properties of a linear transformation require that a function respect the operations of the two vector spaces that are the domain and the codomain (Definition LT). Finally, an invertible linear transformation is one that can be undone; it has a companion that reverses its effect. In this subsection we are going to begin to roll all these ideas into one.

A vector space has structure derived from definitions of the two operations and the requirement that these operations interact in ways that satisfy the ten properties of Definition VS. When two different vector spaces have an invertible linear transformation defined between them, then we can translate questions about linear combinations (spans, linear independence, bases, dimension) from the first vector space to the second. The answers obtained in the second vector space can then be translated back, via the inverse linear transformation, and interpreted in the setting of the first vector space. We say that these invertible linear transformations "preserve structure." And we say that the two vector spaces are "structurally the same." The precise term is "isomorphic," from Greek meaning "of the same form." Let us begin to try to understand this important concept.

Definition IVS: Isomorphic Vector Spaces. Two vector spaces $U$ and $V$ are isomorphic if there exists an invertible linear transformation $T$ with domain $U$ and codomain $V$, $T: U \to V$. In this case, we write $U \cong V$, and the linear transformation $T$ is known as an isomorphism between $U$ and $V$.

A few comments on this definition. First, be careful with your language (Proof Technique L). Two vector spaces are isomorphic, or they are not. It is a yes/no situation and the term only applies to a pair of vector spaces. Any invertible linear transformation can be called an isomorphism; it is a term that applies to functions. Second, given a pair of vector spaces there might be several different isomorphisms between the two vector spaces. But it only takes the existence of one to call the pair isomorphic. Third, is $U$ isomorphic to $V$, or is $V$ isomorphic to $U$? It does not matter, since the inverse linear transformation will provide the needed isomorphism in the opposite direction. Being "isomorphic to" is an equivalence relation on the set of all vector spaces (see Theorem SER for a reminder about equivalence relations).

Example IVSAV: Isomorphic vector spaces, Archetype V
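Theorem CIVLT and Theorem ICLT are easy to illustrate numerically, using invertible matrices as stand-ins for the invertible linear transformations (isomorphisms) just discussed. A minimal NumPy sketch; the specific random matrices are hypothetical examples of my choosing, not from the text. Note the order reversal in the inverse of the composition.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Invertible matrices standing in for invertible linear transformations
# T: U -> V and S: V -> W (hypothetical 3x3 examples).  Shifting by 3*I
# keeps the eigenvalues away from zero, so each is invertible.
T = rng.normal(size=(3, 3)) + 3 * np.eye(3)
S = rng.normal(size=(3, 3)) + 3 * np.eye(3)

# Theorem CIVLT: the composition S∘T is again invertible.
ST = S @ T
assert abs(np.linalg.det(ST)) > 1e-8

# Theorem ICLT: (S∘T)^{-1} = T^{-1} ∘ S^{-1}, with the order reversed.
assert np.allclose(np.linalg.inv(ST), np.linalg.inv(T) @ np.linalg.inv(S))
```

The "getting dressed" hint applies here too: you put on socks then shoes, but undo the composition shoes-first.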
In Example IVSAV we avoided a computation in $P_3$ by a conversion of the computation to a new vector space, $M_{22}$, via an invertible linear transformation (also known as an isomorphism). Here is a diagram meant to illustrate the more general situation of two vector spaces, $U$ and $V$, and an invertible linear transformation, $T$. The diagram is simply about a sum of two vectors from $U$, rather than a more involved linear combination. It should remind you of Diagram DLTA.

Diagram AIVS: Addition in Isomorphic Vector Spaces

To understand this diagram, begin in the upper-left corner, and by going straight down we can compute the sum of the two vectors using the addition for the vector space $U$. The more circuitous alternative, in the spirit of Example IVSAV, is to begin in the upper-left corner and then proceed clockwise around the other three sides of the rectangle. Notice that the vector addition is accomplished using the addition in the vector space $V$. Then, because $T$ is a linear transformation, we can say that the result of $T(u_1) + T(u_2)$ is equal to $T(u_1 + u_2)$. Then the key feature is to recognize that applying $T^{-1}$ obviously converts the second version of this result into the sum $u_1 + u_2$ in the lower-left corner. So there are two routes to the sum $u_1 + u_2$, each employing an addition from a different vector space, but one is "direct" and the other is "roundabout." You might try designing a similar diagram for the case of scalar multiplication (see Diagram DLTM) or for a full linear combination.

Checking the dimensions of two vector spaces can be a quick way to establish that they are not isomorphic. Here is the theorem.

Theorem IVSED: Isomorphic Vector Spaces have Equal Dimension. Suppose $U$ and $V$ are isomorphic vector spaces. Then $\dim(U) = \dim(V)$.

The contrapositive of Theorem IVSED says that if $U$ and $V$ have different dimensions, then they are not isomorphic. Dimension is the simplest "structural" characteristic that will allow you to distinguish non-isomorphic vector spaces. For example, $P_6$ is not isomorphic to $M_{34}$ since their dimensions (7 and 12, respectively) are not equal. With tools developed in Section VR we will be able to establish that the converse of Theorem IVSED is true. Think about that one for a moment.

Subsection RNLT: Rank and Nullity of a Linear Transformation

Just as a matrix has a rank and a nullity, so too do linear transformations. And just like the rank and nullity of a matrix are related (they sum to the number of columns, Theorem RPNC), the rank and nullity of a linear transformation are
related. Here are the definitions and theorems; see the Archetypes for loads of examples.

Definition ROLT: Rank Of a Linear Transformation. Suppose that $T: U \to V$ is a linear transformation. Then the rank of $T$, $r(T)$, is the dimension of the range of $T$,
$$r(T) = \dim(R(T))$$

Definition NOLT: Nullity Of a Linear Transformation. Suppose that $T: U \to V$ is a linear transformation. Then the nullity of $T$, $n(T)$, is the dimension of the kernel of $T$,
$$n(T) = \dim(K(T))$$

Here are two quick theorems.

Theorem ROSLT: Rank Of a Surjective Linear Transformation. Suppose that $T: U \to V$ is a linear transformation. Then the rank of $T$ is the dimension of $V$, $r(T) = \dim(V)$, if and only if $T$ is surjective.

Theorem NOILT: Nullity Of an Injective Linear Transformation. Suppose that $T: U \to V$ is a linear transformation. Then the nullity of $T$ is zero, $n(T) = 0$, if and only if $T$ is injective.

Just as injectivity and surjectivity come together in invertible linear transformations, there is a clear relationship between rank and nullity of a linear transformation. If one is big, the other is small.

Theorem RPNDD: Rank Plus Nullity is Domain Dimension. Suppose that $T: U \to V$ is a linear transformation. Then
$$r(T) + n(T) = \dim(U)$$

Theorem RPNC said that the rank and nullity of a matrix sum to the number of columns of the matrix. This result is now an easy consequence of Theorem RPNDD when we consider the linear transformation $T: \mathbb{C}^n \to \mathbb{C}^m$ defined with the $m \times n$ matrix $A$ by $T(x) = Ax$. The range and kernel of $T$ are identical to the column space and null space of the matrix $A$ (Exercise ILT.T20, Exercise SLT.T20), so the rank and nullity of the matrix $A$ are identical to the rank and nullity of the linear transformation $T$. The dimension of the domain of $T$ is the dimension of $\mathbb{C}^n$, exactly the number of columns for the matrix $A$.

This theorem can be especially useful in determining basic properties of linear transformations. For example, suppose that $T: \mathbb{C}^6 \to \mathbb{C}^6$ is a linear transformation and you are able to quickly establish that the kernel is trivial. Then $n(T) = 0$. First this means that $T$ is injective by Theorem NOILT. Also, Theorem RPNDD becomes
$$6 = \dim\left(\mathbb{C}^6\right) = r(T) + n(T) = r(T) + 0 = r(T)$$
So the rank of $T$ is equal to the dimension of the codomain, and by Theorem ROSLT we know $T$ is surjective. Finally, we know $T$ is invertible by Theorem ILTIS. So from the determination that the kernel is trivial, and consideration of various
dimensions, the theorems of this section allow us to conclude the existence of an inverse linear transformation for $T$. Similarly, Theorem RPNDD can be used to provide alternative proofs for Theorem ILTD, Theorem SLTD and Theorem IVSED. It would be an interesting exercise to construct these proofs.

It would be instructive to study the archetypes that are linear transformations and see how many of their properties can be deduced just from considering only the dimensions of the domain and codomain. Then add in just knowledge of either the nullity or rank, and see how much more you can learn about the linear transformation. The table preceding all of the archetypes could be a good place to start this analysis.

Sage LTOE: Linear Transformation Odds and Ends

Subsection SLELT: Systems of Linear Equations and Linear Transformations

This subsection does not really belong in this section, or any other section, for that matter. It is just the right time to have a discussion about the connections between the central topic of linear algebra, linear transformations, and our motivating topic from Chapter SLE, systems of linear equations. We will discuss several theorems we have seen already, but we will also make some forward-looking statements that will be justified in Chapter R.

Archetype D and Archetype E are ideal examples to illustrate connections with linear transformations. Both have the same coefficient matrix,
$$D = \begin{bmatrix} 2 & 1 & 7 & -7 \\ -3 & 4 & -5 & -6 \\ 1 & 1 & 4 & -5 \end{bmatrix}$$

To apply the theory of linear transformations to these two archetypes, employ the matrix-vector product (Definition MVP) and define the linear transformation,
$$T: \mathbb{C}^4 \to \mathbb{C}^3, \quad T(x) = Dx = x_1\begin{bmatrix} 2 \\ -3 \\ 1 \end{bmatrix} + x_2\begin{bmatrix} 1 \\ 4 \\ 1 \end{bmatrix} + x_3\begin{bmatrix} 7 \\ -5 \\ 4 \end{bmatrix} + x_4\begin{bmatrix} -7 \\ -6 \\ -5 \end{bmatrix}$$

Theorem MBLT tells us that $T$ is indeed a linear transformation. Archetype D asks for solutions to $\mathcal{LS}(D, b)$, where $b = \begin{bmatrix} 8 \\ -12 \\ 4 \end{bmatrix}$. In the language of linear transformations this is equivalent to asking for $T^{-1}(b)$. In the language of vectors and matrices it asks for a linear combination of the four columns of $D$ that will equal $b$. One solution listed is $w = \begin{bmatrix} 7 \\ 8 \\ 1 \\ 3 \end{bmatrix}$. With a nonempty preimage, Theorem KPI tells us that the complete solution set of the linear system is the preimage of $b$,
$$w + K(T) = \{w + z \mid z \in K(T)\}$$

The kernel of the linear transformation $T$ is exactly the null space of the
matrix $D$ (see Exercise ILT.T20), so this approach to the solution set should be reminiscent of Theorem PSPHS. The kernel of the linear transformation is the preimage of the zero vector, exactly equal to the solution set of the homogeneous system $\mathcal{LS}(D, 0)$. Since $D$ has a null space of dimension two, every preimage (and in particular the preimage of $b$) is "as big as" a subspace of dimension two (but is not a subspace).

Archetype E is identical to Archetype D but with a different vector of constants, $d = \begin{bmatrix} 2 \\ 3 \\ 2 \end{bmatrix}$. We can use the same linear transformation $T$ to discuss this system of equations since the coefficient matrix is identical. Now the set of solutions to $\mathcal{LS}(D, d)$ is the pre-image of $d$, $T^{-1}(d)$. However, the vector $d$ is not in the range of the linear transformation (nor is it in the column space of the matrix, since these two sets are equal by Exercise SLT.T20). So the empty pre-image is equivalent to the inconsistency of the linear system.

These two archetypes each have three equations in four variables, so either the resulting linear systems are inconsistent, or they are consistent and application of Theorem CMVEI tells us that the system has infinitely many solutions. Considering these same parameters for the linear transformation, the dimension of the domain, $\mathbb{C}^4$, is four, while the codomain, $\mathbb{C}^3$, has dimension three. Then
$$\begin{aligned}
n(T) &= \dim\left(\mathbb{C}^4\right) - r(T) && \text{Theorem RPNDD}\\
&= 4 - \dim(R(T)) && \text{Definition ROLT}\\
&\geq 4 - 3 && \text{$R(T)$ subspace of $\mathbb{C}^3$}\\
&= 1
\end{aligned}$$

So the kernel of $T$ is nontrivial simply by considering the dimensions of the domain (number of variables) and the codomain (number of equations). Pre-images of elements of the codomain that are not in the range of $T$ are empty (inconsistent systems). For elements of the codomain that are in the range of $T$ (consistent systems), Theorem KPI tells us that the pre-images are built from the kernel, and with a nontrivial kernel, these pre-images are infinite (infinitely many solutions).

When do systems of equations have unique solutions? Consider the system of linear equations $\mathcal{LS}(C, f)$ and the linear transformation $S(x) = Cx$. If $S$ has a trivial kernel, then pre-images will either be empty or be finite sets with single elements. Correspondingly, the coefficient matrix $C$ will have a trivial null space and solution sets will either be empty (inconsistent) or contain a single solution (unique solution). Should the matrix be square and have a trivial null space, then we recognize the matrix as being nonsingular. A square matrix means that the corresponding linear transformation, $S$, has equal-sized domain and codomain. With a nullity of zero, $S$ is injective, and Theorem RPNDD tells us that the rank of $S$ is equal to the dimension of the domain, which in turn is equal to the dimension of the codomain. In other words, $S$ is surjective. Injective and surjective, and Theorem ILTIS tells us that $S$ is invertible. Just as we can use the inverse of the coefficient matrix to find the unique solution of any linear system with a nonsingular coefficient matrix (Theorem SNCM), we can use the inverse of the linear transformation to construct the unique element of any pre-image (proof of Theorem ILTIS).

The executive summary of this discussion is that to every coefficient matrix of a system of linear equations we can associate a natural linear transformation. Solution sets for systems with this coefficient matrix are preimages of
elements of the codomain of the linear transformation. For every theorem about systems of linear equations there is an analogue about linear transformations. The theory of linear transformations provides all the tools to recreate the theory of solutions to linear systems of equations.

We will continue this adventure in Chapter R.
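The discussion of Archetype D and Archetype E above can be replayed numerically. A sketch assuming NumPy; the kernel vector $z$ below is a choice of mine for illustration, not a value quoted from the archetypes.

```python
import numpy as np

# Coefficient matrix shared by Archetype D and Archetype E.
D = np.array([[ 2, 1,  7, -7],
              [-3, 4, -5, -6],
              [ 1, 1,  4, -5]])

# r(T) = 2, so n(T) = dim(C^4) - r(T) = 2: the kernel is nontrivial,
# matching the dimension count n(T) >= 4 - 3 = 1 in the text.
rank = np.linalg.matrix_rank(D)
assert rank == 2 and 4 - rank == 2

# Archetype D: b is in the range, and w is one element of its preimage ...
b = np.array([8, -12, 4])
w = np.array([7, 8, 1, 3])
assert np.array_equal(D @ w, b)

# ... so by Theorem KPI the whole solution set is w + K(T).  Any kernel
# vector z (this one found by hand from the null space) shifts w to
# another solution.
z = np.array([-3, -1, 1, 0])
assert np.array_equal(D @ z, np.zeros(3, dtype=int))
assert np.array_equal(D @ (w + 5 * z), b)

# Archetype E: d is not in the range, so its preimage is empty and the
# system is inconsistent; even least squares leaves a nonzero residual.
d = np.array([2, 3, 2])
x = np.linalg.lstsq(D, d, rcond=None)[0]
assert np.linalg.norm(D @ x - d) > 1e-8
```

The nonempty preimage of $b$ and the empty preimage of $d$ are exactly the consistent and inconsistent systems of the two archetypes.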
Sage SUTH1: Sage Under The Hood, Round 1

Reading Questions

Exercises