Lecture note 3: Linear algebra
Outline
1. Systems of linear equations
2. Vectors and Euclidean spaces
3. Linear independence
4. Matrix algebra
Systems of linear equations
General system:
a_11 x_1 + … + a_1n x_n = b_1
⋮
a_m1 x_1 + … + a_mn x_n = b_m
Key features: m equations, n unknowns
Example: IS curve from ISLM model
Y = C + I + G
C = bY
I = Ī
Questions:
1. Does a solution exist?
2. How many solutions are there?
3. Is there an efficient algorithm that computes actual solutions?
Solution method 1: substitution
1. Use first equation to solve for one variable in terms of the others
2. Substitute into the next equation
3. Repeat until the last variable is solved for exactly
4. Back-substitute to get the other variables
Example:
3x + 2y = 5
x + y = 6
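The substitution steps for this example can be mirrored in a few lines of Python (a sketch, not part of the original note):

```python
# System: 3x + 2y = 5 and x + y = 6.
# Step 1: solve the second equation for y:     y = 6 - x
# Step 2: substitute into the first equation:  3x + 2*(6 - x) = 5  =>  x + 12 = 5
x = 5 - 12        # so x = -7
y = 6 - x         # back-substitute: y = 13
assert 3 * x + 2 * y == 5 and x + y == 6
```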
Solution method 2: elimination of variables
1. Add linear combinations of pairs of equations to eliminate one of the variables
2. Reduces you to m-1 equations with n-1 unknowns
3. Repeat until you get down to one equation with one unknown
Example:
x + 3y = 5
2x + 9y = 10
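The same example worked by elimination, as a short Python sketch (not part of the original note):

```python
# System: x + 3y = 5 and 2x + 9y = 10.
# Eliminate x: (eq2) - 2*(eq1) gives (9 - 6)y = 10 - 10, i.e. 3y = 0.
y = (10 - 2 * 5) / 3   # y = 0
x = 5 - 3 * y          # back-substitute into eq1: x = 5
assert x + 3 * y == 5 and 2 * x + 9 * y == 10
```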
Solution method 3: matrix methods
The above only really works for small systems; otherwise it's pretty crap, and matrix methods offer a much quicker approach.
Moreover, substitution and elimination don't give us a systematic way of analysing the existence/number of solutions
Reminder of how to write up a system in matrix form
Example 1:
x + 2y = 7
3x + 6y = 9
Example 2: IS curve
Y = C + I + G
C = bY
I = Ī
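As a quick sketch of the matrix write-up (using NumPy, which the note itself doesn't use), Example 1 becomes Ax = b with:

```python
import numpy as np

# Example 1 in matrix form:
#   [1 2] [x]   [7]
#   [3 6] [y] = [9]
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])
b = np.array([7.0, 9.0])

# Note: the second row of A is 3 times the first, while 9 != 3*7,
# so this particular system has no solution (a preview of the rank
# analysis later in the note).
print(np.linalg.matrix_rank(A))   # 1
```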
Solution method 3 continued
There are lots of ways of trying to solve a system using matrix methods, with plenty of far-reaching
mathematical machinery behind them.
As economists, we don't need to go into all of them. We will focus on issues of linear
independence. First we need to understand Euclidean spaces and vector algebra.
Euclidean spaces
The real line ℝ is defined (for our purposes) as the set of all the numbers from −∞ to ∞ (not
inclusive).
ℝ^n is defined as the set of all the ordered n-tuples of real numbers.
• n-tuple means literally n numbers
• We say ordered because the order matters; i.e., (1, 0) is not the same as (0, 1)
Examples: ℝ^2, ℝ^3
Vectors
For our purposes, an n-dimensional vector is an element of ℝ^n (also an n × 1 matrix)
Elements of ℝ are called scalars
Notation: vectors should be bold or underlined, but nobody can be arsed, so get used to differentiating them from scalars by context
They are usually displayed in column form, but sometimes in row form with commas too.
Vectors as directions
You can think of a vector as information on how its dimensions must be related to each other, up
to a scalar.
For example:
• the vector (1, 1) tells us that the values of x and y have to be the same (without specifying that
value)
• the vector (1, 2) tells us that y has to be twice as big as x (without specifying that value)
• the vector (3, 0) tells us that x and y are unrelated
Vector algebra
Vector summation
• Requires same dimension
• Example: (1, 2, 3) + (4, 5, 6)
Vector multiplication
• Requires same dimension
• Example: (1, 2, 3) · (4, 5, 6)
• Multiplication by a scalar (along with special notation when multiplying by a scalar)
Properties of vector multiplication
• Commutativity: u · v = v · u
• Distributivity: u · (v + w) = u · v + u · w
• Effect of a scalar α: u · (αv) = α(u · v) = (αu) · v
• u · u ≥ 0
• u · u = 0 ⟺ u = 0
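These operations and properties can be checked numerically; a quick Python/NumPy sketch (not part of the original note):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.array([7.0, 8.0, 9.0])
a = 2.5  # an arbitrary scalar

print(u + v)   # vector summation: [5. 7. 9.]
print(u @ v)   # dot product: 1*4 + 2*5 + 3*6 = 32.0

assert np.isclose(u @ v, v @ u)                  # commutativity
assert np.isclose(u @ (v + w), u @ v + u @ w)    # distributivity
assert np.isclose(u @ (a * v), a * (u @ v))      # scalar can move around
assert u @ u >= 0                                # non-negativity
```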
Question: what is (u + v) · (u + v) equal to?
Linear independence
Let {c_1, c_2, …, c_k} be elements of ℝ and let {v_1, v_2, …, v_k} be elements of ℝ^n. Then {v_1, v_2, …, v_k}
are said to be linearly independent if and only if (c_1, c_2, …, c_k) = (0, 0, …, 0) is the only solution to:
c_1 v_1 + c_2 v_2 + … + c_k v_k = 0
NB: the 0 at the end of the equation is a vector
The expression c_1 v_1 + c_2 v_2 + … + c_k v_k is called a linear combination of the vectors {v_1, v_2, …, v_k}.
Before we look at some examples, what is linear independence trying to capture?
As described above, each vector tells us something about how its dimensions have to be related to
each other.
When a vector v_1 ≠ 0 is a linear combination of two vectors (v_2, v_3), i.e., v_1 = αv_2 + βv_3 for some (α, β), then the information it carries on how its dimensions are related is redundant given (v_2, v_3).
In other words, the restrictions on the dimensions implied by (v_2, v_3) include the restrictions
implied by v_1.
When a collection of vectors is linearly independent, that means that none of the restrictions
implied by any one of the vectors is redundant. In other words, each vector imposes a different set
of restrictions from the other vectors (or any combination of the other vectors).
This is best seen using examples.
Example 1: are (1, 2) and (2, 4) linearly independent?
Example 2: are (1, 0) and (0, 1) linearly independent?
Example 3: are (1, 1, 0) and (3, 2, 0) linearly independent?
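A quick way to check such examples (a NumPy sketch using the rank function; the note itself works these by hand):

```python
import numpy as np

def linearly_independent(vectors):
    # Stack the vectors as columns; they are independent iff the
    # rank of the resulting matrix equals the number of vectors.
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

print(linearly_independent([(1, 2), (2, 4)]))        # Example 1: False
print(linearly_independent([(1, 0), (0, 1)]))        # Example 2: True
print(linearly_independent([(1, 1, 0), (3, 2, 0)]))  # Example 3: True
```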
Back to solving linear systems
We have reached the stage where we want to solve a system of the form Ax = b, where A has m rows
and n columns.
Question: what are the implied dimensions of x and b?
Rank
Definition of row vectors of a matrix
Definition of column vectors of a matrix
Definition of row rank of a matrix
Definition of column rank of a matrix
Theorem: row rank (A) = column rank (A) = rank (A)
Corollary: rank(A) ≤ min(m, n)
Definition of full rank
Corollary: a matrix is full rank only if (but not if) it is square
Example 1: what is the rank of A = [1 0 2; 0 1 2]?
Example 2: what is the rank of A = [2 5; 2 3]?
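With NumPy (a sketch; the note leaves these as exercises), the ranks can be computed directly:

```python
import numpy as np

A1 = np.array([[1, 0, 2],
               [0, 1, 2]])
A2 = np.array([[2, 5],
               [2, 3]])

print(np.linalg.matrix_rank(A1))   # 2: equals min(m, n), so A1 is full rank
print(np.linalg.matrix_rank(A2))   # 2: full rank and square (det = -4 != 0)
```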
Rank and solving linear systems
Consider the linear system Ax = b where x ∈ ℝ^n, b ∈ ℝ^m and b ≠ 0 (otherwise x = 0 is a trivial solution)
We refer to m as the number of equations and n as the number of unknowns
Note that as part of the problem, we specify 6 and �. We are trying to infer �.
Theorem (a rather long one):
• If m < n (potential underidentification)
o Ax = b has 0 or infinitely many solutions
o If rank(A) = m then Ax = b has infinitely many solutions
• If m > n (potential overidentification)
o Ax = b has 0, 1 or infinitely many solutions
o If rank(A) = n then Ax = b has 0 or 1 solutions
• If m = n (potential exact identification)
o Ax = b has 0, 1 or infinitely many solutions
o If rank(A) = m = n then Ax = b has exactly 1 solution
Examples of number of solutions to linear systems
1: Potential underidentification
Infinite solutions: [1 1 0; 1 0 0][x1; x2; x3] = [3; 4]
Intuition: not placing enough restrictions
Zero solutions: [1 0 0; 1 0 0][x1; x2; x3] = [3; 4]
Intuition: placing inconsistent restrictions
2: Potential overidentification
Zero solutions: [1 0; 0 1; 1 1][x1; x2] = [2; 4; 5]
Intuition: too many restrictions that become inconsistent.
3: Potential exact identification
One solution: [1 1; 1 2][x1; x2] = [2; 3]
Intuition: the restrictions exactly pin down the variables.
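This classification can be automated by comparing rank(A) with the rank of the augmented matrix [A | b] (the standard Rouché–Capelli criterion, which agrees with the theorem above but isn't named in the note). A sketch:

```python
import numpy as np

def count_solutions(A, b):
    # Ax = b is consistent iff rank(A) = rank([A | b]);
    # when consistent, the solution is unique iff rank(A) = n (the
    # number of unknowns), otherwise there are infinitely many.
    rA = np.linalg.matrix_rank(A)
    rAb = np.linalg.matrix_rank(np.column_stack([A, b]))
    if rA < rAb:
        return 0                                  # inconsistent
    return 1 if rA == A.shape[1] else np.inf      # unique vs infinite

assert count_solutions(np.array([[1, 1, 0], [1, 0, 0]]), np.array([3, 4])) == np.inf
assert count_solutions(np.array([[1, 0, 0], [1, 0, 0]]), np.array([3, 4])) == 0
assert count_solutions(np.array([[1, 0], [0, 1], [1, 1]]), np.array([2, 4, 5])) == 0
assert count_solutions(np.array([[1, 1], [1, 2]]), np.array([2, 3])) == 1
```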
Solving exactly identified systems: matrix inversion
We start with a quick reminder on basic matrix algebra.
How to add
How to multiply
Laws of algebra:
• Associativity:
o (A + B) + C = A + (B + C)
o (AB)C = A(BC)
• Commutativity (addition only)
o A + B = B + A
• Distributivity
o A(B + C) = AB + AC
o (A + B)C = AC + BC
How to transpose (and definition of symmetric matrix)
Theorem: (AB)′ = B′A′
Definition of the inverse of a matrix for square matrices (and the identity matrix)
Equivalence of left and right inversion
Lemma: an inverse exists if and only if the matrix is full rank (non-singular)
Calculating the inverse matrix: the 2x2 case
Example: A = [1 3; 4 2]
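The 2x2 formula applied to this example, as a Python sketch (not part of the original note):

```python
import numpy as np

# 2x2 inverse formula: for A = [a b; c d] with det = ad - bc != 0,
# A^-1 = (1/det) * [d -b; -c a].
A = np.array([[1.0, 3.0],
              [4.0, 2.0]])
det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]   # 1*2 - 3*4 = -10
A_inv = (1.0 / det) * np.array([[ A[1, 1], -A[0, 1]],
                                [-A[1, 0],  A[0, 0]]])

assert np.allclose(A @ A_inv, np.eye(2))   # right inverse
assert np.allclose(A_inv @ A, np.eye(2))   # left inverse (they coincide)
```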
An aside on the determinant of a matrix and its relationship to non-singularity
Inversion and determinant calculation in higher dimensions are left to computers
Useful rules on inverses (come in handy for econometrics; proofs later):
• (A^-1)^-1 = A
• (A^-1)′ = (A′)^-1
• (AB)^-1 = B^-1 A^-1
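These rules can be sanity-checked numerically before seeing the proofs; a sketch (random Gaussian matrices are non-singular with probability one, so a fixed seed is safe here):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3))
inv = np.linalg.inv

assert np.allclose(inv(inv(A)), A)               # (A^-1)^-1 = A
assert np.allclose(inv(A).T, inv(A.T))           # (A^-1)' = (A')^-1
assert np.allclose(inv(A @ B), inv(B) @ inv(A))  # (AB)^-1 = B^-1 A^-1
```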
Application: demand and supply system
There are three goods in the economy: xylophones (x), yaks (y) and zebras (z)
The price vector is (p_x, p_y, p_z)
Demand and supply for each good are a function of the price vector:
• x_d = 5 − p_x,  x_s = 3 + p_x
• y_d = 10 − 2p_y + p_z,  y_s = 4 + p_y (yaks and zebras are obviously substitutes as pets)
• z_d = 8 − p_z + 3p_y,  z_s = 5 + p_z
We want to find the equilibrium price vector. The three unknowns are (p_x, p_y, p_z). The three
equations are derived by equalising demand and supply in each market:
• 2p_x = 2
• 3p_y − p_z = 6
• 2p_z − 3p_y = 3
Solution (using matrix algebra):
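The worked solution isn't reproduced in the transcript; a sketch of how it could be computed with NumPy, assuming the equilibrium conditions stated above:

```python
import numpy as np

# Equilibrium (demand = supply) in each market reduces to:
#   2*px             = 2
#        3*py -  pz  = 6
#       -3*py + 2*pz = 3
A = np.array([[2.0,  0.0,  0.0],
              [0.0,  3.0, -1.0],
              [0.0, -3.0,  2.0]])
b = np.array([2.0, 6.0, 3.0])
p = np.linalg.solve(A, b)   # requires A to be full rank (invertible)
print(p)                    # [1. 5. 9.]  i.e. px = 1, py = 5, pz = 9
```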
Summary
• An economic model usually has the following ingredients:
o Variables that we want to understand (e.g., consumption, savings, investment)
� Known as endogenous variables
o Actors who choose the values of these variables
o Parameters that define the choice problems of the actors
� Known as exogenous variables
• We then analyse the behaviour of the agents, typically under the assumption of utility (profit)
maximization
o The result is the endogenous variables as a function of the exogenous variables
o Usually, they will be linked together in a system
o In the simplest models, the system is a linear system
• A linear system can be expressed in matrix form which can then be solved (usually using a
computer)
• Key things to remember about linear systems
o The relationship between rank and the existence of a solution
o The relationship between rank and the number of solutions
Epilogue: proofs using linear algebra
When you do econometrics, you will see a lot of proofs using linear algebra. It is important to get
used to that style of argument.
There isn’t really a consistent logic to linear algebra proofs. For non-mathematical geniuses like
most of you guys and me, the only way to get the hang of them is to see and do lots of them.
I will go through some here.
• (A^-1)′ = (A′)^-1
• (AB)^-1 = B^-1 A^-1
• (AB)^k = A^k B^k if AB = BA
• If A is symmetric then A^-1 is symmetric
• If A has an inverse then it is unique