
Part 8: IV and GMM Estimation [ 1/48]

Econometric Analysis of Panel Data

William Greene

Department of Economics

Stern School of Business

Part 8: IV and GMM Estimation [ 2/48]

Dear Professor Greene,

I have to apply the multiplicative heteroscedasticity models that I studied in your book to the analysis of trade data.

Since I have not found any Matlab implementations, I am starting to write the method from scratch. I was wondering if you are aware of reliable implementations in Matlab or any other language which I could use as a reference.

Part 8: IV and GMM Estimation [ 3/48]

a “multi-level” modelling feature along the following lines? My data has a “two level” hierarchical structure: I'd like to perform an ordered probit analysis such that we allow for random effects pertaining to individuals and the organisations they work for.

Ordered probit with two-level random effects: for individual i in organization o at time t,

density f(y_oit | x_oit, w_oi, h_o) = Prob(y_oit = j | x_oit, w_oi, h_o) = G(y_oit, x_oit, w_oi, h_o)

(Integrate out the disturbance directly - leads to a normal probability.)

Conditional on the random effects w_oi (individual level) and h_o (organization level), the log likelihood is

LogL = Σ_{o=1}^O Σ_{i=1}^{N_o} Σ_{t=1}^{T_oi} log G(y_oit, x_oit, w_oi, h_o)

Part 8: IV and GMM Estimation [ 4/48]

Conditional log likelihood:

LogL = Σ_{o=1}^O Σ_{i=1}^{N_o} Σ_{t=1}^{T_oi} log G(y_oit, x_oit, w_oi, h_o)

Need to integrate out w and h:

LogL = Σ_{o=1}^O log ∫_{h_o} [ Π_{i=1}^{N_o} ∫_{w_oi} Π_{t=1}^{T_oi} G(y_oit, x_oit, w_oi, h_o) f(w_oi) dw_oi ] f(h_o) dh_o

Part 8: IV and GMM Estimation [ 5/48]

LogL = Σ_{o=1}^O log ∫_{h_o} [ Π_{i=1}^{N_o} ∫_{w_oi} Π_{t=1}^{T_oi} G(y_oit, x_oit, w_oi, h_o) f(w_oi) dw_oi ] f(h_o) dh_o

How to do the integration? Monte Carlo simulation:

LogL_S = Σ_{o=1}^O log (1/R) Σ_{r=1}^R [ Π_{i=1}^{N_o} (1/M) Σ_{m=1}^M Π_{t=1}^{T_oi} G(y_oit, x_oit, w_oim, h_or) ]

Part 8: IV and GMM Estimation [ 6/48]

LogL_S = Σ_{o=1}^O log (1/R) Σ_{r=1}^R [ Π_{i=1}^{N_o} (1/M) Σ_{m=1}^M Π_{t=1}^{T_oi} G(y_oit, x_oit, w_oim, h_or) ]

(Combine the two simulations in one loop over two variables simulated at the same time. h stays still while w varies: (h_1, w_1), (h_1, w_2), (h_1, w_3), etc.)

LogL_S = Σ_{o=1}^O log (1/(RM)) Σ_{rm=1}^{RM} Π_{i=1}^{N_o} Π_{t=1}^{T_oi} G(y_oit, x_oit, w_oi,rm, h_o,rm)
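The simulation step above replaces each integral over a random effect with an average over random draws. A minimal numpy sketch of the idea, with all values illustrative rather than from the slides: approximate E_w[Φ(w)] = ∫ Φ(w) φ(w) dw, whose exact value is 1/2, by averaging Φ over standard normal draws.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    # Standard normal CDF via the error function (no SciPy needed)
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Integral to approximate: E_w[Phi(w)] = integral Phi(w) phi(w) dw = 1/2 exactly.
rng = np.random.default_rng(0)
M = 100_000                     # number of simulation draws
draws = rng.standard_normal(M)
sim = np.mean([norm_cdf(w) for w in draws])
print(sim)                      # simulation error shrinks at rate 1/sqrt(M)
```

The same mechanics, with Φ replaced by the product of ordered probit probabilities and two sets of draws handled in one loop, give the simulated log likelihood above.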

Part 8: IV and GMM Estimation [ 7/48]

-----------------------------------------------------------------------------
Random Coefficients  OrdProbs Model
Dependent variable                 HSAT
Log likelihood function     -1856.64320
Estimation based on N =    947, K =  14
Inf.Cr.AIC  =   3741.3 AIC/N =    3.951
Unbalanced panel has    250 individuals
Ordered probit (normal) model
LHS variable = values 0,1,...,10
Simulation based on    200 Halton draws
--------+--------------------------------------------------------------------
        |                  Standard            Prob.      95% Confidence
    HSAT|  Coefficient       Error       z    |z|>Z*         Interval
--------+--------------------------------------------------------------------
        |Nonrandom parameters
Constant|    3.94945***      .24610    16.05  .0000     3.46711   4.43179
     AGE|    -.04201***      .00330   -12.72  .0000     -.04848   -.03553
    EDUC|     .05835***      .01346     4.33  .0000      .03196    .08473
        |Scale parameters for dists. of random parameters
Constant|    1.06631***      .03868    27.57  .0000      .99050   1.14213
        |Standard Deviations of Random Effects
R.E.(01)|     .05759*        .03372     1.71  .0877     -.00851    .12369
        |Threshold parameters for probabilities
  Mu(01)|     .13522**       .05335     2.53  .0113      .03065    .23979
  ...
  Mu(09)|    4.66195***      .11893    39.20  .0000     4.42884   4.89506
--------+--------------------------------------------------------------------

Part 8: IV and GMM Estimation [ 8/48]

Agenda

Single equation instrumental variable estimation
  Exogeneity
  Instrumental Variable (IV) Estimation
  Two Stage Least Squares (2SLS)
  Generalized Method of Moments (GMM)

Panel data
  Fixed effects
  Hausman and Taylor's formulation
  Application
  Arellano/Bond/Bover framework

Part 8: IV and GMM Estimation [ 9/48]

Structure and Regression

Earnings (structural) equation:
y_it = x_it'β + γ E_it + ε_it,  i,t = sibling t in family i
E_it = 'true' education, measurable only with error
S_it = measured 'schooling' = E_it + w_it,  w = measurement error

Reduced form:
y_it = x_it'β + γ(S_it - w_it) + ε_it
     = x_it'β + γ S_it + (ε_it - γ w_it)

Estimation problem for least squares (OLS or GLS):
Cov[S_it, (ε_it - γ w_it)] = -γ σ_w² ≠ 0
Consistency relies on this covariance equaling 0.
How to estimate β and γ (consistently)?
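A minimal numpy sketch of the problem above: simulate the measurement-error model (dropping x_it'β for simplicity; the variances and γ are illustrative, not from the slides) and check that least squares of y on S is attenuated and that the offending covariance equals -γσ_w².

```python
import numpy as np

rng = np.random.default_rng(1)
n, gamma, sw2 = 200_000, 1.0, 0.5          # illustrative values, not from the slides

E = rng.standard_normal(n)                 # 'true' education, Var(E) = 1
w = np.sqrt(sw2) * rng.standard_normal(n)  # measurement error, Var(w) = 0.5
S = E + w                                  # measured schooling
y = gamma * E + rng.standard_normal(n)     # structural equation (x'beta omitted)

# Least squares of y on S is attenuated: plim = gamma*Var(E)/(Var(E)+Var(w))
g_ols = np.cov(y, S)[0, 1] / np.var(S)

# The offending covariance: Cov[S, (eps - gamma*w)] = -gamma*sigma_w^2
c = np.cov(S, y - gamma * S)[0, 1]
print(g_ols, c)   # g_ols near 2/3, c near -0.5
```

With Var(E) = 1 and σ_w² = 0.5, the plim of the slope is 1/1.5, not 1: least squares is inconsistent exactly because the covariance c is not zero.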

Part 8: IV and GMM Estimation [ 10/48]

Exogeneity

Structure:   y = Xβ + ε,  E[ε|X] = g(X) ≠ 0
Regression:  y = Xβ + g(X) + ε - g(X) = E[y|X] + u,  E[u|X] = 0
Projection ("regression of ε on X"):  ε = Xθ + w, so y = X(β + θ) + w

The problem: X is not exogenous.

Exogeneity:        E[ε_it | x_it] = 0  (current period)
Strict exogeneity: E[ε_it | x_i1, x_i2, ..., x_iT] = 0  (all periods)
(We assume no correlation across individuals.)

Part 8: IV and GMM Estimation [ 11/48]

An Experimental Treatment Effect

Health Outcome = f(unobserved individual characteristics a,
                   observed individual characteristics x,
                   treatment (intervention) T,
                   randomness ε)

Cardiovascular Disease (CVD) = α + β'x + δ T(Hormone Replacement Therapy) + a + ε

Problem: HRT is associated with greater risk of cardiovascular disease.

Experimental evidence suggests δ > 0. Observational evidence suggests δ < 0.

Why? T = T(a). Already healthy women with higher education and higher income initiated the treatment to prevent heart disease. HRT users had lower CVD in spite of the bad effects of the treatment T. T is endogenous in this model. (Apparently.)

Part 8: IV and GMM Estimation [ 12/48]

Instrumental Variables

An instrumental variable z is associated with changes in x, not with ε:

dy/dx = β dx/dx + dε/dx = β + dε/dx. The second term is not 0.
dy/dz = β dx/dz + dε/dz. The second term is 0.
β = cov(y,z)/cov(x,z). This is the "IV estimator."

Example: Corporate earnings in year t: Earnings(t) = β R&D(t) + ε(t). R&D(t) responds directly to Earnings(t), and thus to ε(t). A likely valid instrumental variable would be R&D(t-1), which probably does not respond to current-year shocks to earnings.
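The ratio cov(y,z)/cov(x,z) can be illustrated with a small numpy simulation. The data-generating values below are my own illustrations, not from the slides: a common shock u makes x endogenous, while z moves x but not ε.

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta = 100_000, 0.5            # illustrative values

z = rng.standard_normal(n)        # instrument: moves x, unrelated to eps
u = rng.standard_normal(n)        # common shock creating the endogeneity
x = z + u + rng.standard_normal(n)
eps = u + rng.standard_normal(n)  # correlated with x through u
y = beta * x + eps

b_ols = np.cov(y, x)[0, 1] / np.var(x)          # plim = beta + 1/3 here
b_iv = np.cov(y, z)[0, 1] / np.cov(x, z)[0, 1]  # cov(y,z)/cov(x,z), consistent
print(b_ols, b_iv)
```

OLS converges to β + Cov(x,ε)/Var(x) = 0.5 + 1/3, while the IV ratio recovers β.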

Part 8: IV and GMM Estimation [ 13/48]

Least Squares

y = Xβ + ε
E[y|X] = Xβ requires E[ε|X] = 0.

b = (X'X)^{-1} X'y = (X'X)^{-1} X'(Xβ + ε) = β + (X'X)^{-1} X'ε
  = β + (X'X/N)^{-1} (X'ε/N)

plim b = β + plim(X'X/N)^{-1} plim(X'ε/N) = β + Q^{-1} γ ≠ β

Part 8: IV and GMM Estimation [ 14/48]

The IV Estimator

The variables:  X = [x_1, x_2, ..., x_{K-1}, x_K],  Z = [x_1, x_2, ..., x_{K-1}, z]

The model assumption:  E[ε_it | z_it] = 0

E[z_it ε_it] = E[z_it (y_it - x_it'β)] = 0

(Using "n" to denote Σ_i T_i)

E[(1/n) Z'ε] = E[(1/n) Z'(y - Xβ)] = 0

Mimic this condition (if possible):

Find β̂ so that (1/n) Z'y = (1/n) Z'X β̂

The estimator: solve Z'y = Z'X β̂

Part 8: IV and GMM Estimation [ 15/48]

A Moment Based Estimator

Find β̂ so that (1/n) Z'y = (1/n) Z'X β̂

The estimator: β̂ = (Z'X)^{-1} Z'y, the instrumental variable estimator.

(Not equivalent to replacing x_K with z.)
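A numpy sketch of the matrix form β̂ = (Z'X)^{-1} Z'y, with K = 2 (constant plus one endogenous regressor) and Z replacing only the endogenous column x_K by the instrument z. All data-generating values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
beta = np.array([1.0, 0.5])               # illustrative true coefficients

z = rng.standard_normal(n)
u = rng.standard_normal(n)
xk = z + u + rng.standard_normal(n)       # the one endogenous column x_K
eps = u + rng.standard_normal(n)

X = np.column_stack([np.ones(n), xk])     # X = [x_1, ..., x_{K-1}, x_K]
Z = np.column_stack([np.ones(n), z])      # Z = [x_1, ..., x_{K-1}, z]
y = X @ beta + eps

b_iv = np.linalg.solve(Z.T @ X, Z.T @ y)  # beta_hat = (Z'X)^{-1} Z'y
print(b_iv)
```

Note that x_K itself remains in X; only the moment conditions use z, which is the sense in which this is "not equivalent to replacing x_K with z."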

Part 8: IV and GMM Estimation [ 16/48]

Cornwell and Rupert Data

Cornwell and Rupert Returns to Schooling Data, 595 Individuals, 7 YearsVariables in the file are

EXP = work experience, EXPSQ = EXP2

WKS = weeks workedOCC = occupation, 1 if blue collar, IND = 1 if manufacturing industrySOUTH = 1 if resides in southSMSA = 1 if resides in a city (SMSA)MS = 1 if marriedFEM = 1 if femaleUNION = 1 if wage set by unioin contractED = years of educationLWAGE = log of wage = dependent variable in regressions

These data were analyzed in Cornwell, C. and Rupert, P., "Efficient Estimation with Panel Data: An Empirical Comparison of Instrumental Variable Estimators," Journal of Applied Econometrics, 3, 1988, pp. 149-155.  See Baltagi, page 122 for further analysis.  The data were downloaded from the website for Baltagi's text.

Part 8: IV and GMM Estimation [ 17/48]

Wage Equation with Endogenous Weeks

logWage=β1+ β2 Exp + β3 ExpSq + β4OCC + β5 South + β6 SMSA + β7 WKS + ε

Weeks worked is believed to be endogenous in this equation.

We use the Marital Status dummy variable MS as an exogenous variable.

Wooldridge Condition (5.3) Cov[MS, ε] = 0 is assumed.

Auxiliary regression: For MS to be a 'valid' instrumental variable, in the regression of WKS on [1, EXP, EXPSQ, OCC, SOUTH, SMSA, MS], MS significantly "explains" WKS.

A projection interpretation: in the projection
x_itK = θ_1 x_1it + θ_2 x_2it + ... + θ_{K-1} x_{K-1,it} + θ_K z_it,  θ_K ≠ 0.

(One normally doesn't "check" the variables in this fashion.)

Part 8: IV and GMM Estimation [ 18/48]

Auxiliary Projection

+----------------------------------------------------+
| Ordinary least squares regression                  |
| LHS=WKS      Mean                 =   46.81152     |
+----------------------------------------------------+
+---------+--------------+----------------+--------+---------+----------+
|Variable | Coefficient  | Standard Error |b/St.Er.|P[|Z|>z] | Mean of X|
+---------+--------------+----------------+--------+---------+----------+
 Constant   45.4842872       .36908158     123.236   .0000
 EXP          .05354484      .03139904       1.705   .0881    19.8537815
 EXPSQ       -.00169664      .00069138      -2.454   .0141    514.405042
 OCC          .01294854      .16266435        .080   .9366     .51116447
 SOUTH        .38537223      .17645815       2.184   .0290     .29027611
 SMSA         .36777247      .17284574       2.128   .0334     .65378151
 MS           .95530115      .20846241       4.583   .0000     .81440576

Part 8: IV and GMM Estimation [ 19/48]

Application: IV for WKS in Rupert
+----------------------------------------------------+
| Ordinary least squares regression                  |
| Residuals    Sum of squares       =   678.5643     |
| Fit          R-squared            =   .2349075     |
|              Adjusted R-squared   =   .2338035     |
+----------------------------------------------------+
+---------+--------------+----------------+--------+---------+
|Variable | Coefficient  | Standard Error |b/St.Er.|P[|Z|>z] |
+---------+--------------+----------------+--------+---------+
 Constant    6.07199231      .06252087      97.119   .0000
 EXP          .04177020      .00247262      16.893   .0000
 EXPSQ       -.00073626      .546183D-04   -13.480   .0000
 OCC         -.27443035      .01285266     -21.352   .0000
 SOUTH       -.14260124      .01394215     -10.228   .0000
 SMSA         .13383636      .01358872       9.849   .0000
 WKS          .00529710      .00122315       4.331   .0000

Part 8: IV and GMM Estimation [ 20/48]

Application: IV for WKS in Rupert
+----------------------------------------------------+
| LHS=LWAGE    Mean                 =   6.676346     |
|              Standard deviation   =   .4615122     |
| Residuals    Sum of squares       =   13853.55     |
|              Standard error of e  =   1.825317     |
| Fit          R-squared            =  -14.64641     |
|              Adjusted R-squared   =  -14.66899     |
| Not using OLS or no constant. Rsqd & F may be < 0. |
+----------------------------------------------------+
+---------+--------------+----------------+--------+---------+
|Variable | Coefficient  | Standard Error |b/St.Er.|P[|Z|>z] |
+---------+--------------+----------------+--------+---------+
 Constant  -9.97734299     3.59921463      -2.772   .0056
 EXP          .01833440      .01233989       1.486   .1373
 EXPSQ       -.799491D-04    .00028711       -.278   .7807
 OCC         -.28885529      .05816301      -4.966   .0000
 SOUTH       -.26279891      .06848831      -3.837   .0001
 SMSA         .03616514      .06516665        .555   .5789
 WKS          .35314170      .07796292       4.530   .0000
 OLS --------------------------------------------------------
 WKS          .00529710      .00122315       4.331   .0000

Part 8: IV and GMM Estimation [ 21/48]

Generalizing the IV Estimator - 1

Define a partitioned regression for n observations:

y = X_1 β_1 + X_2 β_2 + ε,  K_1 + K_2 variables,
such that plim(X_1'ε/n) = 0 and plim(X_2'ε/n) ≠ 0.

There exists a set of M ≥ K_2 variables W such that

plim(1/n) W'X_1 = Q_{W1} ≠ 0
plim(1/n) W'X_2 = Q_{W2} ≠ 0
plim(1/n) W'ε  = 0,   W is exogenous
plim(1/n) X_1'ε = 0,  X_1 is exogenous
plim(1/n) X_2'ε ≠ 0,  X_2 is not exogenous

Part 8: IV and GMM Estimation [ 22/48]

Generalizing the IV Estimator - 2

Define the set of instrumental variables:

Z_1 = X_1
Z_2 = K_2 linear combinations of the M W's:  Z_2 = WP,
      P = an M×K_2 matrix. Z_2 is N×K_2.

Z = [Z_1, Z_2] = [X_1, Z_2] = [X_1, WP]

Why must M be ≥ K_2? So Z can have full column rank.

Part 8: IV and GMM Estimation [ 23/48]

Generalizing the IV Estimator

By the definitions, Z is a set of instrumental variables.

β̂ = [Z'X]^{-1} Z'y

β̂ is consistent and asymptotically normally distributed.

σ̂² = (y - Xβ̂)'(y - Xβ̂) / N  (or N - K)

Assuming homoscedasticity and no autocorrelation,

Est.Asy.Var[β̂] = σ̂² [Z'X]^{-1} [Z'Z] [X'Z]^{-1}

Part 8: IV and GMM Estimation [ 24/48]

The Best Set of Instruments

Z = [Z_1, Z_2] = [X_1, WP],  Z_2 = WP, P an M×K_2 matrix
(K_2 linear combinations of the M variables in W).

What is the best P to use (the best way to combine the exogenous instruments)?

(a) If M = K_2, it makes no difference.
(b) If M < K_2, there are too few instruments to continue.
(c) If M > K_2, there is one best combination: 2SLS.

Part 8: IV and GMM Estimation [ 25/48]

Two Stage Least Squares

A class of IV estimators is defined by Z = [Z_1, Z_2] = [X_1, WP].

2SLS is defined by:

(1) Regress X (X_1 and X_2) on all of (X_1 and W), column by column, and compute the predicted values X̂ = (X̂_1, X̂_2). X_1 is reproduced perfectly by regressing it on itself, so X̂ = [X_1, X̂_2]. For 2SLS,

X̂ = Z(Z'Z)^{-1} Z'X

(2) Regress y on X̂ to estimate β.

(Does it work as an IV estimator? X̂ is a linear combination of X_1 and W, so yes.)

Part 8: IV and GMM Estimation [ 26/48]

2SLS Estimator

By the definitions, X̂ is a set of instrumental variables.

β̂ = [X̂'X]^{-1} X̂'y

β̂ is consistent and asymptotically normally distributed.

σ̂² = (y - Xβ̂)'(y - Xβ̂) / N  (or N - K)

Assuming homoscedasticity and no autocorrelation,

Est.Asy.Var[β̂] = σ̂² [X̂'X]^{-1} [X̂'X̂] [X'X̂]^{-1}

Part 8: IV and GMM Estimation [ 27/48]

2SLS Algebra

X̂'X = [Z(Z'Z)^{-1} Z'X]' X = X'Z(Z'Z)^{-1} Z'X

X̂'X̂ = X'Z(Z'Z)^{-1} Z' Z(Z'Z)^{-1} Z'X
     = X'Z(Z'Z)^{-1} Z'X
     = X̂'X

Therefore

Est.Asy.Var[β̂] = σ̂² [X̂'X]^{-1} [X̂'X̂] [X'X̂]^{-1} = σ̂² [X̂'X̂]^{-1},

σ̂² = (y - Xβ̂)'(y - Xβ̂) / n  (or n - K)
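The identity X̂'X = X̂'X̂ is pure projection algebra and can be checked numerically. A small numpy sketch with arbitrary data (sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n, K, M = 500, 3, 5                            # illustrative sizes, M > K
X = rng.standard_normal((n, K))
Z = rng.standard_normal((n, M))

Xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)   # Xhat = Z (Z'Z)^{-1} Z'X

# The projection matrix is idempotent, so Xhat'X = X'Xhat = Xhat'Xhat.
same = np.allclose(Xhat.T @ X, Xhat.T @ Xhat)
print(same)  # True
```

The simplification to σ̂²[X̂'X̂]^{-1} follows directly from this equality.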

Part 8: IV and GMM Estimation [ 28/48]

A General Result for IV

We defined a class of IV estimators by the set of variables

Z = [Z_1, Z_2] = [X_1, WP],  Z_2 = WP, P an M×K_2 matrix
(K_2 linear combinations of the M variables in W).

The minimum variance (most efficient) member of this class is 2SLS (Brundy and Jorgenson (1971); rediscovered in JW, 2000, pp. 96-97).

Part 8: IV and GMM Estimation [ 29/48]

GMM Estimation – Orthogonality Conditions

General model formulation:

y = Xβ + ε;  plim[(1/n) X'ε] ≠ 0 (possibly);  K regressors in X.
M ≥ K instrumental variables Z;  plim[(1/n) Z'ε] = 0.

The IV formulation implies M orthogonality conditions:

E[z_m (y - x'β)] = 0,  m = 1,...,M.

2SLS uses only K of these, in the form

E[x̂_k (y - x'β)] = 0,  where x̂_k = Σ_{l=1}^M p_lk z_l.

The solution is β̂ = (X̂'X̂)^{-1} X̂'y.

Consider an estimator that uses all M equations when M > K. The orthogonality condition to mimic is

E[(1/n) Σ_{i=1}^n z_im (y_i - x_i'β)] = 0,  m = 1,...,M.

This is M equations in K unknowns, each of the form E[g_m(β)] = 0.

Part 8: IV and GMM Estimation [ 30/48]

GMM Estimation - 1

E[(1/n) Σ_{i=1}^n z_im (y_i - x_i'β)] = 0,  m = 1,...,M
E[g_m(β)] = 0,  m = 1,...,M.

Sample counterparts - finding the estimator:

(1/n) Σ_{i=1}^n z_im (y_i - x_i'β̂) = 0

(a) If M = K, the exact solution is 2SLS.
(b) If M < K, there are too few equations. No solution.
(c) If M > K, there are excess equations. How to reconcile them?

First pass: "least squares." Try minimizing with respect to β:

Σ_{m=1}^M [(1/n) Σ_{i=1}^n z_im (y_i - x_i'β̂)]² = g(β)'g(β)

Part 8: IV and GMM Estimation [ 31/48]

GMM Estimation - 2

β̂ = the minimizer of Σ_{m=1}^M [(1/n) Σ_{i=1}^n z_im (y_i - x_i'β)]² = g(β)'g(β)

g(β)'g(β) = g(β)' I g(β) defines a "minimum distance estimator" with weight matrix A = I.

More generally, let

β̂ = the minimizer of g(β)' A g(β).

Results: For any positive definite matrix A, β̂ is consistent and asymptotically normally distributed, with

Asy.Var[β̂] = [Ḡ'AḠ]^{-1} Ḡ'A {Asy.Var[g(β)]} AḠ [Ḡ'AḠ]^{-1},  Ḡ = ∂g(β)/∂β'.

(See JW, Ch. 14, for analysis of asymptotic properties.)

Part 8: IV and GMM Estimation [ 32/48]

IV Estimation

β̂ = the minimizer of g(β)' A g(β)

Results: For any positive definite matrix A, β̂ is consistent, with

Asy.Var[β̂] = [Ḡ'AḠ]^{-1} Ḡ'A {Asy.Var[g(β)]} AḠ [Ḡ'AḠ]^{-1},  Ḡ = ∂g(β)/∂β'.

For IV estimation,

g(β) = (1/n) Z'(y - Xβ)
∂g(β)/∂β' = -(1/n) Z'X
Asy.Var[g(β)] = (1/n)² σ² Σ_{i=1}^n z_i z_i',

assuming homoscedasticity and no autocorrelation.

Part 8: IV and GMM Estimation [ 33/48]

An Optimal Weighting Matrix

β̂ = the minimizer of g(β)' A g(β)

For any positive definite matrix A, β̂ is consistent and asymptotically normally distributed, with Asy.Var[β̂] as given on the preceding slide.

Is there a 'best' matrix A? The most efficient estimator in the GMM class has

A = {Asy.Var[g(β)]}^{-1}.

β̂_GMM = the minimizer of g(β)' {Asy.Var[g(β)]}^{-1} g(β)

Part 8: IV and GMM Estimation [ 34/48]

The GMM Estimator

β̂_GMM = the minimizer of q = g(β)' {Asy.Var[g(β)]}^{-1} g(β)

Asy.Var[β̂_GMM] = {Ḡ' {Asy.Var[g(β)]}^{-1} Ḡ}^{-1},  Ḡ = ∂g(β)/∂β'.

For IV estimation, g(β) = (1/n) Z'(y - Xβ),  ∂g(β)/∂β' = -(1/n) Z'X,

Asy.Var[g(β)] = (1/n)² σ² Σ_{i=1}^n z_i z_i' = (σ²/n²) Z'Z

Asy.Var[β̂_GMM] = { ((1/n) X'Z) [(σ²/n²) Z'Z]^{-1} ((1/n) Z'X) }^{-1}
                = σ² [X'Z (Z'Z)^{-1} Z'X]^{-1}, which is 2SLS !!!!!

IMPLICATION: 2SLS is not just efficient among IV estimators that use a linear combination of the columns of Z. It is efficient among all estimators that use the columns of Z.
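Under homoscedasticity the optimal weight is proportional to (Z'Z)^{-1}, and the resulting GMM estimator reproduces 2SLS exactly, as a quick numpy check confirms (all data-generating values illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2_000                                      # illustrative sizes: M = 4 > K = 2
Z = np.column_stack([np.ones(n), rng.standard_normal((n, 3))])
X = np.column_stack([np.ones(n), Z[:, 1] + rng.standard_normal(n)])
y = X @ np.array([1.0, 0.5]) + rng.standard_normal(n)

# GMM with weighting matrix (Z'Z)^{-1}, the homoscedastic optimal weight:
A = np.linalg.inv(Z.T @ Z)
b_gmm = np.linalg.solve(X.T @ Z @ A @ Z.T @ X, X.T @ Z @ A @ Z.T @ y)

# 2SLS: regress y on Xhat = Z (Z'Z)^{-1} Z'X
Xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)
b_2sls = np.linalg.solve(Xhat.T @ Xhat, Xhat.T @ y)

print(np.allclose(b_gmm, b_2sls))  # True: the two estimators coincide
```

The equality is exact in finite samples, not just asymptotic: both solve [X'Z(Z'Z)^{-1}Z'X] b = X'Z(Z'Z)^{-1}Z'y.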

Part 8: IV and GMM Estimation [ 35/48]

GMM Estimation

g(β) = (1/n) Σ_{i=1}^n z_i (y_i - x_i'β) = (1/n) Σ_{i=1}^n z_i ε_i

Assuming homoscedasticity and no autocorrelation, 2SLS is the efficient GMM estimator. What if there is heteroscedasticity?

Asy.Var[g(β)] = (1/n²) Σ_{i=1}^n σ_i² z_i z_i', estimated with (1/n²) Σ_{i=1}^n e_i² z_i z_i'

based on the 2SLS residuals e. The GMM estimator minimizes

q = [(1/n) Σ_i z_i (y_i - x_i'β)]' [(1/n²) Σ_i e_i² z_i z_i']^{-1} [(1/n) Σ_i z_i (y_i - x_i'β)].

This is not 2SLS because the weighting matrix is not (Z'Z)^{-1}.
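A minimal two-step sketch of this estimator in numpy: 2SLS for the first-step residuals, then GMM with the heteroscedasticity-robust weight Σ_i e_i² z_i z_i'. The data-generating process, including the form of heteroscedasticity, is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20_000
Z = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])   # M = 3 instruments
u = rng.standard_normal(n)
X = np.column_stack([np.ones(n), Z[:, 1] + u + rng.standard_normal(n)])  # K = 2
eps = (0.5 + np.abs(Z[:, 1])) * (u + rng.standard_normal(n))     # heteroscedastic
y = X @ np.array([1.0, 0.5]) + eps                               # illustrative betas

# Step 1: 2SLS to get residuals e
Xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)
b1 = np.linalg.solve(Xhat.T @ Xhat, Xhat.T @ y)
e = y - X @ b1

# Step 2: weighting matrix sum_i e_i^2 z_i z_i', then the GMM estimator
W = (Z * e[:, None] ** 2).T @ Z
Wi = np.linalg.inv(W)
b_gmm = np.linalg.solve(X.T @ Z @ Wi @ Z.T @ X, X.T @ Z @ Wi @ Z.T @ y)
```

Because W is no longer proportional to Z'Z, b_gmm generally differs from the 2SLS b1, though both are consistent here.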

Part 8: IV and GMM Estimation [ 36/48]

Application - GMM

NAMELIST ; x = one,exp,expsq,occ,south,smsa,wks $
NAMELIST ; z = one,exp,expsq,occ,south,smsa,ms,union,ed $
2SLS     ; lhs = lwage ; RHS = X ; INST = Z $
NLSQ     ; fcn = lwage - b1'x ; labels = b1,b2,b3,b4,b5,b6,b7
         ; start = b ; inst = Z ; pds = 0 $

Part 8: IV and GMM Estimation [ 37/48]

Application - 2SLS

Part 8: IV and GMM Estimation [ 38/48]

GMM Estimates

Part 8: IV and GMM Estimation [ 39/48]

2SLS

GMM with Heteroscedasticity

Part 8: IV and GMM Estimation [ 40/48]

Testing the Overidentifying Restrictions

q = g(β̂)' {Asy.Var[g(β)]}^{-1} g(β̂)

Under the hypothesis that E[g(β)] = 0,

q →d χ²[M - K]

M = number of moment equations
K = number of parameters estimated
(In our example, M = 9, K = 7.)

M - K = number of 'extra' moment equations, more than are needed to identify the parameters.

For the example:

| Value of the GMM criterion:                  |
| e(b)'Z inv(Z'WZ) Z'e(b) = 537.3916           |
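A sketch of the criterion-based test on simulated data where the instruments are valid by construction, so the criterion q should behave like a chi-squared draw with M - K = 1 degree of freedom. All data-generating values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
Z = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])   # M = 3
X = np.column_stack([np.ones(n), Z[:, 1] + Z[:, 2] + rng.standard_normal(n)])  # K = 2
y = X @ np.array([1.0, 0.5]) + rng.standard_normal(n)            # valid instruments

# Two-step GMM: 2SLS start, robust weight, then the GMM estimate
Xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)
b1 = np.linalg.solve(Xhat.T @ Xhat, Xhat.T @ y)
e1 = y - X @ b1
Wi = np.linalg.inv((Z * e1[:, None] ** 2).T @ Z)
b = np.linalg.solve(X.T @ Z @ Wi @ Z.T @ X, X.T @ Z @ Wi @ Z.T @ y)

# Criterion at the estimate: q = (Z'e)' [sum_i e_i^2 z_i z_i']^{-1} (Z'e)
e = y - X @ b
m = Z.T @ e
q = float(m @ Wi @ m)
df = Z.shape[1] - X.shape[1]   # M - K overidentifying restrictions
```

A q far out in the right tail of χ²[M - K] would be evidence against the validity of the extra moment conditions.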

Part 8: IV and GMM Estimation [ 41/48]

Inference About the Parameters

β̂_GMM = [X'Z (Σ_{i=1}^N e_i² z_i z_i')^{-1} Z'X]^{-1} [X'Z (Σ_{i=1}^N e_i² z_i z_i')^{-1} Z'y]

Est.Asy.Var[β̂_GMM] = [X'Z (Σ_{i=1}^N e_i² z_i z_i')^{-1} Z'X]^{-1}

Restrictions can be tested using Wald statistics:

H_0: r(β) = h
H_1: not H_0

Wald = (r(β̂_GMM) - h)' {R Est.Asy.Var[β̂_GMM] R'}^{-1} (r(β̂_GMM) - h),  R = ∂r(β)/∂β'.

E.g., for a simple test, H_0: β_k = 0, this is the square of the t-ratio.
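The Wald/t-ratio equivalence for a single restriction is easy to verify. A numpy sketch with hypothetical (made-up) estimates and covariance matrix:

```python
import numpy as np

# Hypothetical estimates: coefficient vector and its estimated asy. covariance
b = np.array([1.20, 0.30])
V = np.array([[0.04, 0.01],
              [0.01, 0.02]])

# H0: beta_2 = 0, written as r(beta) = R beta = h with R = [0 1], h = 0
R = np.array([[0.0, 1.0]])
h = np.array([0.0])
d = R @ b - h
wald = float(d @ np.linalg.inv(R @ V @ R.T) @ d)

t_ratio = b[1] / np.sqrt(V[1, 1])
print(np.isclose(wald, t_ratio ** 2))  # True: Wald equals the squared t-ratio
```

For this single restriction R picks out V[1, 1], so the quadratic form collapses to b_k²/Var[b_k], the squared t-ratio.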

Part 8: IV and GMM Estimation [ 42/48]

Specification Test Based on the Criterion

Consider a null hypothesis H_0 that imposes restrictions on an alternative hypothesis H_1.

Under the null hypothesis that E[g(β)] = 0,

q_0 = g(β̂_0)' {Asy.Var[g(β)]}^{-1} g(β̂_0) →d χ²[M - K_0]

Under the alternative hypothesis, H_1,

q_1 = g(β̂_1)' {Asy.Var[g(β)]}^{-1} g(β̂_1) →d χ²[M - K_1]

Under the null, q_0 - q_1 →d χ²[K_1 - K_0].

Restrictions can be tested using the criterion function statistic, q_0 - q_1. (The weighting matrix must be the same for H_0 and H_1. Use the unrestricted weighting matrix.)

Part 8: IV and GMM Estimation [ 43/48]

Extending the Form of the GMM Estimator to Nonlinear Models

Very little changes if the regression function is nonlinear.

Asy.Var[g(β)] = (1/N²) Σ_{i=1}^N σ_i² z_i z_i', estimated with (1/N²) Σ_{i=1}^N e_i² z_i z_i'

based on the nonlinear 2SLS residuals e. The GMM estimator minimizes

q = [(1/N) Σ_i z_i (y_i - f(x_i,β))]' [(1/N²) Σ_i e_i² z_i z_i']^{-1} [(1/N) Σ_i z_i (y_i - f(x_i,β))].

The problem is essentially the same.

Part 8: IV and GMM Estimation [ 44/48]

A Nonlinear Conditional Mean

f(x_i, β) = exp(x_i'β)

E[z_i (y_i - exp(x_i'β))] = 0

Nonlinear instrumental variables (2SLS) minimizes

[Σ_{i=1}^N z_i (y_i - exp(x_i'β))]' [Z'Z]^{-1} [Σ_{i=1}^N z_i (y_i - exp(x_i'β))]

Nonlinear GMM then minimizes

[(1/N) Σ_{i=1}^N z_i (y_i - exp(x_i'β))]' Ŵ^{-1} [(1/N) Σ_{i=1}^N z_i (y_i - exp(x_i'β))],

Ŵ = (1/N) Σ_{i=1}^N ê_i² z_i z_i', with ê the nonlinear 2SLS residuals.
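A sketch of the nonlinear IV step with the exponential mean, minimizing the quadratic criterion numerically. Everything here is illustrative: the data-generating values are made up, and the optimizer (scipy's Nelder-Mead) is one convenient choice, not the method the slides prescribe.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
n = 20_000
z = rng.standard_normal(n)
u = rng.standard_normal(n)
x = 0.5 * z + 0.5 * u + 0.5 * rng.standard_normal(n)   # endogenous through u
beta = np.array([0.2, 0.5])                            # illustrative true values
Xm = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])
y = np.exp(Xm @ beta) + u + rng.standard_normal(n)

W = Z.T @ Z / n                    # homoscedastic weight, proportional to Z'Z

def q(b):
    # m(b) = (1/N) sum_i z_i (y_i - exp(x_i'b))
    m = Z.T @ (y - np.exp(Xm @ b)) / n
    return float(m @ np.linalg.solve(W, m))

res = minimize(q, x0=np.array([0.1, 0.1]), method='Nelder-Mead',
               options={'xatol': 1e-8, 'fatol': 1e-12, 'maxiter': 2000})
b_nl2sls = res.x
```

A second step would recompute the weight from the residuals y - exp(Xm @ b_nl2sls), exactly as in the linear case.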

Part 8: IV and GMM Estimation [ 45/48]

Nonlinear Regression/GMM

NAMELIST ; x = one,exp,expsq,occ,south,smsa,wks $
NAMELIST ; z = one,exp,expsq,occ,south,smsa,ms,union,ed $
? Get initial values to use for optimal weighting matrix
NLSQ ; lhs = lwage ; fcn = exp(b1'x) ; inst = z
     ; labels = b1,b2,b3,b4,b5,b6,b7 ; start = 7_0 $
? GMM using previous estimates to compute weighting matrix
NLSQ (GMM) ; fcn = lwage - exp(b1'x) ; inst = Z
     ; labels = b1,b2,b3,b4,b5,b6,b7 ; start = b
     ; pds = 0 $   (Means use a White-style estimator)

Part 8: IV and GMM Estimation [ 46/48]

Nonlinear Wage Equation Estimates: NLSQ Initial Values

Part 8: IV and GMM Estimation [ 47/48]

Nonlinear Wage Equation Estimates: 2nd Step GMM

Part 8: IV and GMM Estimation [ 48/48]

IV for Panel Data

Fixed effects:

b_{2sls,lsdv} = [Σ_{i=1}^N X_i'M_D Z_i (Σ_{i=1}^N Z_i'M_D Z_i)^{-1} Σ_{i=1}^N Z_i'M_D X_i]^{-1}
                × [Σ_{i=1}^N X_i'M_D Z_i (Σ_{i=1}^N Z_i'M_D Z_i)^{-1} Σ_{i=1}^N Z_i'M_D y_i]

Random effects:

b_{2sls,RE} = [Σ_{i=1}^N X_i'Ω̂_i^{-1} Z_i (Σ_{i=1}^N Z_i'Ω̂_i^{-1} Z_i)^{-1} Σ_{i=1}^N Z_i'Ω̂_i^{-1} X_i]^{-1}
              × [Σ_{i=1}^N X_i'Ω̂_i^{-1} Z_i (Σ_{i=1}^N Z_i'Ω̂_i^{-1} Z_i)^{-1} Σ_{i=1}^N Z_i'Ω̂_i^{-1} y_i]
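A minimal numpy sketch of the fixed-effects case. In the just-identified situation (one instrument for one endogenous regressor), the formula collapses to a ratio after applying the within (M_D) transformation. The panel dimensions and data-generating values are illustrative, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(10)
N, T = 5_000, 5                   # illustrative balanced panel
n = N * T
ids = np.repeat(np.arange(N), T)

alpha = rng.standard_normal(N)[ids]            # fixed effects
z = rng.standard_normal(n)                     # time-varying instrument
u = rng.standard_normal(n)
x = z + u + alpha + rng.standard_normal(n)     # endogenous, also loads on alpha
eps = u + rng.standard_normal(n)
y = 0.5 * x + alpha + eps                      # true slope 0.5 (illustrative)

def within(v):
    # Apply M_D: deviations from group means (balanced panel)
    return v - (np.bincount(ids, weights=v) / T)[ids]

yd, xd, zd = within(y), within(x), within(z)

b_within_ols = (xd @ yd) / (xd @ xd)   # within OLS: still inconsistent (u in eps)
b_fe_2sls = (zd @ yd) / (zd @ xd)      # just-identified FE 2SLS: consistent
```

The within transformation removes alpha but not the endogeneity through u, so instrumenting the transformed x with the transformed z is still needed.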