14 Vector Autoregressions, Unit Roots, and Cointegration

Post on 22-Dec-2015

Page 1: 14 Vector Autoregressions, Unit Roots, and Cointegration


Page 2: 14 Vector Autoregressions, Unit Roots, and Cointegration

What is in this Chapter?

• This chapter discusses work on time-series analysis starting in the 1980s.
– First there is a discussion of vector autoregression models.
– Next we talk of the different unit root tests.
– Finally, we discuss cointegration, which is a method of analyzing long-run relationships between nonstationary variables. We discuss tests for cointegration and estimation of cointegrating relationships.

Page 3: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.2 Vector Autoregressions

• In previous sections we discussed the analysis of a single time series

• When we have several time series, we need to take into account the interdependence between them

• One way of doing this is to estimate a simultaneous equations model as discussed in Chapter 9 but with lags in all the variables

• Such a model is called a dynamic simultaneous equations model

Page 4: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.2 Vector Autoregressions

• However, this formulation involves two steps:
– first, we have to classify the variables into two categories, endogenous and exogenous,
– second, we have to impose some constraints on the parameters to achieve identification.

• Sims argues that both these steps involve many arbitrary decisions and suggests, as an alternative, the vector autoregression (VAR) approach.

• This is just a multiple time-series generalization of the AR model.

• The VAR model is easy to estimate because we can use the OLS method
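Because every equation of a VAR has the same regressors (the lags of all the variables), equation-by-equation OLS is all that is needed. A minimal sketch on simulated data (numpy only; the two-variable model and all names here are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-variable VAR(1): y_t = A y_{t-1} + e_t
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(size=2)

# OLS equation by equation: regress y_t on y_{t-1}
# (same lstsq call fits both equations, since the regressors are shared)
X = y[:-1]           # regressors: lagged values
Y = y[1:]            # dependent variables
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
A_hat = B.T          # lstsq solves X @ B = Y, so B = A'

print(np.round(A_hat, 2))  # close to A for a sample this size
```

With larger systems the only change is that X stacks more lags side by side; the estimator is still ordinary least squares, one equation at a time.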

Page 5: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.2 Vector Autoregressions

(Pages 5-9: equation slides; the formulas were not captured in the transcript.)

Page 10: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.3 Problems with VAR Models in Practice

• We have considered only a simple model with two variables and only one lag for each.

• In practice, since we are not considering any moving average errors, the autoregressions would probably have to have more lags to be useful for prediction

• Otherwise, univariate ARMA models would do better.

Page 11: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.3 Problems with VAR Models in Practice

• Suppose that we consider, say, six lags for each variable and we have a small system with four variables.

• Then each equation would have 24 parameters to be estimated, and we thus have 96 parameters to estimate overall.

• This overparameterization is one of the major problems with VAR models.
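The count above generalizes: a k-variable VAR with p lags has k·p slope coefficients per equation and k²·p overall (intercepts ignored, as in the slide). A quick check of the arithmetic:

```python
def var_param_count(k, p):
    """Slope-coefficient count for a k-variable VAR(p), ignoring intercepts:
    each of the k equations regresses on p lags of all k variables."""
    per_equation = k * p
    total = k * per_equation
    return per_equation, total

print(var_param_count(k=4, p=6))  # (24, 96), matching the slide
```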

Page 12: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.3 Problems with VAR Models in Practice

• One such model that has been found particularly useful in prediction is the Bayesian vector autoregression (BVAR).

• In BVAR we assign some prior distributions for the coefficients in the vector autoregressions

• In each equation, the coefficient of the own lagged variable has a prior mean 1, all others have prior means 0, with the variance of the prior decreasing as the lag length increases

• For instance, with two variables y1t and y2t and four lags for each, the first equation will be

Page 13: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.3 Problems with VAR Models in Practice

y1t = β1 y1,t-1 + β2 y1,t-2 + β3 y1,t-3 + β4 y1,t-4 + γ1 y2,t-1 + γ2 y2,t-2 + γ3 y2,t-3 + γ4 y2,t-4 + e1t

with prior mean 1 for β1, prior mean 0 for all the other coefficients, and prior variances that decrease as the lag length increases.
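The prior just described (own first lag centered at 1, everything else at 0, tighter priors at longer lags) is the "Minnesota" prior associated with Litterman. A sketch of how its moments might be laid out; the shrinkage rule lam/lag and the hyperparameter name lam are illustrative assumptions, not the text's exact specification:

```python
import numpy as np

def minnesota_prior(k, p, lam=0.2):
    """Prior means and standard deviations for the lag coefficients of a
    k-variable VAR(p). mean[i, j, l] refers to equation i, variable j,
    lag l+1. lam is a hypothetical overall-tightness hyperparameter."""
    mean = np.zeros((k, k, p))
    std = np.zeros((k, k, p))
    for i in range(k):
        for j in range(k):
            for l in range(p):
                std[i, j, l] = lam / (l + 1)   # tighter prior at longer lags
        mean[i, i, 0] = 1.0                    # own first lag centered at a unit root
    return mean, std

mean, std = minnesota_prior(k=2, p=4)
```

Shrinking the prior variance with lag length is what tames the overparameterization: distant lags are pulled hard toward zero unless the data insist otherwise.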

Page 14: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.4 Unit Roots

(Pages 14-15: equation slides; the formulas were not captured in the transcript.)

Page 16: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.5 Unit Root Tests

(Pages 16-19: equation slides; the formulas were not captured in the transcript.)

• The Low Power of Unit Root Tests
– Schwert (1989) first presented Monte Carlo evidence to point out the size distortion problems of the commonly used unit root tests: the ADF and PP tests.
– Whereas Schwert complained about size distortions, DeJong et al. complained about the low power of unit root tests.
– They argued that the unit root tests have low power against plausible trend-stationary alternatives.

Page 21: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.5 Unit Root Tests

– They argue that the PP tests have very low power (generally less than 0.10) against trend-stationary alternatives but the ADF test has power approaching 0.33 and thus is likely to be more useful in practice.

– They conclude that tests with higher power need to be developed
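To see what these tests actually compute, here is a plain Dickey-Fuller regression coded directly (no augmentation lags); in practice one would use a library implementation such as statsmodels' adfuller, which supplies the nonstandard critical values. Everything below is a simulation sketch:

```python
import numpy as np

def df_tstat(y):
    """t-statistic on rho in: Δy_t = c + rho * y_{t-1} + e_t
    (plain Dickey-Fuller regression with a constant, no lagged differences).
    A unit root corresponds to rho = 0."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=500))   # DS series: t-stat stays moderate
noise = rng.normal(size=500)             # stationary series: t-stat very negative
print(df_tstat(walk), df_tstat(noise))
```

The low-power complaint is visible in this framework: for a trend-stationary series with a root near one, the t-statistic tends to land in the region where the unit root null cannot be rejected.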

Page 22: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.5 Unit Root Tests

(Pages 22-24: equation slides; the formulas were not captured in the transcript.)

Page 25: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.5 Unit Root Tests

• Structural Change and Unit Roots
– In all the studies on unit roots, the issue of whether a time series is of the DS or TS type was decided by analyzing the series for the entire time period, during which many major events took place.
– The Nelson-Plosser series, for instance, covered the period 1909-1970, which includes the two world wars and the Depression of the 1930s.

Page 26: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.5 Unit Root Tests

• If there have been any changes in the trend because of these events, the results obtained by assuming a constant parameter structure during the entire period will be suspect

• Many studies done using the traditional multiple regression methods have included dummy variables (see Sections 8.2 and 8.3) to allow for different intercepts (and slopes)

• Rappoport and Richlin (1989) show that a segmented trend model is a feasible alternative to the DS model.

Page 27: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.5 Unit Root Tests

• Perron (1989) argues that standard tests for the unit root hypothesis against the trend-stationary (TS) alternatives cannot reject the unit root hypothesis if the time series has a structural break.

• Of course, one can also construct examples where, for instance,
– y1, y2, ..., ym is a random walk with drift,
– ym+1, ..., ym+n is another random walk with a different drift,
– and the combined series is not of the DS type.
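The spliced-random-walk construction in the last bullet is easy to simulate (the drift values and sample sizes here are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 200, 200

# A random walk with drift +0.5 for m periods, then drift -0.5 for n periods
steps = np.concatenate([0.5 + rng.normal(size=m),
                        -0.5 + rng.normal(size=n)])
y = np.cumsum(steps)

# The mean of the first differences shifts at t = m, so no single drift
# parameter describes the combined series as one DS process.
print(steps[:m].mean(), steps[m:].mean())
```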

Page 28: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.5 Unit Root Tests

• Perron's study was criticized on the grounds that he "peeked at the data" before analysis: after looking at the graph, he decided that there was a break.

• But Kim (1990), using Bayesian methods, finds that even allowing for an unknown breakpoint, the standard tests of the unit root hypothesis were biased in favor of accepting the unit root hypothesis if the series had a structural break at some intermediate date.

Page 29: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.5 Unit Root Tests

• When using long time series, as many of these studies have done, it is important to take account of structural changes. Parameter constancy tests have frequently been used in traditional regression analysis.

Page 30: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.6 Cointegration

Page 31: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.6 Cointegration

• In the Box-Jenkins method, if the time series is nonstationary (as evidenced by the correlogram not damping), we difference the series to achieve stationarity and then use elaborate ARMA models to fit the stationary series.

• When we are considering two time series, yt and xt say, we do the same thing.

• This differencing operation eliminates the trend or long-term movement in the series.

Page 32: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.6 Cointegration

• However, what we may be interested in is explaining the relationship between the trends in yt and xt.

• We can do this by running a regression of yt on xt, but this regression will not make sense if a long-run relationship does not exist.

• By asking whether yt and xt are cointegrated, we are asking whether there is any long-run relationship between the trends in yt and xt.
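This question can be made operational in two steps, in the spirit of the Engle-Granger procedure: estimate the long-run regression, then check whether its residuals are stationary. A simulated sketch (a real application needs the special cointegration critical values, e.g. from statsmodels' coint; the data and β = 2 below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
x = np.cumsum(rng.normal(size=T))     # x_t ~ I(1)
y = 2.0 * x + rng.normal(size=T)      # cointegrated: y - 2x is I(0)

# Step 1: cointegrating regression of y on x
X = np.column_stack([np.ones(T), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
z = y - X @ beta                      # residuals = estimated long-run error

# Step 2: the residuals should mean-revert; here the slope in
# Δz_t = rho * z_{t-1} + const + e_t is strongly negative
rho = np.polyfit(z[:-1], np.diff(z), 1)[0]
print(beta[1], rho)
```

Note how accurately the slope is recovered: when the series really are cointegrated, the cointegrating regression estimate converges unusually fast ("superconsistency").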

Page 33: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.6 Cointegration

• The case with seasonal adjustment is similar.

• Instead of eliminating the seasonal components from y and x and then analyzing the de-seasonalized data, we might also ask whether there is a relationship between the seasonals in y and x.

• This is the idea behind "seasonal cointegration."

• Note that in this case we do not consider first differences or I(1) processes.

• For instance, with monthly data we consider twelfth differences yt - yt-12. Similarly, for xt we consider xt - xt-12.
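Twelfth differencing in code form, on a toy monthly series with a deterministic seasonal (illustrative data only):

```python
import numpy as np

t = np.arange(48)
# Toy monthly series: linear trend plus a sine seasonal with period 12
y = 1.0 * t + 5.0 * np.sin(2 * np.pi * t / 12)

# Twelfth difference y_t - y_{t-12}: a stable seasonal pattern cancels,
# leaving only the 12-period trend increment
d12 = y[12:] - y[:-12]
print(d12[:3])
```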

Page 34: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.7 The Cointegrating Regression

Page 35: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.7 The Cointegrating Regression

(Pages 35-38: equation slides; the formulas were not captured in the transcript.)

Page 39: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.9 Cointegration and Error Correction Models

• If xt and yt are cointegrated, there is a long-run relationship between them.

• Furthermore, the short-run dynamics can be described by the error correction model (ECM).

• This is known as the Granger representation theorem.

• If xt ~ I(1), yt ~ I(1), and zt = yt - βxt is I(0), then xt and yt are said to be cointegrated.

• The Granger representation theorem says that in this case xt and yt may be considered to be generated by ECMs:

Page 40: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.9 Cointegration and Error Correction Models

Δxt = ρ1 zt-1 + lagged (Δxt, Δyt) terms + e1t
Δyt = ρ2 zt-1 + lagged (Δxt, Δyt) terms + e2t

where zt-1 = yt-1 - βxt-1 is the error correction term and at least one of ρ1, ρ2 is nonzero.

(Page 41: equation slide; the formulas were not captured in the transcript.)
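A minimal simulated ECM in this spirit (β is taken as known for simplicity; in practice it comes from the cointegrating regression; all values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500
x = np.cumsum(rng.normal(size=T))   # x_t ~ I(1)
y = x + rng.normal(size=T)          # cointegrated with beta = 1

z = y - x                           # long-run error (beta known here)
dy = np.diff(y)                     # Δy_t
dx = np.diff(x)                     # Δx_t

# ECM for y: Δy_t = a * z_{t-1} + b * Δx_t + e_t
X = np.column_stack([z[:-1], dx])
(a, b), *_ = np.linalg.lstsq(X, dy, rcond=None)
print(a, b)   # a is negative: y adjusts back toward the long-run relation
```

The negative coefficient on zt-1 is the "error correction": when y sits above its long-run value relative to x, the next change in y is pulled downward.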

Page 42: 14 Vector Autoregressions, Unit Roots, and Cointegration

14.10 Tests for Cointegration