While our two previous posts in this series have been heavily theoretical, here we present a step-by-step procedure for implementing Part 1 and Part 2 in practice:
1. Get a feel for the nature of the data.
2. Ensure all variables are integrated of order $d$, i.e. I$(d)$, with $d < 2$.
3. Specify how deterministics enter the ARDL model. Choose DGP $i=1,\ldots,5$ from those outlined in Part 1 and Part 2.
4. Determine the appropriate lag structure of the model selected in Step 3.
5. Estimate the model in Step 4 using Ordinary Least Squares (OLS).
6. Ensure the residuals from Step 5 are serially uncorrelated and homoskedastic.
7. Perform the Bounds Test.
8. Estimate the speed of adjustment, if appropriate.
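As a preview of how these steps map into EViews commands, here is a minimal sketch; the series y and x, the dummy dum, and the object names g and eq01 are placeholders, and the full worked program for this post's example appears at the end of the entry.

'Step 2: unit root (ADF) tests on the first difference of every series in the group g
g.uroot(dif=1, adf)

'Steps 3-5: choose the deterministic case (here a restricted constant), set maximum lags, estimate by OLS
equation eq01.ardl(deplags=4, reglags=4, trend=const) y x @ dum

'Step 6: residual serial correlation and heteroskedasticity tests
eq01.auto
eq01.hettest @regs

'Step 7: the Bounds Test is a view of the estimated equation
'(View/Coefficient Diagnostics/Long Run Form and Bounds Test)
'Step 8: the speed of adjustment is the CointEq(-1) coefficient in the ECM regression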
Working Example
The motivation for this entry is the classical term structure of interest rates (TSIR) literature. In a nutshell, the TSIR postulates that there exists a relationship linking the yields on bonds of different maturities. Formally: $$R(k,t) = \frac{1}{k}\sum_{j=1}^{k}\pmb{\text{E}}_tR(1,t+j-1) + L(k,t)$$ where $\pmb{\text{E}}_t$ is the expectation operator conditional on the information at time $t$, $R(k,t)$ is the yield to maturity at time $t$ of a $k$ period pure discount bond, and $L(k,t)$ is the premium, typically accounting for risk. To see that cointegration is indeed possible, repeated applications of the trick $R(k,t) = R(k,t-1) + \Delta R(k,t)$, where $\Delta R(k,t) = R(k,t) - R(k,t-1)$, lead to the following expression: $$R(k,t) - R(1,t) = \frac{1}{k}\sum_{i=1}^{k-1}\sum_{j=1}^{i}\pmb{\text{E}}_t \Delta R(1,t+j) + L(k,t)$$ It is now evident that if the $R(k,t)$ are I$(1)$ processes, the $\Delta R(1,t+j)$ must be I$(0)$ processes, and the linear combination $R(k,t) - R(1,t)$ is therefore I$(0)$, provided $L(k,t)$ is as well. In other words, the $k$ period yield to maturity is always cointegrated with the one period yield to maturity, with cointegrating vector $(1,-1)^\top$. In fact, a little more work shows that the principle holds for the spread between any two arbitrary maturities $k_1$ and $k_2$. That is, \begin{align*} R(k_2,t) - R(k_1,t) &= R(k_2,t) - R(1,t) + R(1,t) - R(k_1,t)\\ &= \frac{1}{k_2}\sum_{i=1}^{k_2-1}\sum_{j=1}^{i}\pmb{\text{E}}_t \Delta R(1,t+j) + L(k_2,t) - \frac{1}{k_1}\sum_{i=1}^{k_1-1}\sum_{j=1}^{i}\pmb{\text{E}}_t \Delta R(1,t+j) - L(k_1,t)\\ &\sim \text{I}(0) \end{align*} Now that we have established a theoretical basis for the exercise, we delve into practice with real data. We will work with Canadian maturities collected directly from the Canadian Socioeconomic Database from Statistics Canada, or CANSIM for short. In particular, we will be looking at cointegrating relationships between two types of marketable debt instruments: the yield on a Treasury Bill, which is a short-term discounted security (maturing 1 month, 3 months, 6 months, or 1 year from the date of issue), and the yield on Benchmark Bonds, otherwise known as Treasury Notes, which are medium-term securities (maturing 2, 5, 7, or 10 years from the date of issue) with semi-annual interest payouts. The workfile can be found here.
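As a quick check of the algebra above, the simplest case $k=2$ works out as follows (using the fact that $\pmb{\text{E}}_t R(1,t) = R(1,t)$): \begin{align*} R(2,t) &= \tfrac{1}{2}\left[\pmb{\text{E}}_t R(1,t) + \pmb{\text{E}}_t R(1,t+1)\right] + L(2,t)\\ &= \tfrac{1}{2}\left[R(1,t) + R(1,t) + \pmb{\text{E}}_t \Delta R(1,t+1)\right] + L(2,t)\\ \Rightarrow \quad R(2,t) - R(1,t) &= \tfrac{1}{2}\pmb{\text{E}}_t \Delta R(1,t+1) + L(2,t) \end{align*} which matches the general spread expression evaluated at $k=2$: a stationary expected change in the short rate plus the premium.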
Data Summary
The first step in any empirical analysis is an overview of the data itself. In particular, the subsequent analysis makes use of data on Treasury Bill yields maturing in 1, 3, 6, and 12 months, appropriately named TBILL, in addition to data on Benchmark Bond yields (Treasury Notes) maturing in 2, 5, and 10 years, appropriately named BBY. Consider their graphs below. Notice that each graph exhibits a structural change around June 2007, marking the beginning of the US housing crisis; we have indicated its presence using a vertical red line. We will incorporate this information into our analysis by indicating the post-crisis period with the dummy variable dum0708, which takes the value 1 in each of the months following June 2007. Moreover, a little background research on the Bank of Canada reveals that starting in January 2001, the Bank committed to a new set of transparency and inflation-targeting measures to recover from the late-90s dot-com crash as well as the disinflationary period in the earlier part of that decade. For this reason, and to avoid having to analyze too many policy paradigm shifts, we will focus only on data from January 2001 onward. We can achieve all of this with the following commands:
'Set sample from Jan 2001 to end.
smpl Jan/2001 @last

'Create dummy for post 07/08 crisis
series dum0708 = @recode(@dateval("2007/06")<@date,1,0)
Testing Integration Orders
We begin our analysis by ensuring that no series under consideration is integrated of order 2 or higher. To do this, we run a unit root test on the first difference of each series. In this case, the standard ADF test will suffice. A particularly easy way of doing this is to create a group object containing all variables of interest and then run a unit root test on the group, specifying that the test be carried out on the individual series. In the group view, proceed to Proc/Unit Root Test... and choose the appropriate options.
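The same test can also be run by command; the following line, taken from the program at the end of this post, applies the ADF test to the first difference of every series in the group termstructure created there:

'ADF unit root test on the first difference of each series in the group
termstructure.uroot(dif=1, adf, lagmethod=sic)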
Deterministic Specifications
Selecting an appropriate model to fit the data is both art and science. Nevertheless, there are a few guidelines. Any model in which the series are not centered about zero will typically require a constant term, whereas any model in which the series exhibit a trend will in general fit better when a trend term is incorporated. Part 1 and Part 2 of this series discussed selecting from five different DGP specifications, termed Case 1 through Case 5. Here we will consider several model specifications with various variable combinations (the corresponding estimation commands are shown for reference immediately after this list).
- Model 1: This model looks for a relationship between the 10 Year Benchmark Bond Yield and the 1 Month T-Bill. In particular, it restricts the constant to enter the cointegrating relationship, corresponding to the DGP and regression model specified in Case 2 of Part 1 and Part 2.
- Model 2: This model looks for a relationship between the 6, 3, and 1 Month T-Bills. Here, the constant is left unrestricted, corresponding to the DGP and regression model specified in Case 3 of Part 1 and Part 2.
- Model 3: This model looks for a relationship between the 2 Year Benchmark Bond Yield and the 1 Year and 1 Month T-Bills. Here, the constant is again left unrestricted, corresponding to the DGP and regression model specified in Case 3 of Part 1 and Part 2.
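For reference, the three specifications are estimated in the program at the end of this post with the commands below; in each, the dependent variable comes first, the long-run regressors follow, and the fixed regressor dum0708 appears after the @ sign.

'Model 1: restricted constant (Case 2)
equation ardlno.ardl(trend=const) bby10y tbill1m @ dum0708

'Model 2: unrestricted constant (Case 3), maximum of 6 lags, HAC covariances
equation ardlnondeg.ardl(deplags=6, reglags=6, trend=uconst, cov=hac, covlag=a, covinfosel=aic) tbill6m tbill3m tbill1m @ dum0708

'Model 3: unrestricted constant (Case 3), HAC covariances
equation ardldeg.ardl(trend=uconst, cov=hac, covlag=a, covinfosel=aic) bby2y tbill1y tbill1m @ dum0708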
Specifying ARDL Lag Structure
Selecting an appropriate number of lags for the model under consideration is, again, both science and art. Unless the number of lags is specified by economic theory, the econometrician has several tools at their disposal for selecting the lag length optimally. One possibility is to select the maximal number of lags for the dependent variable, say $p$, and the maximal number of lags for each of the regressor variables, say $q$, and then run a barrage of regressions with all the different possible combinations of lags that can be formed under this specification. In particular, if there are $k$ regressors, the number of combinations of the set $\{1, \ldots, p\}$ with $k$ additional sets $\{0,\ldots, q\}$ is $p\times (q + 1)^k$. For instance, with EViews' default values $p = q = 4$ and $k = 2$ regressors, the total number of models under consideration is $4 \times 5^2 = 100$. The optimal combination is then the one that minimizes some information criterion, say Akaike (AIC), Schwarz (BIC), or Hannan-Quinn (HQ), or maximizes the adjusted $R^2$. EViews offers the user an option on how to select from among these, and we will discuss this when we explore estimation next.
Estimation, Residual Diagnostics, Bounds Test, and Speed of Adjustment
ARDL models are typically estimated using standard least squares techniques. In EViews, this means that one can estimate ARDL models manually using an equation object with the Least Squares estimation method, or resort to the built-in equation object specialized for ARDL model estimation. We will use the latter. Open the equation dialog by selecting Quick/Estimate Equation or Object/New Object/Equation, and then select ARDL from the Method dropdown menu. Proceed by specifying each of the following:
- List the relevant dynamic variables in the Dynamic Specification field. This is a space-delimited list in which the dependent variable is followed by the regressors that will form the long-run equation. Do NOT list variables which are part of the estimated model but not part of the long-run equation; those variables will be specified in the Fixed Regressors field below.
- Specify whether Automatic or Fixed lag selection will be used. Note that even if Automatic lag selection is preferred, maximum lag-orders need to be specified for the dependent variable as well as the regressors. If you wish to specify how automatic selection is computed, please click on the Options tab and select the preferred information criterion under the Model selection criteria dropdown menu. Finally, note that in EViews 9, if Fixed lag selection is preferred, all regressors will have the same number of lags. EViews 10 will allow the user to fix lags specific to each regressor under consideration.
- In the Fixed Regressors field, specify all variables (other than the constant and trend) which will enter the model for estimation but will not be part of the long-run relationship. This list can include variables such as dummies or other exogenous variables.
- Also under Fixed Regressors, use the Trend Specification dropdown to specify how deterministics enter the long-run relationship. This dropdown corresponds to the 5 different DGP cases mentioned earlier and explored in Part 1 and Part 2 of this series. In particular, it offers the following options:
- None: This corresponds to Case 1 -- the no constant and trend case.
- Rest. constant: This corresponds to Case 2 -- the restricted constant and no trend case.
- Unrest. constant: This corresponds to Case 3 -- the unrestricted constant and no trend case.
- Rest. linear trend: This corresponds to Case 4 -- the restricted linear trend and unrestricted constant case.
- Unrest. constant and trend: This corresponds to Case 5 -- the unrestricted constant and unrestricted linear trend case. Note that this case will be available starting with EViews version 10.
Model 1: No Cointegrating Relationship
In this model, the dependent variable is the 10 Year Benchmark Bond Yield, while the dynamic regressor is the 1 Month T-Bill. Moreover, the DGP under consideration is a restricted constant, or Case 2, and we include the variable dum0708 as our non-dynamic regressor. We have the following output. To verify whether the residuals from the model are serially uncorrelated, in the estimation view proceed to View/Residual Diagnostics/Serial Correlation LM Test... and select the number of lags. In our case, we chose 2. Here's the output.
Similarly, to test for residual homoskedasticity, in the estimation view proceed to View/Residual Diagnostics/Heteroskedasticity Tests... and select a type of test. In our case, we chose Breusch-Pagan-Godfrey. Here's the output.
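Both diagnostics can also be produced by command; for Model 1 the program at the end of this post uses:

'residual serial correlation LM test for Model 1
ardlno.auto

'residual heteroskedasticity test for Model 1
ardlno.hettest @regs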
To test for the presence of cointegration, in the estimation view, proceed to View/Coefficient Diagnostics/Long Run Form and Bounds Test. Below the table of coefficient estimates, we have two additional tables presenting the error correction $EC$ term and the $F$-Bounds test. The output is below.
In fact, we can visualize the fit between the long-run equation and the dependent variable by extracting the $EC$ term and subtracting it from the dependent variable. This can be done as follows. In the estimation view, proceed to Proc/Make Cointegrating Relationship and save the series under a name, say cointno. Since the cointegrating relationship is the $EC$ term, we would like to extract just the long-run relationship. To do this, simply subtract the series cointno from the dependent variable. In other words, make a new series $\text{LRno} = \text{BBY10Y} - \text{cointno}$. Finally, form a group with the variables BBY10Y and LRno, and plot. We have the following output.
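In the program at the end of this post the same steps are carried out as follows (the legend and axis formatting lines are omitted here):

'extract the EC term from the Model 1 equation
ardlno.makecoint cointno

'plot the dependent variable against the long-run relationship BBY10Y - COINTNO
group groupno bby10y (bby10y - cointno)
freeze(mode=overwrite, graphno) groupno.line
show graphno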
Model 2: Usual Cointegrating Relationship
In this model, the dependent variable is the 6 Month T-Bill, while the dynamic regressors are the 3 and 1 Month T-Bills. Moreover, the DGP under consideration specifies an unrestricted constant, or Case 3, and we include the variable dum0708 as our non-dynamic regressor. To avoid repetition, we will not present the estimation output, but skip immediately to verifying whether the residuals from the model are serially uncorrelated and homoskedastic. We have the following outputs. To test for the presence of cointegration, we proceed again to the Long Run Form and Bounds Test view. We have the following output.
Model 3: Nonsensical Cointegrating Relationship
In this model, the dependent variable is the 2 Year Benchmark Bond Yield, while the dynamic regressors are the 1 Year and 1 Month T-Bills. Moreover, the DGP under consideration specifies an unrestricted constant, or Case 3, and we include the variable dum0708 as our non-dynamic regressor. To avoid repetition, we will only present tables where necessary to derive inference. As usual, we first verify whether the residuals from the model are serially uncorrelated and homoskedastic. We have the following outputs.
Request a Demonstration
If you would like to experience ARDL in EViews for yourself, you can request a demonstration copy here.
EViews Program and Files
We close this series with the EViews program script that will automate most of the output we have provided above. To use the script, you will need the EViews workfile: ARDL.EXAMPLE.WF1

'---------
'Preliminaries
'---------

'Open Workfile
'wfopen(type=txt) http://www5.statcan.gc.ca/cansim/results/cansim-1760043-eng-2216375457885538514.csv colhead=2 namepos=last names=(date, bby2y,bby5y,bby10y,tbill1m,tbill3m,tbill6m,tbill1y) skip=3
'pagecontract if @trend<244
'pagestruct @date(date)
wfuse pathto...ardl.example.WF1

'Set sample from Jan 2001 to end.
smpl Jan/2001 @last

'Create dummy for post 07/08 crisis
series dum0708 = @recode(@dateval("2007/06")<@date,1,0)

'Create Group of all Variables
group termstructure tbill1m tbill3m tbill6m tbill1y bby2y bby5y bby10y

'Graph all series
termstructure.line(m) across(@SERIES,iscale, iscalex, nodispname, label=auto, bincount=5)

'Do UR test on each series
termstructure.uroot(dif=1, adf, lagmethod=sic)

'---------
'No Relationship
'---------

'ARDL: 10y Bond Yields and 1 Month Tbills.
equation ardlno.ardl(trend=const) bby10y tbill1m @ dum0708

'Run Residual Serial Correlation Test
ardlno.auto

'Run Residual Heteroskedasticity Test
ardlno.hettest @regs

'Make EC equation.
ardlno.makecoint cointno

'Plot Dep. Var and LR Equation
group groupno bby10y (bby10y - cointno)
freeze(mode=overwrite, graphno) groupno.line
graphno.axis(l) format(suffix="%")
graphno.setelem(1) legend(BBY10Y: 10 Year Canadian Benchmark Bond Yields)
graphno.setelem(2) legend(Long run relationship (BBY10Y - COINTNO))
show graphno

'---------
'Non Degenerate Relationship
'---------

'ARDL term structure of Bond Yields. (Non-Degenerate)
equation ardlnondeg.ardl(deplags=6, reglags=6, trend=uconst, cov=hac, covlag=a, covinfosel=aic) tbill6m tbill3m tbill1m @ dum0708

'Run Residual Serial Correlation Test
ardlnondeg.auto

'Run Residual Heteroskedasticity Test
ardlnondeg.hettest @regs

'Make EC equation.
ardlnondeg.makecoint cointnondeg

'Plot Dep. Var and LR Equation
group groupnondeg tbill6m (tbill6m - cointnondeg)
groupnondeg.line
freeze(mode=overwrite, graphnondeg) groupnondeg.line
graphnondeg.axis(l) format(suffix="%")
graphnondeg.setelem(1) legend(TBILL6M: 6 Month Canadian T-Bill Yields)
graphnondeg.setelem(2) legend(Long run relationship (TBILL6M - COINTNONDEG))
show graphnondeg

'---------
'Degenerate Relationship
'---------

'ARDL term structure of Bond Yields. (Degenerate)
equation ardldeg.ardl(trend=uconst, cov=hac, covlag=a, covinfosel=aic) bby2y tbill1y tbill1m @ dum0708

'Run Residual Serial Correlation Test
ardldeg.auto

'Run Residual Heteroskedasticity Test
ardldeg.hettest @regs

'Make EC equation.
ardldeg.makecoint cointdeg

'Plot Dep. Var and LR Equation
group groupdeg bby2y (bby2y - cointdeg)
freeze(mode=overwrite, graphdeg) groupdeg.line
graphdeg.axis(l) format(suffix="%")
graphdeg.setelem(1) legend(BBY2Y: 2 Year Canadian Benchmark Bond Yields)
graphdeg.setelem(2) legend(Long run relationship (BBY2Y - COINTDEG))
show graphdeg
Hi,
Great series of posts. You mention that the fixed regressors do not appear in the long run equation. Is this a new feature? In the EViews 9 ARDL estimation, the fixed and dynamic regressors both appear in the long run equation. Also, it would be useful to understand why they would not enter the long run equation if they are used to estimate the counteracting vector?
Thanks
That should read cointegrating vector sorry :)
The latest implementation of ARDL estimation is entirely consistent with theory, and we strongly urge you to update to our latest releases. To answer your question, the ECM consists of short-run dynamics and the cointegration equation. In the long-run, the short-run dynamics are done away with and what remains is the cointegrating, or equilibrating equation. Thus, some variables such as dummies or fixed regressors which can be used to define the short-run dynamics in the ECM estimation, become entirely irrelevant in the long-run, and should therefore NOT be included among the cointegrating variables.
Great, thanks. So in EViews 9, which is the version I am using, this is not the case? As these fixed regressors are included in the long run output, should this be ignored? It begs the question why they are included in the long run output.
Should an interactive dummy be used as a fixed regressor, or should it be placed with the other variables so that it has lags?
I'm using EViews 10 and I discovered that all variables that have 0 lags in the estimated ARDL do not appear in the short run estimates. See below.
METHOD:ARDL
Variable Coefficient Std. Error t-Statistic Prob.*
LOGGDP(-1) 0.729338 0.094741 7.698244 0.0000
LOG(M2) -0.054603 0.015516 -3.519120 0.0022
LOG(EDU) 0.006620 0.043816 0.151092 0.8814
LOG(EDU(-1)) 0.078888 0.036176 2.180645 0.0413
LOG(INV) 0.088309 0.040008 2.207251 0.0391
LOG(INV(-1)) 0.010613 0.030427 0.348800 0.7309
LOG(INV(-2)) -0.074646 0.024045 -3.104468 0.0056
INF 0.001602 0.001033 1.551501 0.1365
INF(-1) -0.002897 0.001214 -2.386573 0.0270
INF(-2) 0.003304 0.000941 3.511306 0.0022
LOG(GE) -0.128762 0.027832 -4.626491 0.0002
LOG(TO) 0.079264 0.033308 2.379715 0.0274
C 0.985972 0.321798 3.063942 0.0061
R-squared 0.997087 Mean dependent var 4.376781
Adjusted R-squared 0.995340 S.D. dependent var 0.149235
S.E. of regression 0.010188 Akaike info criterion -6.048142
Sum squared resid 0.002076 Schwarz criterion -5.458609
Log likelihood 112.7943 Hannan-Quinn criter. -5.849782
F-statistic 570.5294 Durbin-Watson stat 1.986606
Prob(F-statistic) 0.000000
*Note: p-values and any subsequent tests do not account for model selection.
ECM Regression
Case 3: Unrestricted Constant and No Trend
Variable Coefficient Std. Error t-Statistic Prob.
C 0.985972 0.105121 9.379364 0.0000
DLOG(EDU) 0.006620 0.025148 0.263251 0.7951
DLOG(INV) 0.088309 0.020482 4.311475 0.0003
DLOG(INV(-1)) 0.074646 0.018479 4.039497 0.0006
D(INF) 0.001602 0.000716 2.236764 0.0369
D(INF(-1)) -0.003304 0.000719 -4.592544 0.0002
CointEq(-1)* -0.270662 0.029385 -9.210970 0.0000
R-squared 0.809919 Mean dependent var 0.017598
Adjusted R-squared 0.766054 S.D. dependent var 0.018474
S.E. of regression 0.008935 Akaike info criterion -6.411779
Sum squared resid 0.002076 Schwarz criterion -6.094338
Log likelihood 112.7943 Hannan-Quinn criter. -6.304969
F-statistic 18.46395 Durbin-Watson stat 1.986606
Prob(F-statistic) 0.000000
* p-value incompatible with t-Bounds distribution.
I observed the same in my analysis: for the selected ARDL(1,0,0,0) model, I found only the cointegrating coefficient in the ECM.
Did any of you get a solution for this problem yet? If yes, can you let me know how to solve this issue?
help
I am encountering the same problems in EViews 12.
same
Excellent post. Many thanks. One question: should the graph of the fit between the dependent variable and the equilibrating equation be NORMALIZED?
We've updated the graphs to show normalized curves.
How to normalize the cointegration relationships?
DeleteHi there,
What part of the ARDL process should be used for forecasting purposes? EViews seems to generate a forecast from the original ARDL, which appears to be equivalent to the unconstrained ECM. Should this converge to the long run, given the model is dynamically stable?
Hi! You are right, the forecasts being used are not based on the constrained ECM and therefore do not a priori impose the cointegrating relationship in the forecast. The Pesaran and Shin (1998) original ARDL model (which is the one being used for EViews forecasting; the unconstrained ECM, if you will) demonstrates that the coefficients from that estimation are indeed consistent. Since the long run equation is determined from these parameters, it stands to reason that if a long term relationship exists, the forecasts being produced by EViews should converge to the cointegrating relationship.
One comment with regard to using dummy variables such as dum0708: they may sometimes necessitate modification/simulation of the reported asymptotic critical values. The dummy here does not tend to zero with the sample size T and, more importantly, the fraction of observations where dum0708 is non-zero is too large, almost 60% of the sample. Please see Pesaran et al. (2001).
Agreed. We're mainly trying to illustrate how to use the features. Nevertheless, your comment is an important caveat. We will modify the content in the next few days to reflect this.
Thank you. I, and I believe many EViews users, appreciate your work and prompt responses.
One more question regarding this issue. Having in mind that the dummy variable in this example does not tend to zero with sample size T, wouldn't it be more reasonable to change its definition so that it takes the value 1 before June 2007 and 0 afterwards? Then the percentage of its non-zero values would be smaller. I'm no expert, but does that seem like something we could do in order to avoid modification/simulation of critical values?
DeleteThank you EViews Team....A great job.....
ReplyDeleteHi,
Is there any plan to do a similar kind of post for Panel ARDL estimators like the PMG, MG & DFE estimators? It would be great to see theory and application blog posts on that.
We will be producing similar blog posts on theoretical topics in the future, but topics and schedule will be somewhat ad-hoc.
I'll point out that there is really little relationship between the Bounds Test use of ARDL and Panel ARDL models, other than the name, so it doesn't immediately follow that panel ARDL would be discussed simply because of these posts.
I agree with you. Although Panel ECM/Panel ARDL does not have the concept of a bounds test, they are an extension of it in the panel context and estimate the long-run relationship, and the presence of cointegration can be inferred from the ECM term.
I am looking forward to many such elaborated theoretical topics with an application (or better REPLICATION).
Many many thank you.
Hi IHS Team,
Thank you for such an elaborate post. I have the following questions regarding this post:
1. In Model 1, case 2 or restricted constant is chosen. My question is which variable decides trend specification: dependent variable or the regressor?
2. I think it is also necessary to test the stability of the ARDL model.
skd
You're welcome! We hope you're enjoying the series.
To answer your questions:
1) Trend specification depends first and foremost on whether you want to have the trend specification present in the cointegrating relationship or not. In other words, if you choose to restrict the constant or time trend, what you're actually doing is saying that these deterministic variables will also be present in the long-run. However, choosing whether to include a deterministic variable is generally based on the nature of the dependent variable.
2) Stability in the context of the Pesaran Shin (1998) ARDL model is indeed an important subject. They make the assumption that the ARDL model being studied is in fact stable. In this regard, if you are simply looking to estimate an ARDL model to see if the estimates are valid, you should be concerned about stability. Luckily, this is easily verified by testing whether the roots of the characteristic equation are outside the unit circle. In other words, does the ARDL lag polynomial produce stationary results. Nevertheless, the Pesaran, Shin, and Smith (2001) paper is a TEST for cointegration. In other words, it must allow for the possibility that the underlying cointegrating relationship may in fact NOT be stable. In this regard, the PSS(2001) paper does not a priori impose stability of the ARDL lag polynomial. However, if cointegration does indeed exist, the ARDL model will in fact be stable!
We hope that helps.
Yes, I am enjoying the series!
Thank you.
skd
Hello Eviews,
Should I remove the structural break in the independent variables to make the model stable? Or leave it be, per the explanation above that an ARDL model with cointegration will be stable?
Thanks,
Kate
Hi, can you deeply explain what is the nature of dependent variable?
What if the dummy is an interactive dummy? Should it be a fixed regressor, or should it also involve lags?
As was pointed out earlier, having dummy variables can be a tricky situation. In general, if dummy variables are included, the non-zero components of the variable must vanish asymptotically (in the long-run), otherwise the critical values that are provided in the Pesaran, Shin, and Smith (2001) paper may be invalid. Nevertheless, estimation is still consistent and valid. This is because, if the dummy does not vanish asymptotically, then it will clearly be a part of the long-run equation, and new critical values must be obtained to account for this.
To answer your specific question, if I interact a dummy with a regressor, what I'm really doing is creating a new regressor which is just 0 in some parts, and not in others. This is equivalent to including a new regressor with some special features. There's no harm in including such variables, however, one must again be certain whether such a variable will be present asymptotically or not. If it is present asymptotically (in the long-run), then it must be a part of the cointegrating relationship. If this is the case, it is difficult to tell whether a modification to the critical values is necessary. This will probably depend on whether the dummy variable being used for interaction is present asymptotically as well. As to whether lags on this variable can be included, there's certainly no theoretical reason why they can't be. Should they? This is entirely a question of whether doing so will produce a more accurate model estimation... in other words, part science, part art.
Hope this helps.
You mean the interaction dummy should go with the dynamic regressors in the top box?
Once we reject the t-Bounds test null hypothesis, is there a FORMAL or more streamlined way (especially in Eviews) to test for the Degenerate case besides or in addition to graphing the relationship between the dependent variable and the equilibrating equation? Thanks.
I am confused with the automatic lag selection. Under what circumstances should both the p and q lags be the same, and when can they be different? Dave Giles suggests adding more lags to the dependent variable only when serial correlation is a problem. You suggest increasing both p and q. The two approaches give different results. For example, it is a lot easier to get a good model with maximums of 4 and 2 than with 4 and 4. I would love to use different maximums for p and q if you suggest these are technically acceptable without ifs and buts. Secondly, what is the range for annual data? Some say 2 max, some go up to 5-6.
The number of lags is entirely dependent on the data and model you're analysing. There is no general rule.
Will EViews have a CUSUM test for ARDL? Running the test in OLS seems to have a bug. If I put in 65 years of data the graph will only show 2013 and 2014, or sometimes 2002 to 2017. As I increase variables and dummies, the graph starts to reduce the years shown. Is this a bug? Many others report the same on the web.
EViews does not currently offer CUSUM for ARDL. We'll add it to the list of things to consider.
Good Day IHS Eviews.
Thank you for sharing this valuable knowledge, everything becomes easier now.
I estimated an ARDL model of 6 variables. After several attempts (using different lags ) to find a better estimate, i got a selected ARDL model using AIC as (1,1,0,0,1,2) while using SIC is ARDL (1,0,0,0,1,2). My questions are:
1. Can I still use this model given these lags selection?
2. Which among the AIC and SIC is more appropriate?
ARDL (1,1,0,0,1,2) = AIC
ARDL (1,0,0,0,1,2,) = SIC
Thank you, in anticipation for your kind acknowledgement and assistance.
Best regards.
AIC and SIC are two very different things. AIC generally performs well when the objective is prediction. In fact, it is asymptotically equivalent to cross-validation. Moreover, AIC does not assume that your true model lies in the model space.
On the other hand, SIC is something you would prefer if you are looking to obtain the most parsimonious representation among a group of models. In other words, SIC selects the simplest possible model to explain the data. Furthermore, SIC assumes that the true model lies in the model space.
Thus, to answer your question: it really depends on what the objective of the exercise is, and neither method dominates the other entirely. You can also visualize the model selection graph and table by clicking on View/Model Selection Summary/{Table,Graph}. There you can see how closely the competing model selection criteria, as well as the models within them, performed.
Thank you for the prompt response.
With this kind of lag selection, ARDL(1,0,0,0,1,2), is it appropriate to continue using the model?
Thank you for your kind assistance.
Good post. i want to ask a question. what is the suitable criteria for choosing optimum lag in ardl estimation. AIC or SIC i have annual data of less than 60 years please guide me on which base i can choose criteria ... thanks.
Very useful post. I would like to raise a doubt. Can we use an ARDL model when the variables are seasonal in nature? Thanks in advance.
ReplyDeleteHello,
Thanks for the great post series on ARDL. I would like to ask why, while using the dummy variable to account for the structural break in the ARDL estimation, you don't also use the breakpoint unit root test to test for a unit root? Shouldn't the breakpoint unit root test be used instead of the ADF since there is a structural break? Thanks.
Good point - it may be that the breakpoint unit root test is more appropriate in this case.
Hi, I'm not very conversant with EViews and I'm simply trying to know where I get the long run and short run coefficients. I have run the long run form and bounds test as well as the error correction form tests. I suspect that the coefficient of cointeq(-1) is the short run coefficient, but I have no clue what the long run coefficient is. I would appreciate it if I could get a response.
Thank you for your great posts.
I would like to ask: in the case that we include a dummy for a break (e.g. the Great Recession) as a fixed regressor in the ARDL, when performing the bounds test with the F-statistic, do we also take the dummy variable into account? That is, does the F-statistic test whether all the coefficients are equal to 0 including the dummy, or is the dummy excluded when performing the bounds test? Thanks in advance.
Hi IHS Eviews and thank you for the great posts.
I would like to ask how you managed to include a dummy (which takes the value 1 for a few periods and 0 everywhere else) as a fixed regressor in the ARDL estimation such that this dummy does not appear in differences in the ARDL Error Correction Estimation?
When I include a dummy as a fixed regressor, it always appears in differences like the rest of the variables (except of course in the cointegrating equation, where the dummy is in levels as it should be), but in your example it does not. Is this correct, and if yes, what is the meaning of having a dummy variable in differences? Thank you.
Is your copy of EViews up to date? (Help->EViews Update)
Thanks, the version is now updated and the problem is fixed!
Hi IHS Eviews,
Thank you for your great post on ARDL.
I would like to ask if it is possible to apply the ARDL bounds test to a model with an interaction term. I have tried, but I faced a syntax error with *. If it is impossible, what econometric method should I use to deal with the interaction term?
I really need your help. Thank you so much.
hello i have 3 variables model (Y X and Z),
how can I determine:
dlag Y(p,q,r)
dlag X(p,q,r)
dlag Z(p,q,r)
thank you
Not sure we understand the question - you might want to try posting on the EViews forum - forums.eviews.com
How do I determine the optimal lag of the ARDL model, i.e. for dy, dx, dz?
Great series of posts on this topic. Thank you Eviews team. :)
Just a quick question. Is this simulation done in Eviews 10? Because I am using Eviews 9 and it seems that t-Bounds test is not included in this version, only F-Bounds test. Am I right or am I missing something?
Yes, EViews 10
Very useful illustration of estimating models based on the ARDL bounds test (both t and F tests). What is the command for getting the t and F bounds tests using EViews 9?
You'll need EV10
Hi...
Sometimes we get a very high value of the F-statistic for the bounds test in ARDL. What does that mean?
I have the student version. It doesn't contain ARDL estimates
Student version 9 (and above) contains ARDL.
From the above "EViews 10 will allow the user to fix lags specific to each regressor under consideration"! Has this been implemented in EV10?
Yes.
My apologies, but unless I'm missing an update or missing how to implement it, EV10 does not allow this. It still allows the user to set a "fixed" lag for ALL regressors but not for each individually. Thanks.
http://www.eviews.com/help/content/ardl-Estimating_ARDL_Models_in_EViews.html#ww265598
Good Morning,
Please, I have one question: a series that is stationary around a trend (trend stationary), is it a stationary series, i.e. is it integrated of order 0, I(0)?
Best wishes
I'm not really sure I understand your question. If a series is stationary, then it is I(0). If a series has a trend, then it is not stationary but it is trend stationary. This means that after the removal of the trend, the series will be I(0).
Dear Eviews team,
This blog is amazing. Thank you for all your efforts. :)
I have a question about the use of dummy variables in ARDL bounds testing framework. If we have reason to belive that there has been a shift in the long run (cointegrating) equation is it justifiable to include dummy variable (or even an interaction term) in a dynamic specification box and hence make them part of the cointegrating equation?
Indeed. The inclusion of dummy variables is supported in both theory and EViews. There is however an important caveat to this. Namely, the variation induced by the presence of dummy variables must not overpower the variation of the cointegrating relationship. In other words, the fraction of periods in which the dummy variables are non-zero should tend to zero with sample size T, otherwise, the theory needs to be modified. This point was made explicit in the PSS(2001) paper, footnote 17.
Thank you so much! :)
Dear Eviews team,
Nice blog indeed!
For Model 2, the cointegrating factor is said to have a "p-value incompatible with t-bounds distribution". Does it mean that the model is misspecified or just that the t-bound are irrelevant in that case?
Also, the presence of heteroskedasticity in the residuals would not bias the coefficients but their errors (and thus the t-stats too)? Am I correct?
Thank you, Tim
The note "p-value incompatible with t-bounds distribution" refers to the fact that the limiting distribution of the t-stat is not a t-distribution, but is rather of the Dickey-Fuller (functional of Brownian motion) type. Thus, the DF critical values should be used.
As far as the time-series nature of the residuals is concerned, theory states that it is serial correlation which causes bias, whereas heteroskedasticity causes inefficiency. Thus, serial correlation is a much bigger problem in practice than heteroskedasticity. In either case, we have taken steps to correct for the presence of serial correlation by doing automatic lag selection to include enough lags, and we have accounted for heteroskedasticity by using HAC-versions of the t-statistic.
Hope this helps.
Thank you, much appreciated. T
Dear Eviews team,
I noticed the example on model 2 does not test for heteroskedasticity after using HAC. Is there a reason for this? i.e. would the test for heteroskedasticity after using HAC fail to reject the null of homoskedasticity?
Dear EViews Team, Can I also use the DF critical values to perform the t-Bounds test when the comment "p-value incompatible with t-bounds distribution" appears in the conditional error correction regression?
How can we display the Representation for the ARDL models separately for both Long run and Short run equations?
When the lag selection chooses lag 0 for a given variable, why is it shown in the Conditional Error Correction Regression with the observation: ** Variable interpreted as Z = Z(-1) + D(Z)? If the variable has lag 0, it doesn't appear in the ECM regression. The long run equation (which is the error correction component named "CointEq(-1)" by Eviews) specifies each variable in levels. So in the final specification, shouldn't this variable appear in the form Z(-1)?
I suspect that the 0 lagged variable actually appears in the ECM regression, but Eviews omits it.
Then the form d(x) + x(-1) is achievable in the final specification.
Your understanding of what EViews does is not correct. First of all, the note Z = Z(-1) + D(Z) says that Z(-1)+D(Z) has lag 0 itself. What we are doing in the "Long run form and bounds test" is not using two parameters for each of Z(-1) and D(Z), which appears for all lagged variables under the theoretical ARDL decomposition. In other words, we are grouping Z(-1)+D(Z) into a single parameter, thereby imposing the lag 0 condition for Z. Given this fact, the CointEq(-1) estimate is the estimate of the long-run form, lagged by one. So, the lag 0 variable Z, which enters the long-run equation (and you can see this by looking at the "Long run form and bounds test" output), does indeed get lagged by 1 when estimated in the ECM regression. Hope that clarifies the issue.
Thank you for the quick reply! I'm still confused, though. My point is: the long run equation has the following specification: y = c + a.x + b.z, where x and z are regressors. If I'm not mistaken, the CointEq(-1) is the OLS residual of this equation, given by ECM = u_hat = y_hat - a.x - b.z. This term is then lagged one period and replaced in the cointegration equation. So, from this specification, we already have a Z(-1). In order to have a term like the one in the Eviews note (Z(-1)+D(Z)) we need a D(Z). A differentiated variable must come from the ECM regression, but its output doesn't feature a D(Z) if its lag selection is zero*. So, where does the D(Z) actually come from?
*note that, for this short example, if the model is an ARDL(2,2,0), Eviews' ECM regression is then D(Y) = D(Y(-1)) + D(X(-1)) + D(X(-2)) + CointEq(-1)
Thank you!
Again, your understanding is a bit incomplete. First of all, denote the lag 0 regressor as Z, and the lagged regressors as X. Next, assume that we only have two regressors, Z and X. Then, EViews obtains the long-run equation from the regression in the "Long run form and bounds test" output. However, the long-run equation is NOT the residual series from this regression. Please see our FIRST POST on ARDL to understand the details. This is where I think you are making erroneous conclusions. In fact, the long-run equation is actually Y - cz/(1 - c1).Z - cx/(1 - c1).X, where c1 is the coefficient associated with Y(-1), cz is the coefficient associated with Z, and cx is the coefficient associated with X. Now, denote this long run term as CointEq. That is, CointEq = Y - a.Z - b.X, where a=cz/(1 - c1) and b=cx/(1 - c1). Now, CointEQ(-1) = Y(-1) - a.Z(-1) - b.X(-1). Finally, note that Z = Z(-1)+D(Z). Here, for the time being, just set Z = H = Z(-1)+D(Z). Thus, H(-1) = Z(-1). However, H(-1) = Z(-2) + (Z(-1)-Z(-2)) = Z(-1). So, you can see that Z(-1) indeed enters the CointEq(-1) without the need for an additional D(Z). This is precisely why we defined Z=Z(-1)+D(Z) to begin with. To be able to handle everything consistently, while still imposing lag 0 on Z.
Hopefully this clarifies everything.
Thank you very much for all your patience, I think I caught it now. A related question: is it correct to say that the variable with a 0 lag has no short run impact in the underlying ARDL model (due to the fact that it isn't included in the ECM regression, which is defined in the literature as the short run adjustment)??? Thank you.
Hi, thanks for the explanations. I have the same concern: please guide me in explaining the short run impact for those exogenous variables with 0 lags.
Up...
What can we say about short term effects of zero lagged variables?
Consider the long-run and short run impact of the zero-lag variable as you would the constant, which is technically a zero-lag variable as well. In other words, as in the restricted constant version, the zero-lag variable gets absorbed by the long-run equation (cointegrating equation) entirely.
Hello EViews team. So the interpretation of the short-term equation (ECM Regression) does not concern these zero-lag variables, even though we can calculate the short term coefficients of these variables?
Hello Eviews Team, I am currently facing the same problem as indicated by Lucas. Could you please provide a layman's interpretation of the short-run coefficients of the variables indicated as: ** Variable interpreted as Z = Z(-1) + D(Z). Thank you.
Dear sir, how can we conduct a Wald test for asymmetry if Z = Z(-1) + D(Z)?
DeleteGood Morning, Please... If a series it is Ts (Trend Stationary) at the level, it imply that this series it is I (0) and not I (1)?
ReplyDeleteBest wishes
From a purely technical definition, a trend stationary process is NOT stationary. For instance, if we have the following setup:
y_t = Ay_{t-1} + B + Ct + e_t
You can clearly see that we have two cases to consider. One in which |A|<1 and the other in which |A|=1 (ignoring explosive cases). In both cases, the series is NOT stationary as it currently stands. Even if |A|<1, the expectation of y_t, namely E(y_t), depends on t. Hence, the process is not stationary. However, after demeaning, it becomes stationary. That is, the process y_t - E(y_t) is stationary because the impact of the trend has been removed through demeaning.
On the other hand, if |A|=1, the process is always non-stationary, even after demeaning. That is, demeaning, while removing dependence on t coming from the trend, does not remove dependence on t through the variance which is coming from the fact that |A|=1.
Hope that helps.
Hello IHS Eviews,
Please, what is the antonym of explosive?
Cordially
Hello researchers, pleas help me
I have some questions. In my paper I have six variables as well as a dummy variable. The variables are stable at the level and first difference.
I tried to use ARDL Modle to analyze time series data for 34 years. But I faced two problems:
1. The model suffers from serial correlation problem ( Even though I were used a number of data transfers( log and %), and different agents for my variables, also different lag ( lag 3 maximum )
2. The value of the ECT less than - 1 and some times less than -2.
My question are :
1 - How to remove the serial correlation problem from ARDL MODEL ?
2. The value of ECT can be less than -1 or less than - 2 or not ?
3. What the alternative methodology and suitable to analyze my data other than ARDL if I cant solve this problems ?
Thank you in advance for your cooperation.
Hi kindly help me..
I got these results using ARDL in EViews 9.5 for the short run (ECM) model. How do I interpret this? Kindly explain in detail, because none of the variables appear here in difference form as they did in the CEC model.
ARDL Error Correction Regression
Dependent Variable: D(AP)
Selected Model: ARDL(1, 0, 0, 0, 0)
Case 2: Restricted Constant and No Trend
Date: 12/28/17 Time: 02:03
Sample: 1980 2016
Included observations: 36
ECM Regression
Case 2: Restricted Constant and No Trend
Variable Coefficient Std. Error t-Statistic Prob.
CointEq(-1)* -0.520381 0.095594 -5.443658 0.0000
R-squared 0.385599 Mean dependent var 0.014078
Adjusted R-squared 0.385599 S.D. dependent var 0.038916
S.E. of regression 0.030504 Akaike info criterion -4.114527
Sum squared resid 0.032568 Schwarz criterion -4.070540
Log likelihood 75.06148 Hannan-Quinn criter. -4.099174
Durbin-Watson stat 2.117325
* p-value incompatible with t-Bounds distribution.
F-Bounds Test Null Hypothesis: No levels relationship
Test Statistic Value Signif. I(0) I(1)
F-statistic 4.233344 10% 2.2 3.09
k 4 5% 2.56 3.49
2.5% 2.88 3.87
1% 3.29 4.37
Hello,
Sorry, I worked with EViews 9.5 and now I just want to confirm something:
1)- How many cases-ARDL in EViews 9.5?
2)- Are the critical values of Bounds test-ARDL in case 4 (with TREND) on EViews 9.5 the same with case 5 on EViews 10?
B.W
If I would like to estimate an equation with say 5 variables using the ARDL bound test approach, is it necessary for me to perform the bound test for 5 different times with each of the 5 variables as the dependent variable in turn to check whether there is any endogeneity issue?
ARDL assumes only one cointegrating relationship. If you are searching for this relationship, then you should perform 5 different ARDL analyses and make your conclusions from them. Alternatively, if you know what the dependent variable is, you only need to run the ARDL model once.
When it comes to the interpretation of an estimated ARDL model, what are the things, other than the error correction term, we need to focus on? If the research focuses on estimating the effect of an independent variable on the dependent one, what about the significance of the long run parameters? If they are insignificant, what should the conclusion be? And finally, if the bounds test confirms no cointegration but the error correction term is significant and shows convergence, what does that mean?
Which one is better, automatic lag input or fixed?
Dear IHS team, thank you very much for this great post! One question. If the F-test shows that there is no equilibrating relationship among the variables, what should I do? I do not want to difference the series and use OLS.
ReplyDeleteHi,
When running ARDL Case 1 (none in the trend regression) with sample size 45, I got a critical value of -1 for the F-bounds test and t-bounds test at all significance levels. What does it mean?
Make sure your copy of EViews is up to date.
DeleteHi,
Your help would be much appreciated! I am trying to understand whether oil revenues have an impact on female labor force participation. I have attached my results.
I use an ARDL model in eviews and find that the lag of oil is significant. But when I go for long run relationships, ie ARDL long run form and bounds test, I do not find any significance, and cannot reject the null of no cointegration. I am just wondering if I can still interpret my results from the first step, even if variables are not cointegrated.
Hello IHS Eviews,
Thank you so much for a very informative and exhaustive post on ARDL. Can you please let me know if it is valid to apply Fisher's combined probability test on individual bounds test results of different cross sections in a panel data set?
How can I get the current Eview 10.0
http://www.eviews.com/general/prices/prices.html
How do I identify the short run and long run coefficients in an ARDL model in EViews 10?
I'm not sure I understand what exactly you mean by identification. Nevertheless, if you mean what is the coefficient associated with the cointegrating relationship (the long run), then, after estimating the ARDL model, you can obtain this coefficient by going to View/Coefficient Diagnostics/Error Correction Form. The coefficient will be labeled CointEq(-1) in the table output. All other coefficient estimates in this output will be associated with the transitory (short-run) variables.
If I understood correctly, step 2 states that you have to check whether the variables are all non-stationary in the 2nd difference. Why didn't you test for a unit root in the 2nd difference to ensure that none of the variables are I(2)?
If a process is I(2) then its first difference is I(1). In other words, the first difference has a unit root. Thus, we test for a unit root in the first difference process.
Is it possible to get contradicting results between the F-Bounds and t-Bounds tests for a given level of significance? For instance, I got an F-statistic value of 4.6194414 with I(0) = 2.86 and I(1) = 4.01. Also, the t-statistic value is -3.612247 with I(0) = -2.86 and I(1) = -3.99. Thus, with the F-statistic there is clearly cointegration, while with the t-statistic, what can I conclude about cointegration? This is at the 5% level of significance. Please help me out.
ReplyDeleteThis comment has been removed by the author.
Note that the F- and t- tests are not alternative tests. Their functions are entirely different. Thus, you cannot get "contradicting" results between them. Both tests are necessary and complement each other. Please have a look at the last branch graph in the ARDL part 2 entry. From here, you can see that if you fail to reject the null hypothesis (based on the F-test), then you don't have to run the t-test. However, if you do reject the null hypothesis, then you should run the t-test to identify which of the 3 alternative hypotheses arises. If you fail to reject the null hypothesis of the t-test, then you know that alternative hypothesis A1 (non-sensical relationship) came about. Alternatively, if you reject the t-test hypothesis, then either alternative A2 (cointegrating relationship is standard, and it does indeed exist) or A3 (cointegrating relationship is degenerate) arises.
ReplyDeleteI got an EC term of -1.13 in my ECM Regression which is negative but less than -1. Is this a problem or we can interpret it somehow? What could be the reason?
ReplyDeleteThis is a great question! To answer it however, we need to understand what dynamic (in)stability means. The idea of dynamic stability (convergence) stems from the study of difference equations. In time series, and in our particular case, this typically reduces to first-order difference equations of the form:
y_{t} = b*y_{t-1} + u_{t}
Now, mathematical theory of difference equations says that there are 4 possible types of dynamic paths that y_{t} can assume in this system:
1) Oscillatory path (fluctuates above and below some value): this happens when b < 0
2) Non-oscillatory path: this happens when b > 0
3) Convergence (stability): this happens when -1 < b < 1
4) Divergence (instability): this happens when b =< -1 or b >= 1
Now, consider the ECM equation which we're going to simplify as follows:
Delta y_{t} = (EC)*y_{t-1} + other terms
where EC is the error-correction coefficient. Next, re-write this equation in terms of y_{t} to obtain:
y_{t} = (1 + (EC))*y_{t-1} + other terms
In light of our discussion on difference equations, we can see that the dynamics of this system are effectively governed by the term b = 1 + (EC). This is the same b we used above. Alternatively, in terms of (EC) we obtain that (EC)=b-1.
Consider now the four dynamic cases:
1) Oscillatory (b < 0): This implies that (EC) < -1
2) Non-oscillatory (b > 0): This implies that (EC) > -1
3) Convergence (-1 < b < 1): This implies that -2 < (EC) < 0
4) Divergence (b <= -1 or b >= 1): This implies that (EC) <= -2 or (EC) >= 0
From this, we can see that the allowable range (the one that leads to a stable system) for the error-correction coefficient in an ECM regression is in fact from 0 to -2 (case 3) above) and NOT from 0 to -1 as most people perceive. Furthermore, we see that if the error-correction coefficient (EC) is less than -1 but not less than -2 (which is case 1) and case 3) simultaneously, and the case you have in your regression), this is perfectly fine. Your system is in fact stable, but, according to case 1) above, will simply lead to oscillatory behaviour. In other words, your system will oscillate above and below the equilibrium value in a dampening fashion until it eventually settles down to the equilibrium path.
Hopefully this answers your question.
Thank you very much for your comprehensive answer. It is the most sensible way of interpreting EC term I’ve found so far.
Hello EViews team...
Excuse me, is the best value of the EC term between -1 and 0, or between -2 and -1?
Thank you
There is no such thing as "best". Generally people prefer to have the EC term bounded between 0 and -1. This is because the model is convergent without an oscillatory trajectory and is therefore easier to interpret.
Good Evening Eviews team
Please, if the EC term belongs to the interval (-2, -1), is the adjustment to equilibrium faster?
B.W.
Hi there eViews team
I'm using NARDL in eViews and I would like to ask something.
In Shin et al. (2013), they drop insignificant difference lags from their model. I was wondering how we can achieve this using EViews and, if we can't, will there be any issues?
Thank you in advance!
Hi all, I have an ARDL model with all variables lagged by one. When I go to the Error Correction form, it only shows the variables but not their lags. Do you know why the lagged variables (in this case lag 1) are missing from the ECM Regression in Eviews 10, please? Thank you.
ReplyDeleteThis comment has been removed by the author.
Thanks for your reply. What do you mean, please?
Hello,
You have to go to EViews 10, and after the general ARDL model estimation please click on:
ARDL Long Run Form and Bounds Test
Cordially
Dear Prof. Dr. Dave Giles, I salute you. This is one of the best blogs I have ever seen. I appreciate it and I would like to learn more from your analysis. Can you please share this data with me so that I can personally try it for practice? I would be highly obliged. Here is my email: khan.himayatullah@aup.edu.pk
ReplyDeleteThe blog was written by the EViews team...
Hi team
I am using an ARDL model in EViews 10. My variables are cointegrated and the ECM term is negative and significant. However, I am expected to report short run and long run estimates. Which of the coefficients are the long run coefficients?
coeff diag/long run and bound test -> level equation.
Hi EViews team, my case is the same. Please guide me: which of the coefficients are the short run coefficients in EViews 10?
How about the short-run coefficients in EViews 11?
Hi eviews!
Why should the t-test be used if i = 1, 3, 5? How about i = 2 or 4?
Thank you
Hello,
Sorry, and Warning; why t-Bounds test... and no t-test?
B.W
My F-bounds test shows cointegration does not exist, but the ECT in the ECM is highly significant. How should I explain this?
ReplyDeleteHello,
DeleteAccording to the ARDL procedure: if the F test indicates no co-integration, so stop.
Cordially
Hello EViews team,
Please, for example: in an ARDL(1,1,0),
to do the bounds test, which equation ((1) or (2)) are we estimating?
ls d(y) c y(-1) x1(-1) x2(-1) d(x1) ......... (1)
ls d(y) c y(-1) x1(-1) x2 d(x1) ......... (2)
Best wishes.
ReplyDeleteHi many thanks for the post. It is really helpful.
ReplyDeleteI have a question about ARDL forecasting,I got 4338 initial sample size,I want to use ARDL to see the future value of each variables(like VAR forecasting),but after I extended the sample range (size)to 50000,I made forecasting,finally I still back to the initial sample size. It is more likely to be a simulation rather than forecasting. May I have some suggestions aabout this?
Thanks again!
Hi. I have two short questions about constant and trend specifications. Firstly, my version of EViews 9 only shows three options ("none", "constant" and "linear trend"), while the images you show here apparently display five cases. In what version can I access these options?
Secondly, the inclusion of the constant and trend has a considerable effect on my coefficients, even when the constant and trend are not significant. Is that normal? What is the guideline here - simply drop them when they are not significant, or compare some criterion (e.g. AIC)?
By the way, my dependent variable is nominal (prices), so we do not expect a growing trend over long periods (indeed an OLS regression against a constant and a linear trend yields a coefficient near 0 for the trend).
Thank you!
ARDL estimation was changed dramatically between EViews 9 and 10, and this blog post applies to EViews 10 and beyond only.
Hi, I'm not very conversant with EViews and I'm simply trying to find out where I get the long-run and short-run coefficients. I have run the Long Run Form and Bounds Test as well as the Error Correction Form, and I suspect that the coefficient on CointEq(-1) is the short-run coefficient, but I have no clue what the long-run coefficient is. I would appreciate a response.
In short, the long-run coefficients are those which are inside the cointegrating equation. Everything else is considered to be a short-run variable. After estimation, if you go to View/Coefficient Diagnostics/Long Run Form and Bounds Test, you will see a table with the header "Levels Equation". The coefficients in this table are considered long-run coefficients.
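For concreteness, here is a minimal sketch of how those long-run coefficients relate to the estimated ARDL coefficients, assuming a hypothetical ARDL(1,1) in y and x estimated directly by least squares (the object and series names are purely illustrative):
' hypothetical ARDL(1,1) estimated by OLS: c(1)=constant, c(2)=y(-1), c(3)=x, c(4)=x(-1)
equation eq_lrcoef.ls y c y(-1) x x(-1)
' long-run coefficient on x = (sum of the x coefficients) / (1 - coefficient on y(-1))
scalar lr_x = (eq_lrcoef.@coefs(3) + eq_lrcoef.@coefs(4)) / (1 - eq_lrcoef.@coefs(2))
The coefficients reported in the "Levels Equation" table are ratios of this form.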
How do I specifically check whether a coefficient in the levels equation is equal to a specific number, e.g. that the coefficient is significantly different from 1?
@IHSEViews Should we analyze the estimates under the "Conditional Error Correction Regression" as the short-run estimates, or the estimates from the "ECM Regression" after going to View/Coefficient Diagnostics/Error Correction Form?
Should we merge the two (pick the estimates on the lagged level terms from the Conditional Error Correction Regression and add them to the estimates from the "ECM Regression")?
What is the relevance of imposing restrictions on the choice of either of them, if any such restrictions exist?
What happens when the F-statistic is higher than the critical value but the t-statistic of the t-Bounds test lies in between the critical values?
Most papers do not even report the t-statistic, and some that do have the t-statistic in between the critical values and do not even comment on it, as if it didn't exist. I've only seen comments on it when the t-statistic is beyond the I(1) critical value, just as a complementary confirmation of cointegration of the variables.
From what I've read, this problem might arise when the explanatory variables are mutually cointegrated, which, if true, should be an extremely common problem for this kind of test, shouldn't it?
Good morning EViews team.
Why does EViews estimate the short-run equation without the variables that have a zero lag?
For example, in an ARDL(2,1,2,3,4,5,6,0),
EViews estimates the short-run equation without the seventh explanatory variable!
My F bounds test shows that cointegration exists, but the ECT in the ECM is not significant. How should I explain this?
Hello,
Please, are you in case 1, 2, 3, 4 or 5?
3
And what if my stability test shows that the model is stable, but the ECT < -2?
Hello,
OK, what is the result of the "t-Bounds test" for the coefficient on Y(t-1)?
ReplyDeleteHello,
Thank you for the great effort you put into enhancing EViews.
I have a few doubts about ARDL estimation.
I would appreciate it if the EViews team could guide me here!
1- Is the CUSUM test performed on the ECM of the long-run equation?
2- Can the long-run coefficients be interpreted as positively/negatively impacting the dependent variable in the long run?
3- In a two-variable (y and x) ARDL model, if the F-statistic in the bounds test is greater than the I(1) bound, but the long-run coefficient is not significant, can one say there is a long-run relationship between x and y?
Thank you!
A
1) At the moment, we don't have the CUSUM statistic implemented.
2) The most we can say in the long run (if there is evidence of cointegration) is that both the independent and dependent variables are governed by common forces. This implies that if you do run a regression in the long run, that regression will be valid and whatever relationship exists at that point can be interpreted in the usual way.
3) If the F-statistic is greater than the I(1) bound, then you reject H0 of no cointegration, i.e. you conclude there is evidence that the series are cointegrated. In the next step, if you run the test on the long-run coefficient and it turns out to be insignificant, it means that the relationship between x and y is nonsensical. Please read Part 2 of the ARDL blog series for further details.
@IHSEViews, thank you for your reply!
If I may ask one final question: can I test the stability or stationarity of the EC model by examining the inverse roots (AR structure)? Could you kindly advise me how?
Thank you again!
Amir
Hello EViews team.
Please, this is just for confirmation. Am I right or am I missing something? "If we have case 4 or case 2 in ARDL modeling, we can have degenerate cointegration but cannot have nonsensical cointegration."
Cordially
Have you got the answer for this one? Part 2 is not opening here. It seems that they removed it. In cases 2 and 4, we do not perform the t-test for nonsensical cointegration. Do you have a clue why?
DeleteHI Eviews team,
ReplyDeletethank you for your great efforts.
i have found evidence of co-integration since my F-statistic is beyond the upper bound. however when i generate long run coefficients through ( long run form and bound test) i find large coefficients:
Variable 1(-45.26939
Variable 2 (46.50257)
Variable 3(-22.63170)
Variables 4 (57.91694)
Variable 5 (-82.49271)
Variable 6 (15.40217)
Variable 7 (-73.66744)
Constant 459.2406
is this normal, provided that the regression passed all tests
thank you
Hi, your post is really special. I am using 27 years of annual time series data on 5 variables (using EViews 9). The F-stat shows there is a cointegrating relationship among the variables. The ECT is also negative and significant. My only question is whether the values of the F-stat are reliable for a small sample like mine. To date I haven't read anywhere that EViews or the ARDL methodology requires the sample to be greater than or equal to 30.
Cheers!
Hello,
Thank you for the great effort you put into enhancing EViews,
but I still have a few doubts about ARDL estimation, although I have read all of the posts and the comments on each post.
You have said that "everything else outside the CointEq is considered to be a short-run variable", but there exist three alternative representations, so there will be three different sets of results.
My questions are:
1. What is the difference between the parameter estimates from the intertemporal dynamics representation and from the conditional error correction (CEC) representation? I know that the CEC is the reduced form; I mean the difference in how to interpret the parameters.
2. Which parameters are the correct ones to interpret as short-run parameters: those from the intertemporal dynamics representation or those from the conditional error correction (CEC) representation? I have read many journal articles and seen different choices of short-run parameters, and, as we know, the results also differ.
I would appreciate it if the EViews team could guide me here!
Hello, my ARDL F-statistic = 328.914.
Is this acceptable for concluding cointegration?
Dear EViews team,
I've already tested the data and the residuals for my regression, and there don't appear to be any problems whatsoever. However, when I tried to visualize the fit of the long-run equation (by generating the "Cointegrating Relationship") I get big numbers (around 7, and all positive terms). Do you have any idea what could be causing this? Thanks in advance!
By the way, when I run your example I get "Cointegrating representation from ARDL equation ardlno / ardlno.makecoint", but when I run mine I get "Cointegrating representation from ARDL equation / {%equation}.makecoint".
For example, what if my gdp variable doesn't appear in the ECM, but it does appear in the long run?
ReplyDeleteHi EViews team. Thank you for the wonderful and concise work in this three-part post on ARDL models.
I've been wondering about this tricky issue for years now; maybe someone can help me out:
As we know, the unrestricted CEC regression and the ECM regression are estimated via OLS. As such, I believe that one should be able to replicate the coefficients estimated within EViews' ARDL window using the Least Squares window. Note that for the ECM regression, one should run the long-run equation, save its residuals (Proc/Make Residual Series) and use their lagged values to run the ECM regression (i.e., it is a two-step estimation, and I know researchers who did it exactly that way before EViews 9 provided the ARDL tool).
To make it clearer, for an ARDL(2,2) model, one should be able to replicate the coefficients by running an OLS regression as: d(y) d(y(-1)) d(x(-1)) d(x(-2)) resid(-1), where 'resid' is the residual series obtained from the long-run equation. However, it turns out that all coefficients, standard errors, and most diagnostics are totally different when comparing the built-in ARDL method to the two-step OLS approach. I've tried this several times over the last few years, with different data. I've tried OLS, FMOLS, and DOLS and never got anywhere near the actual coefficients displayed by the built-in method. For the sake of comparison, there are similar packages in other languages that precisely replicate the ARDL coefficients with the more basic LS regression methods. Any thoughts on that?
If you are asking why I would want to run the CEC and ECM regressions via the OLS window, it is because we can't compute Wald tests on the coefficients of the CEC regression. Note that in View\Representations, the user has access to the coefficients from the basic estimation (the one that pops up as soon as you estimate the model), which I believe represents the 'intertemporal dynamic estimation' as seen in Part 1. In fact, everything one can do in terms of coefficient diagnostics (confidence intervals, variance inflation factors, etc.) refers to these coefficients, not to those seen in either the CEC or ECM windows.
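For reference, a minimal sketch of the two-step estimation described in the comment above, written as EViews commands (the object and series names are illustrative, and the ECM line simply mirrors the specification quoted in the comment):
' step 1: long-run (levels) regression and its residuals
equation eq_lr.ls y c x
eq_lr.makeresids res_lr
' step 2: ECM regression using the lagged long-run residual as the error-correction term
equation eq_ecm.ls d(y) d(y(-1)) d(x(-1)) d(x(-2)) res_lr(-1)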
Hi EViews team, I started using EViews a day ago. I was looking at the graphs above and wanted to understand how you normalised the data series in the graph. The example code does not show it. I would appreciate your guidance. Thanks.
Hi, I did my thesis using an ARDL model, but the ECR is still not clear to me and some variables do not appear in the short-run model. Can you explain this?
Hello EViews team, I really want to know why the lags decrease in the ECM. For example, for variables whose optimal lag is 6, the ECM shows lag (-5). I believe we use ARDL(6,6,3) above. Why is that?
Hello EViews team, how should we interpret the coefficients in the case 5 model (constant and trend)?
ReplyDeleteDear all,
I detected heteroskedasticity in my model. I applied the HAC covariance matrix, but the problem continues: after applying the test again, neither the F-statistic nor the p-value has changed.
Any idea what is happening, or how I could solve the heteroskedasticity problem?
Thanks,
Aline
Did you apply a logarithmic transformation to your data? In general, a Box-Cox-type transformation should (in principle) solve your problem by stabilising the variance; the logarithmic transformation is one such case.
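For instance, a minimal sketch, assuming the dependent variable is a strictly positive series called y (the name is illustrative):
' generate the log-transformed series, then re-estimate the ARDL with log_y as the dependent variable
series log_y = log(y)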
DeleteDear EViews Team, Can I also use the DF critical values to perform the t-Bounds test when the comment "p-value incompatible with t-bounds distribution" appears in the conditional error correction regression?
ReplyDeleteHello Eviews team, best regards and thank you very much for this help with this type of estimation.
ReplyDeleteI have a general query that goes one step beyond the example cited in this tutorial, what would happen if the variable Y must be explained by 3 variables X1,X2 and X3 but each of those Xs variables has structural breaks in different periods of time? To place dummies, what would you prioritize, the variables or the error of the model in general?
From already thank you very much for your time!
Does the ARDL approach require that the dependent variable be I(1)?
How can I increase the F-statistic in the bounds test?
ReplyDeleteThank you for the great effort.
I have a question regarding the coefficient covariance matrix and standard errors.
My ARDL model is in EViews 13. Dependent variable followed by regressors: cgb10y ctb3m
Fixed regressors: ccpi, ip
Lag selection: Automatic
When I change the coefficient covariance matrix from Ordinary to HAC, I expect to get different standard errors. Is my expectation wrong, or why is there no change in the standard errors?
Thank you in advance for your help.
Hi. Thanks for a great tutorial. I would like to know the remedy for "nonsensical cointegration". Regards
ReplyDeleteThank you for your explanation! I have question regarding the forecasting in future? how can i predict for the future 24 run (2 years, monthly data) using ARDL with different lag ARDL(1,1,1,2,0). i already run for the bound test and also the coefficient for long run. When im using VAR, the lag become 2 for all the variable :(
ReplyDeleteBut in ARDL you mast take N(30,80)
Deletesorry, means?
DeleteTo apply the ARDL , the time series must be from 30 to 80 observations
DeleteLess than 30 you can apply Bostraping ARDL
DeleteMy time series data is 64 observation. I have problem to forecast in future (i want to forecast 24 observation in future). Yet i cant find how to run the forecasting in future after determine my best model
DeleteI am using EViews 14. Previously, I used EViews 12, but I upgraded to EViews 14 for the quantile ARDL model. My question is that in EViews 12, the long-run coefficients and their standard errors in the ARDL model were explicitly shown, but they are not visible in EViews 14. While I can manually calculate the long-run coefficients, it is difficult to calculate the standard errors from the coefficients in lagged form. Someone suggested using the delta method, but it is almost impossible to calculate the standard errors using that method when there are numerous variables. Is there any way to find the standard errors of the long-run level coefficients? Thank you for your time.