Introduction to Econometrics II (ECO 356), Faculty of Social Sciences: Course Guide and Course Material. Course Developer: Dr. Adesina-Uthman



2.3.3.3 t Tests and Confidence Intervals
The t tests on the regression coefficients are performed in the same way as in simple regression analysis. Particular care should, however, be taken when looking up the critical value of t at any given significance level: it depends on the number of degrees of freedom, n − k, the number of observations n minus the number of parameters estimated k. The confidence intervals are also obtained in the same manner as in simple regression analysis and are likewise based on n − k degrees of freedom.
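As an illustration, here is a minimal sketch of how the t ratio, p-value, and confidence interval for one coefficient could be computed with n − k degrees of freedom. The coefficient estimate, standard error, and sample sizes below are hypothetical numbers chosen for the example, not values from the course material.

```python
# Hypothetical t test and 95% confidence interval for one coefficient
# in a multiple regression with n observations and k estimated parameters.
from scipy import stats

b = 0.48       # estimated coefficient (hypothetical)
se = 0.21      # its standard error (hypothetical)
n, k = 40, 3   # so the degrees of freedom are n - k

df = n - k
t_stat = b / se                            # t ratio for H0: beta = 0
p_value = 2 * stats.t.sf(abs(t_stat), df)  # two-sided p-value
t_crit = stats.t.ppf(0.975, df)            # critical t at the 5% level
ci = (b - t_crit * se, b + t_crit * se)    # 95% confidence interval

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, "
      f"95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

Note that the critical value is taken from the t distribution with n − k, not n − 2, degrees of freedom.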
2.3.3.4 Consistency
Provided the fourth Gauss–Markov condition is satisfied, OLS yields consistent estimates in the multiple regression model, as is the case in the simple regression model. One condition for consistency is that, as n becomes large, the population variance of the estimator of each regression coefficient tends to 0, so that its distribution collapses to a spike. The other condition for consistency is that, since the estimator is unbiased, the spike is located at the true value.
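A simulation sketch of this behaviour, with all parameter values chosen purely for illustration: as n grows, the sampling distribution of the OLS slope estimator stays centred on the true value while its variance shrinks towards zero.

```python
# Illustrative simulation: the OLS slope estimator is consistent.
# True parameters and distributions below are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
beta = 2.0  # true slope (hypothetical)

for n in (25, 100, 400, 1600):
    estimates = []
    for _ in range(2000):
        x = rng.uniform(0, 10, n)
        y = 1.0 + beta * x + rng.normal(0, 2, n)       # disturbance term
        b = np.cov(x, y, bias=True)[0, 1] / np.var(x)  # OLS slope estimate
        estimates.append(b)
    print(f"n={n:5d}  mean={np.mean(estimates):.4f}  "
          f"var={np.var(estimates):.6f}")
```

The printed means stay close to the true slope at every sample size (unbiasedness), while the variance falls roughly in proportion to 1/n (the distribution collapsing to a spike).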
2.3.4.0 MULTICOLLINEARITY
In many situations, the data available for multiple regression analysis do not permit precise answers to the problem at hand: the standard errors are very high, or the t ratios are very low, which means that the confidence intervals for the parameters are very wide. A situation of this nature occurs when the explanatory variables show little variation and high intercorrelations. Multicollinearity is the name given to the situation where the explanatory variables are highly intercorrelated.
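A small simulation sketch of the problem (all numbers are illustrative, not from the course material): the more highly correlated two explanatory variables are, the larger the sampling variance of their estimated coefficients.

```python
# Illustrative simulation: intercorrelation between regressors inflates
# the variance of their coefficient estimates.
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 2000

for rho in (0.0, 0.9, 0.99):
    slopes = []
    for _ in range(reps):
        x2 = rng.normal(0, 1, n)
        # construct x3 so that corr(x2, x3) is approximately rho
        x3 = rho * x2 + np.sqrt(1 - rho**2) * rng.normal(0, 1, n)
        y = 1.0 + 2.0 * x2 + 3.0 * x3 + rng.normal(0, 1, n)
        X = np.column_stack([np.ones(n), x2, x3])
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        slopes.append(b[1])  # estimated coefficient on x2
    print(f"rho = {rho:4.2f}   var(b2) = {np.var(slopes):.4f}")
```

As rho approaches 1, var(b2) blows up even though the estimator remains unbiased, which is exactly the pattern described below.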


Let us look at multicollinearity in a model with two explanatory variables. The higher the correlation between the explanatory variables, the larger the population variances of the distributions of their coefficients and the greater the risk of obtaining erratic estimates of the coefficients. You should, however, bear in mind that a high correlation does not necessarily lead to poor estimates. If all the other elements determining the variances of the regression coefficients are favourable, that is, if the number of observations and the sample variances of the explanatory variables are large and the variance of the disturbance term is small, good estimates could still be obtained. Multicollinearity, therefore, must be caused by a combination of a high correlation and one or more of the other elements being unfavourable. It is a matter of degree, not of kind, and any regression will suffer from it to some extent unless all the explanatory variables are uncorrelated. The consequence is taken into consideration only when it is clearly going to have a serious effect on the regression results. Multicollinearity is a common problem in time series regressions, particularly where the data consist of a series of observations on the variables over a number of time periods; it may arise when two or more of the explanatory variables have strong time trends and are therefore highly correlated. Using Table 3.1 as an example, let us first consider the case of exact multicollinearity, where the explanatory variables are perfectly correlated.

Table 3.1
X2    X3    Y          Change in X2    Change in X3    Approximate change in Y
10    19    51 + u1
11    21    56 + u2         1               2                     5
12    23    61 + u3         1               2                     5
13    25    66 + u4         1               2                     5
14    27    71 + u5         1               2                     5
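A short sketch of what this table implies for estimation: here X3 = 2·X2 − 1, so the explanatory variables are perfectly correlated, the regressor matrix has deficient rank, and OLS cannot attribute the change in Y separately to X2 and X3. The code below uses the table's values, with the disturbances u set to zero for clarity.

```python
# Exact multicollinearity with the Table 3.1 values (u set to 0).
import numpy as np

x2 = np.array([10., 11., 12., 13., 14.])
x3 = 2 * x2 - 1                      # 19, 21, 23, 25, 27: perfectly correlated
y = np.array([51., 56., 61., 66., 71.])

X = np.column_stack([np.ones(5), x2, x3])
print(np.linalg.matrix_rank(X))      # 2, not 3: columns are linearly dependent

# Because x3 = 2*x2 - 1, any coefficients (b1, b2, b3) satisfying
# b1 - b3 = 1 and b2 + 2*b3 = 5 fit these five points exactly, so the
# normal equations have no unique solution.
```

Only the combination b2 + 2·b3 of the two slope coefficients can be estimated from data like this; their individual values are not identified.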


