Introduction to Econometrics ECO 356 Course Guide and Course Material


b2 = [Cov(X2, Y)Var(X3) − Cov(X3, Y)Cov(X2, X3)] / [Var(X2)Var(X3) − {Cov(X2, X3)}²]        …[2.69]
It is unusual for there to be an exact relationship among the explanatory variables in a regression. When this does occur, it is typically because there is a logical error in the specification.
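To see why an exact relationship makes estimation break down, a short simulation can help. The sketch below is purely illustrative and not part of the course text: it assumes a model Y = β1 + β2X2 + β3X3 + u in which X3 is, by construction, exactly twice X2, and shows that the design matrix then loses rank, so the least squares normal equations have no unique solution.

```python
# Minimal sketch (assumed example): an exact linear relationship between
# regressors makes X'X singular, so unique OLS estimates do not exist.
import numpy as np

rng = np.random.default_rng(0)

n = 50
X2 = rng.normal(size=n)
X3 = 2.0 * X2                                   # exact linear relationship: X3 is a multiple of X2
Y = 1.0 + 0.5 * X2 + rng.normal(size=n)

# Design matrix with an intercept column
X = np.column_stack([np.ones(n), X2, X3])

print("rank of X:", np.linalg.matrix_rank(X))   # 2, not 3: one column is redundant
print("det(X'X):", np.linalg.det(X.T @ X))      # (numerically) zero, so X'X cannot be inverted
```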
2.3.4.1 Multicollinearity in Models with More Than Two Explanatory Variables
The previous discussion of multicollinearity was restricted to the case where there are two explanatory variables. In models with a greater number of explanatory variables, multicollinearity may be caused by an approximately linear relationship among them. It may be difficult to discriminate between the effects of one variable and those of a linear combination of the remainder. In the model with two explanatory variables, an approximately linear relationship automatically means a high correlation, but when there are three or more, this is not necessarily the case. A linear relationship does not inevitably imply high pairwise correlations between any of the variables. The effects of multicollinearity are the same as in the case with two explanatory variables, and, as in that case, the problem may not be serious if the population variance of the disturbance term is small, the number of observations is large, and the variances of the explanatory variables are large.
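For instance (a constructed illustration, not an example from the course text), the sketch below generates four regressors: three are independent and the fourth is almost exactly their sum. No pairwise correlation is close to 1, yet the fourth regressor is nearly a perfect linear combination of the other three, so its variance inflation factor, 1/(1 − R²) from regressing it on the others, is very large.

```python
# Assumed simulation: severe multicollinearity without high pairwise correlations.
import numpy as np

rng = np.random.default_rng(1)
n = 200
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
X4 = rng.normal(size=n)
X5 = X2 + X3 + X4 + rng.normal(scale=0.1, size=n)   # near-exact linear combination

def r_squared(y, X):
    """R-squared from an OLS regression of y on X (with an intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

# No pairwise correlation is close to 1 (each is roughly 0.6 or less) ...
print(np.corrcoef([X2, X3, X4, X5]).round(2))

# ... yet X5 is almost perfectly explained by the other three regressors,
# so its variance inflation factor 1/(1 - R^2) is very large.
r2 = r_squared(X5, np.column_stack([X2, X3, X4]))
print("R^2:", round(r2, 4), " VIF:", round(1.0 / (1.0 - r2), 1))
```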
2.3.4.2 Ways to Alleviate Multicollinearity Problems
Two categories of methods exist to alleviate multicollinearity problems:
i. direct attempts to improve the four conditions responsible for the reliability of the regression estimates, and


ii. indirect methods.

First, you may try to reduce σu², the population variance of the disturbance term. The disturbance term is the joint effect of all the variables influencing Y that you have not included explicitly in the regression equation. If you can think of an important variable that you have omitted, and that is therefore contributing to u, you will reduce the population variance of the disturbance term if you add it to the regression equation. Second, consider n, the number of observations. If you are working with cross-section data (individuals, households, enterprises, etc.) and you are undertaking a survey, you could increase the size of the sample by negotiating a bigger budget. Alternatively, you could make a fixed budget go further by using a technique known as clustering. A further way of dealing with the problem of multicollinearity is to use extraneous information, if available, concerning the coefficient of one of the variables.
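Before turning to the worked example in equation [2.70] below, the second point, that a larger sample mitigates the imprecision caused by correlated regressors, can be checked with a small Monte Carlo sketch. All the numbers in it (the coefficients, the correlation of 0.95 between X2 and X3, the sample sizes) are illustrative assumptions, not values from the course text.

```python
# Assumed Monte Carlo sketch: with highly correlated regressors the sampling
# variability of b2 is large, but it still falls as the sample size n grows.
import numpy as np

rng = np.random.default_rng(2)

def sd_of_b2(n, reps=2000, rho=0.95, sigma_u=1.0):
    """Empirical standard deviation of the OLS estimate of beta2 in
    Y = 1 + 0.5*X2 + 0.5*X3 + u, where corr(X2, X3) = rho."""
    estimates = []
    for _ in range(reps):
        X2 = rng.normal(size=n)
        X3 = rho * X2 + np.sqrt(1 - rho**2) * rng.normal(size=n)
        u = rng.normal(scale=sigma_u, size=n)
        Y = 1.0 + 0.5 * X2 + 0.5 * X3 + u
        X = np.column_stack([np.ones(n), X2, X3])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        estimates.append(beta[1])                 # the estimate of beta2
    return np.std(estimates)

for n in (25, 100, 400):
    print(n, round(sd_of_b2(n), 3))               # the standard deviation shrinks as n grows
```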
For example, consider the model

Y = β1 + β2X + β3P + u        …[2.70]

where Y is the aggregate demand for a category of consumer expenditure, X is aggregate disposable personal income, and P is a price index for the category. To fit a model of this type, you would use time series data. If X and P possess strong time trends and are therefore highly correlated, which is often the case with time series variables, multicollinearity is likely to be a problem. Suppose, however, that you also have cross-section data on Y and X derived from a separate household survey. These variables will be denoted Y' and X' to indicate that the data are household data, not aggregate data. Assuming that all the households in the survey were paying roughly the same price for the commodity, one would fit the simple regression

Ŷ' = b1' + b2'X'        …[2.71]

Now substitute b2' for β2 in the time series model:

Y = β1 + b2'X + β3P + u        …[2.72]

Subtract b2'X from both sides,
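Subtracting b2'X from both sides of [2.72] gives Y − b2'X = β1 + β3P + u, so the adjusted series can then be regressed on the price index alone, free of the collinearity between X and P. The sketch below works through this two-step procedure on simulated data; the data-generating process, coefficient values, and sample sizes are all assumptions made for illustration, not figures from the course material.

```python
# Assumed example of the two-step "extraneous information" procedure:
# estimate the income coefficient b2' from household (cross-section) data,
# then regress the adjusted time series Y - b2'*X on the price index P.
import numpy as np

rng = np.random.default_rng(3)

# Assumed time series data: income X and price index P share a time trend
T = 40
t = np.arange(T)
X = 100 + 2.0 * t + rng.normal(scale=1.0, size=T)             # trending disposable income
P = 50 + 1.0 * t + rng.normal(scale=1.0, size=T)              # trending price index
Y = 10 + 0.8 * X - 0.5 * P + rng.normal(scale=2.0, size=T)    # true beta2 = 0.8, beta3 = -0.5

# Assumed cross-section (household) data: same income coefficient, common price
n = 500
X_hh = rng.normal(loc=100, scale=20, size=n)
Y_hh = 5 + 0.8 * X_hh + rng.normal(scale=5.0, size=n)

# Step 1: simple regression of Y' on X' gives b2'
A = np.column_stack([np.ones(n), X_hh])
b1_cs, b2_cs = np.linalg.lstsq(A, Y_hh, rcond=None)[0]

# Step 2: impose b2' in the time series model and regress Y - b2'*X on P
Z = Y - b2_cs * X
B = np.column_stack([np.ones(T), P])
b1_ts, b3_ts = np.linalg.lstsq(B, Z, rcond=None)[0]

print("b2' from the cross-section:", round(b2_cs, 3))
print("estimate of beta3 from regressing Y - b2'X on P:", round(b3_ts, 3))
```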


