$$E(b_1) = \beta_1 \qquad [2.39]$$

Thus $b_1$ is an unbiased estimator of $\beta_1$, provided that Gauss–Markov conditions 1 and 4 are satisfied. Of course, in any given sample the random factor will cause $b_1$ to differ from $\beta_1$.
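Unbiasedness can be seen directly in a small Monte Carlo experiment. The sketch below is illustrative only: the parameter values ($\beta_1 = 2$, $\beta_2 = 0.5$), the design of $X$, and the normal disturbances are assumptions, not values from the text. With the disturbances drawn with zero mean and independently of a fixed $X$, the average of $b_1$ across many samples settles close to $\beta_1$, even though each individual $b_1$ differs from it.

```python
import numpy as np

# Monte Carlo sketch: check that the OLS intercept b1 is unbiased.
# beta1, beta2, and the design of X are illustrative assumptions.
rng = np.random.default_rng(0)
beta1, beta2 = 2.0, 0.5
n, reps = 50, 10_000
X = rng.uniform(0, 10, size=n)        # X held fixed across replications

b1 = np.empty(reps)
for r in range(reps):
    u = rng.normal(0, 1, size=n)      # disturbances: zero mean, independent of X
    Y = beta1 + beta2 * X + u
    b2 = np.cov(X, Y, bias=True)[0, 1] / np.var(X)   # OLS slope
    b1[r] = Y.mean() - b2 * X.mean()                 # OLS intercept: Ybar - b2*Xbar

print(f"mean of b1 over {reps} samples: {b1.mean():.4f} (true beta1 = {beta1})")
```

Each replication yields a different $b_1$ because of the random factor, but the sample mean of the estimates converges on the true $\beta_1$.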
$$\sigma_{b_1}^2 = \frac{\sigma_u^2}{n}\left(1 + \frac{\bar{X}^2}{\mathrm{Var}(X)}\right), \qquad \sigma_{b_2}^2 = \frac{\sigma_u^2}{n\,\mathrm{Var}(X)} \qquad [2.40]$$

Equation [2.40] has three obvious implications. First, the variances of both $b_1$ and $b_2$ are inversely proportional to the number of observations in the sample. This makes good sense: the more information you have, the more accurate your estimates are likely to be. Second, the variances are proportional to the variance of the disturbance term. The bigger the variance of the random factor in the relationship, the worse the estimates of the parameters are likely to be. Third, the variances of the regression coefficients are inversely related to the variance of $X$. What is the reason for this? Remember that (1) the regression coefficients are calculated on the assumption that the observed variations in $Y$ are due to variations in $X$, but (2) they are in reality partly due to variations in the disturbance term $u$ as well. The smaller the variance of $X$, the greater the relative influence of the random factor, and the less precise the estimates are likely to be.
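The scaling behavior in [2.40] can likewise be checked by simulation. In the sketch below (the parameter values and the design of $X$ are again illustrative assumptions), $X$ is held fixed for each sample size, the disturbances are redrawn many times, and the empirical variance of $b_2$ is compared with the theoretical value $\sigma_u^2 / (n\,\mathrm{Var}(X))$: increasing $n$ or the spread of $X$ shrinks the variance, while a larger $\sigma_u$ would inflate it.

```python
import numpy as np

# Sketch: compare the simulated variance of the slope b2 with the
# theoretical value sigma_u^2 / (n * Var(X)) implied by [2.40].
# All numeric values are illustrative assumptions.
rng = np.random.default_rng(1)
beta1, beta2, sigma_u = 2.0, 0.5, 1.0

for n in (25, 100, 400):              # variance should fall roughly as 1/n
    X = rng.uniform(0, 10, size=n)    # fixed regressor for this sample size
    b2 = np.empty(10_000)
    for r in range(b2.size):
        Y = beta1 + beta2 * X + rng.normal(0, sigma_u, size=n)
        b2[r] = np.cov(X, Y, bias=True)[0, 1] / np.var(X)
    theory = sigma_u**2 / (n * np.var(X))   # np.var(X) = (1/n) * sum((Xi - Xbar)^2)
    print(f"n={n:4d}  simulated Var(b2)={b2.var():.6f}  theory={theory:.6f}")
```

The simulated and theoretical variances should agree closely at every sample size, and quadrupling $n$ cuts the variance of $b_2$ to roughly a quarter, as the formula predicts.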