# A Transformation Used to Circumvent the Problem of Autocorrelation

In this paper it is shown that, when the disturbances $\varepsilon_{t} = y_{t} - \sum_{j=1}^{k} x_{tj}\beta_{j}$ in a regression model follow a first-order autoregressive process with parameter $\rho$ generating the process, there exist cases where classical least squares regression analysis applied to the $n-1$ transformed observations $y_{t} - \rho y_{t-1},\; x_{tj} - \rho x_{t-1,j}$ $(j = 1, 2, \ldots, k;\ t = 2, \ldots, n)$ is less efficient than classical least squares regression analysis applied to the original $n$ observations $y_{t}, x_{tj}$ $(t = 1, \ldots, n)$. It is shown that the efficiency of the former relative to the latter can be arbitrarily close to zero in some cases.
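As an illustrative sketch (not the paper's efficiency analysis), the transformation compared here can be demonstrated numerically: generate data with AR(1) disturbances, then fit least squares both to the original $n$ observations and to the $n-1$ quasi-differenced observations $y_{t} - \rho y_{t-1}$ on $x_{t} - \rho x_{t-1}$, taking $\rho$ as known. The parameter values, sample size, and single-regressor setup below are assumptions made for the demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho, beta = 200, 0.8, 2.0  # assumed values for illustration

# Single regressor and AR(1) disturbances: eps_t = rho * eps_{t-1} + u_t
x = rng.normal(size=n)
u = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = rho * eps[t - 1] + u[t]
y = beta * x + eps

def ols_slope(xv, yv):
    # Least-squares slope through the origin (no intercept in this toy model)
    return float(xv @ yv / (xv @ xv))

# Classical least squares on the original n observations
b_orig = ols_slope(x, y)

# Classical least squares on the n-1 quasi-differenced observations
y_d = y[1:] - rho * y[:-1]
x_d = x[1:] - rho * x[:-1]
b_diff = ols_slope(x_d, y_d)

print(b_orig, b_diff)  # both estimate beta = 2.0
```

Both estimators are consistent for $\beta$; the paper's point is that neither transformation dominates, and in some configurations of the regressors the transformed regression can be far less efficient than the untransformed one.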