Time series analysis is a branch of econometrics. It proceeds from the premise that much can be deduced about a particular variable from the past behavior of that variable. As is generally true in econometrics, a key objective of time series analysis is prediction of future values of the variable of interest. In time series analysis, however, such prediction is often based on some form of extrapolation of the variable's past behavior.
The distinguishing characteristic of a time series of a variable is that observations are ordered along the dimension of time. Thus one might have monthly sales data for a business from January 1991 to May 1999. In this case, the analyst has 101 observations ([8 × 12] + 5) on the firm's sales ordered from the beginning to the end of the observation period. A graph of this data might reveal that the firm's sales have trended upward over time. Other characteristics of the series might reflect periodic bursts and lulls in sales, as might be the case if the firm's sales picked up particularly in holiday seasons or, say, over the summer months. In addition, the firm's sales might be related to general business conditions (i.e., the business cycle). Finally, some part of movement in the series will simply appear to be random, as the firm's sales are influenced by a variety of external factors that cannot easily be accounted for. The important point to note in this discussion is that some movement in a time series is systematic and some movement is random. The systematic factors in our hypothetical firm's sales are the upward trend, the dependency on business conditions, and the seasonal behavior of sales. More formally, a time series can be thought of as having a trend component, a cyclic component, a seasonal component, and a random component.
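To make the decomposition concrete, a hypothetical monthly sales series like the one above can be sketched as the sum of a trend, a seasonal pattern, and a random component (the cyclic component is omitted for simplicity). All numbers here are illustrative assumptions, not data from any actual firm:

```python
import math
import random

random.seed(0)

# Hypothetical monthly sales, January 1991 through May 1999 (101 months),
# built additively from the components described in the text.
n_months = 101
trend = [100.0 + 2.0 * t for t in range(n_months)]                 # steady upward drift
seasonal = [15.0 * math.sin(2 * math.pi * t / 12) for t in range(n_months)]  # 12-month cycle
noise = [random.gauss(0.0, 5.0) for _ in range(n_months)]          # random component

sales = [trend[t] + seasonal[t] + noise[t] for t in range(n_months)]
```

A plot of such a series would show the upward trend and the periodic bursts and lulls described above, overlaid with random movement.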
A variety of techniques of varying sophistication have been developed over the years for conducting time series analysis. The simplest and least costly techniques to apply are deterministic; that is, they do not attempt to account for the random (or stochastic) nature of a time series at all. Such models fall under the heading of extrapolation models. Two of the most commonly used extrapolation models are the linear trend model and the moving average model.
A linear trend model might be used when a time series exhibits a steady increase over time. Such a model is typically fit by regressing the variable of interest against a trend variable (which begins with a value of 1 and increases by 1 unit each time period). The resulting regression coefficient on the trend variable gives the change in the dependent variable per time period. Such models are easy to create, but the danger inherent in their use is that while many economic series do trend up over time, they do not trend up at a steady rate. As a result, the linear trend model may account well for movement in the variable of interest in the middle of the sample period but significantly under- or overstate the variable at the end of the sample period. Consequently, the model will not do an adequate job of predicting the future position of the variable (i.e., outside the sample period). In some instances, this nonlinearity in the movement of the dependent variable can be dealt with adequately by a more sophisticated trend specification: by using a polynomial rather than a linear trend, or by a suitable transformation (e.g., logarithmic) of the dependent variable. Nevertheless, these extensions still force a high degree of determinism on the future path of the dependent variable and for this reason may lead to inadequate forecasts.
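As a sketch of how such a model is fit, the ordinary least squares formulas for a regression on a trend variable can be written out directly. The series below is a hypothetical example constructed to rise by exactly 3 units per period, so the fitted slope is easy to verify:

```python
def fit_linear_trend(y):
    """Regress y on a trend variable t = 1, 2, ..., n and return
    (intercept, slope) from the ordinary least squares formulas."""
    n = len(y)
    t = list(range(1, n + 1))
    t_bar = sum(t) / n
    y_bar = sum(y) / n
    slope = (sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y))
             / sum((ti - t_bar) ** 2 for ti in t))
    intercept = y_bar - slope * t_bar
    return intercept, slope

# Hypothetical series rising 3 units per period: 13, 16, ..., 46.
series = [10 + 3 * t for t in range(1, 13)]
a, b = fit_linear_trend(series)          # a = 10.0, b = 3.0
forecast_next = a + b * 13               # extrapolate one period past the sample
```

The forecast for period 13 is simply the fitted line extended one step beyond the sample, which is exactly the extrapolation (and the attendant risk) discussed above.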
A second class of deterministic time series model is the moving average model. The simplest type of moving average model expresses the dependent variable as an average of its own past values. So, for example, we might expect sales at a business to equal average sales at the business over the last four quarters or over the last 12 months. A more sophisticated approach gives heavier weight to more recent values of the dependent variable in the averaging. A model that accomplishes this is the exponentially weighted moving average model. The reader should note that moving average models are mechanical and nonstatistical. As a result, it is not possible to determine the degree of confidence one could have in the model's forecasts, as would be the case with a statistical model (even one as simple as the linear trend model). The benefit of a moving average model, however, is the ease with which it may be applied.
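The two averaging schemes described above can be sketched in a few lines. The quarterly sales figures, the window length, and the smoothing weight `alpha` below are arbitrary illustrative choices:

```python
def moving_average_forecast(y, window=4):
    """Forecast the next value as the simple average of the last
    `window` observations (e.g., the last four quarters)."""
    return sum(y[-window:]) / window

def ewma_forecast(y, alpha=0.5):
    """Exponentially weighted moving average: more recent observations
    receive geometrically heavier weight. alpha in (0, 1] controls how
    quickly older observations are discounted."""
    s = y[0]
    for value in y[1:]:
        s = alpha * value + (1 - alpha) * s
    return s

quarterly_sales = [120, 135, 150, 165, 130, 145, 160, 175]
simple = moving_average_forecast(quarterly_sales)    # (130+145+160+175)/4 = 152.5
weighted = ewma_forecast(quarterly_sales)            # leans toward the recent 160-175 range
```

Note that both forecasts are purely mechanical: nothing in either calculation yields a standard error or confidence interval, which is exactly the limitation noted in the text.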
More sophisticated, statistical models of time series can be attributed to the seminal work of G. E. P. Box and G. M. Jenkins, who developed the autoregressive integrated moving average (ARIMA) technique. This technique incorporates the random element of a time series into the estimation process. Box-Jenkins methodology relies on four distinct steps: identification of a suitable model for the series at hand; estimation of the model; diagnostic checking and, if necessary, re-estimation of the model; and forecasting future values of the series. Though ARIMA modeling constitutes an important step forward in time series analysis, such models are limited in that forecasts are obtained only from the past behavior of the series in question and lagged random disturbances in that series.
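In practice, ARIMA models are identified and estimated with a statistical package rather than by hand. As an illustration of just the autoregressive piece of the technique, a first-order autoregression y_t = c + phi * y_{t-1} + e_t can be fit by least squares on the lagged series; the generating parameters below are arbitrary, and the example is a sketch, not the full Box-Jenkins procedure:

```python
def fit_ar1(y):
    """Estimate y_t = c + phi * y_{t-1} + e_t by least squares,
    regressing the series on its own one-period lag. This captures only
    the autoregressive ("AR") component of an ARIMA model."""
    x = y[:-1]           # lagged values y_{t-1}
    z = y[1:]            # current values y_t
    n = len(x)
    x_bar = sum(x) / n
    z_bar = sum(z) / n
    phi = (sum((xi - x_bar) * (zi - z_bar) for xi, zi in zip(x, z))
           / sum((xi - x_bar) ** 2 for xi in x))
    c = z_bar - phi * x_bar
    return c, phi

# A series generated (noiselessly, for clarity) by y_t = 5 + 0.8 * y_{t-1}.
y = [10.0]
for _ in range(12):
    y.append(5 + 0.8 * y[-1])

c, phi = fit_ar1(y)                      # recovers c = 5, phi = 0.8
one_step_forecast = c + phi * y[-1]
```

Because the forecast depends only on the series' own past values (and, in a full ARIMA model, on lagged disturbances), it reflects precisely the limitation noted above: no outside causal information enters the prediction.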
Work in time series analysis in the 1980s and 1990s concentrated on development of time series models that merge the causal framework of econometrics with the univariate approach of ARIMA modeling. This work led to a variety of important developments such as transfer function modeling, vector autoregression models, unit root tests, error correction models, and structural time series models. Though application of these techniques can be rather complicated, their adoption has led to significant improvements in understanding of time series processes and in forecasting ability. It is fair to say that the biggest advancements in the field of econometrics in recent years have occurred in the area of time series analysis.
[ Kevin J. Murphy ]
Box, George E. P., Gwilym M. Jenkins, and Gregory C. Reinsel. Time Series Analysis: Forecasting and Control. 3rd ed. Upper Saddle River, NJ: Prentice Hall, 1994.
Granger, C. W. J. Forecasting in Business and Economics. 2nd ed. Boston: Academic Press, 1989.
Hamilton, James D. Time Series Analysis. Princeton, NJ: Princeton University Press, 1994.
Harvey, Andrew C. Time Series Models. 2nd ed. Cambridge, MA: MIT Press, 1993.
Pindyck, R. S., and D. L. Rubinfeld. Econometric Models and Economic Forecasts. 3rd ed. New York: McGraw-Hill, 1997.