The Exponential Smoothing Forecasting Methodology

Exponential smoothing helps companies produce some of their most reliable forecasts.

Although exponential smoothing was developed in the 1950s, it is still a subject of great interest among statisticians and forecasters. Indeed, its reputation as a robust, easy-to-understand method has grown in recent years, often at the expense of the Box-Jenkins method. Most forecasting software incorporates exponential smoothing, and it is the time series forecasting method most widely used by companies.

Exponential Smoothing versus Box-Jenkins

The main reason is that Box-Jenkins models are built on the abstract concept of autocorrelation, while exponential smoothing models are based on concrete concepts such as level, trend and seasonality. Exponential smoothing models are also less disturbed by outliers in the observed data.

Harvey (1984, 1990) extended the exponential smoothing approach when developing structural models. In a structural model, forecasts are generated by a Kalman filter built on a formal statistical model involving the same elements as exponential smoothing: level, trend and seasonality. We now recognize exponential smoothing models for what they really are: approximate Kalman filters fitted directly to the data.

This gives us a general framework for extending the basic exponential smoothing methodology. Two such extensions appear in the following models:

Proportional error models extend exponential smoothing to cases where the errors tend to be proportional to the level of the data. Most economic data seem to have this characteristic (a sketch follows below).

Special action models extend exponential smoothing to include the estimation and quantification of promotions and other non-periodic events.
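To illustrate the proportional-error idea, here is a minimal sketch of simple smoothing written in innovations form, where the one-step error enters as a relative quantity. The function name smooth_proportional and the weight alpha = 0.3 are illustrative assumptions, not part of any particular product:

    # Multiplicative-error form of simple smoothing (ETS(M,N,N)-style):
    #   y_t = l_{t-1} * (1 + e_t),   l_t = l_{t-1} * (1 + alpha * e_t)
    def smooth_proportional(y, alpha=0.3):      # alpha is illustrative
        level = y[0]                            # initialize with the first value
        rel_errors = []
        for obs in y[1:]:
            e = (obs - level) / level           # relative (proportional) error
            rel_errors.append(e)
            level *= 1 + alpha * e              # same point update as the additive form
        return level, rel_errors

    print(smooth_proportional([100, 104, 98, 110, 120]))

Note that the point updates coincide with those of ordinary simple smoothing; the proportional errors matter when computing confidence limits, whose width then scales with the level of the series.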

Overview of the concepts of exponential smoothing

Level:

The level of a time series is a smoothed, slowly changing, non-seasonal value underlying the observations. It is not possible to measure the level directly, because it is obscured by seasonality, promotions and random fluctuations (noise). It must be estimated from the data.
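For illustration, a minimal sketch of a recursive level estimate, i.e. simple exponential smoothing; the weight alpha = 0.3 is an arbitrary illustrative choice, not a recommended value:

    # Recursive level estimate (simple exponential smoothing):
    #   l_t = alpha * y_t + (1 - alpha) * l_{t-1}
    def smooth_level(y, alpha=0.3):      # alpha chosen purely for illustration
        level = y[0]                     # initialize with the first observation
        path = [level]
        for obs in y[1:]:
            level = alpha * obs + (1 - alpha) * level
            path.append(level)
        return path                      # smoothed, slowly changing level estimate

    print(smooth_level([102, 98, 105, 110, 107, 112]))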

The local trend:

The local trend is the smoothed rate of change of the level. It is called local to emphasize that it undergoes small, unpredictable modifications at every moment. Forecasts are based on the local trend at the end of the history, not on the overall trend of the series. The trend cannot be measured directly; it must be estimated from the data.
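A common way to estimate a local trend is Holt's linear method, sketched below with illustrative smoothing weights and a crude initialization; the forecasts extrapolate the trend at the end of the history:

    # Holt's linear method: joint recursive estimates of level and local trend.
    #   level_t = alpha * y_t + (1 - alpha) * (level_{t-1} + trend_{t-1})
    #   trend_t = beta * (level_t - level_{t-1}) + (1 - beta) * trend_{t-1}
    def holt_forecast(y, alpha=0.3, beta=0.1, horizon=3):   # weights are illustrative
        level, trend = y[0], y[1] - y[0]                    # crude initialization
        for obs in y[1:]:
            prev_level = level
            level = alpha * obs + (1 - alpha) * (level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
        # Forecasts extrapolate the *local* trend at the end of the history.
        return [level + h * trend for h in range(1, horizon + 1)]

    print(holt_forecast([10, 12, 13, 15, 18, 20]))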

Seasonal effects:

Seasonal coefficients, multiplicative or additive, represent the seasonal structure of the data, such as the annual pattern of retail trade. Like the level and the trend, the seasonal coefficients must be estimated from the data. They are assumed to undergo small changes at each instant.
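For example, in a Holt-Winters-style scheme, additive seasonal coefficients are updated recursively alongside the level. The trend-free sketch below uses illustrative weights; the period p would be 12 for monthly data or 4 for quarterly data:

    # Additive seasonal coefficients updated alongside the level
    # (a trend-free Holt-Winters sketch).
    def seasonal_smooth(y, period, alpha=0.3, gamma=0.2):   # weights are illustrative
        base = sum(y[:period]) / period
        seasonal = [v - base for v in y[:period]]           # initial coefficients
        level = base
        for t in range(period, len(y)):
            s = seasonal[t % period]
            level_new = alpha * (y[t] - s) + (1 - alpha) * level
            # Seasonal coefficients also drift slowly over time:
            seasonal[t % period] = gamma * (y[t] - level_new) + (1 - gamma) * s
            level = level_new
        return level + seasonal[len(y) % period]            # one-step-ahead forecast

    print(seasonal_smooth([10, 20, 15, 5, 12, 22, 17, 6], period=4))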

Special actions:

Promotions influence sales in a way analogous to seasonality, but they are usually not periodic. Multiplicative or additive action coefficients are estimated from the data in much the same way as seasonal coefficients. They are assumed to undergo small variations over time.
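One possible implementation, sketched here purely for illustration (this is not Forecast Pro's actual algorithm), carries a multiplicative index per event type and updates it only in the periods where the event occurs:

    # A multiplicative promotion index, updated only when the promotion runs
    # (analogous to a seasonal coefficient, but aperiodic).
    def update_promo_index(obs, level, promo_index, on_promo, delta=0.2):
        if on_promo:
            lift = obs / level                   # observed lift vs. the baseline level
            promo_index = delta * lift + (1 - delta) * promo_index
        return promo_index

    print(update_promo_index(obs=150.0, level=100.0, promo_index=1.4, on_promo=True))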

Random effects:

The level, trend, seasonal and special action coefficients are random variables: their values change unpredictably over time. These changes result from unknown causes, such as those that cause a company's profit, or loss, to differ from what was expected. They are often called random shocks.

These components together form the structural model of the time series on which exponential smoothing is based. The process under study is assumed to contain one or more of them, together with one final ingredient:

The noise:

Everything described so far constitutes the components of a stochastic process. Our measurements of that process, however, are marred by noise, or measurement error. For example, confectionery shipments or confectionery orders are noisy measures of confectionery consumption.

Three of these characteristics (level, random effects and noise) are present in every exponential smoothing model. The other three (local trend, seasonal coefficients and special actions) may be present or absent. Identifying a model consists of determining which of these characteristics must be included to describe the data correctly.

Originally, exponential smoothing models were built on these characteristics with no particular attention to the underlying statistical model. The exponential smoothing equations provided consistent means of estimating the characteristics of a time series and making point forecasts. There was, however, no way to compute a correct confidence interval, because the interval depends on the underlying statistical model.

Some forecasting software developers responded to the need for confidence intervals with little or no theoretical justification: while the point forecasts of these methods were correct, the confidence limits were unusable.

Forecast Pro takes a thoroughly modern approach to exponential smoothing. Each type of smoothing model is based on a formal statistical model that serves as the basis for calculating confidence limits. The smoothing equations used are the Kalman filters derived from that formal model.
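For the simplest case, simple smoothing viewed as a local-level state space model with additive normal errors, the h-step-ahead forecast variance has the closed form sigma^2 * (1 + (h - 1) * alpha^2). This is exactly the kind of result a formal model makes possible; a minimal sketch under those assumptions (the function name and parameter values are illustrative):

    import math

    # 95% interval for an h-step-ahead forecast from simple smoothing, using
    # its underlying local-level model with additive normal errors:
    #   var_h = sigma^2 * (1 + (h - 1) * alpha^2)
    def interval(point, sigma, alpha, h, z=1.96):
        half = z * sigma * math.sqrt(1 + (h - 1) * alpha ** 2)
        return point - half, point + half

    print(interval(point=100.0, sigma=5.0, alpha=0.3, h=4))

Because the variance grows with the horizon h, the confidence limits widen as the forecast reaches further into the future, as they should.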

A detailed description

A detailed description of the Exponential Smoothing methodology is available in French here. The English version will be available soon.

PREDICONSULT offers a training course on “Forecasting Techniques” that includes a detailed presentation of Exponential Smoothing.