Financial and Economic Time Series
Financial and Economic Time Series refer to data collected over regular intervals to analyze specific characteristics of financial and economic activities over time. This data is crucial for forecasting future trends, helping managers in decision-making processes related to buying, selling, production, and hiring. Key techniques for analyzing these time series include moving average models, autoregressive models, and integrated approaches such as ARIMA, which combine elements of both methods.
The analysis distinguishes between deterministic variables—such as trends, seasonal fluctuations, and business cycles—and stochastic variables, which are influenced by random events. Understanding these variables is vital for accurate modeling and forecasting. For effective forecasting, time series data must be stationary, meaning its statistical properties remain constant over time. The insights gained from time series analysis can empower organizations to anticipate market changes and maintain a competitive edge.
On this Page
- Statistics > Financial & Economic Time Series
- Overview
- Time Series Data
- Deterministic & Stochastic Variables
- Stationarity
- Applications
- Techniques for Forecasting Stationary Time Series Data
- Naïve Forecasting Models
- Averaging Models
- Determining Trends Using Time Series Data
- Auto-Regression Time Series Modeling
- Integrated Techniques
- Terms & Concepts
- Bibliography
- Suggested Reading
Financial and Economic Time Series
In order to respond to the ever-changing needs of the marketplace, shifts in the industry, and other factors affecting the organization, managers need to estimate or predict future trends. This practice often involves the analysis of time series data — data gathered on a specific characteristic over a period of time. The goal of time series data analysis is to build a model that will allow managers or other decision makers to forecast future needs so that they can develop an appropriate strategy. There are a number of techniques available to forecast stationary time series data, but their application is part art and part science. Even in the simplest situations, one must determine which variables to include in the model and which variables to exclude. Several approaches are available for building time series models. These include moving average models, autoregressive techniques, and integrated techniques that incorporate both approaches in the manipulation and analysis of time series data.
Keywords Autoregression; Business Cycle; Moving Average; Seasonal Fluctuation; Stationarity; Stochastic; Time Series Data; Trend
Statistics > Financial & Economic Time Series
Overview
American philosopher George Santayana said that those who cannot learn from history are doomed to repeat it. Although this truism is proven time and again in political arenas, it has applicability in other areas, too. Certainly, businesses can learn from the past. Understanding the effects of trends, business cycles, seasonal fluctuations, and irregular or random events on the needs of the marketplace or the trajectory of the industry can help businesses leverage this knowledge into a better market position, anticipate coming needs, and remain competitive in the marketplace. This process is called forecasting: the science of estimating or predicting future trends. Forecasts are used to support managers in making decisions about many aspects of the business, including buying, selling, production, and hiring. Although there are purely judgmental approaches to forecasting that depend on the expertise and experience of the manager, statistical techniques that build data-driven models can help quantify the trends, seasonality, and patterns causing such fluctuations. Most of these techniques require the use of time series data.
Time Series Data
Time series data are data gathered on a specific characteristic over a period of time. Time series data are used in business forecasting to examine patterns, trends, and cycles from the past in order to predict patterns, trends, and cycles in the future. Time series methods include naïve methods, averaging, smoothing, regression analysis, and decomposition. These techniques are used in the forecasting of future trends or needs in decision making about many aspects of the business including buying, selling, production, and hiring.
To be useful for forecasting, time series data must be collected at intervals of regular length. In time series analysis, the sequence of observations is assumed to be a set of jointly distributed random variables. Unlike ad hoc approaches to forecasting, where it is impossible to tell whether or not the formula chosen is the most appropriate for the situation, in time series analysis one can study the structure of the correlation (i.e., the degree to which two events or variables are consistently related) over time to determine the appropriateness of the model.
The primary reason for the analysis of times series data is to be able to understand and predict patterns. Time series analysis typically involves observing and analyzing the patterns of historical data in order to extrapolate past trends into future forecasts. To do this, most statistical analysis of time series data involves model building, which is the development of a concise mathematical description of past events. These models, in turn, are used to forecast how the pattern will continue into the future.
Deterministic & Stochastic Variables
There are two types of variables involved in time series data and analysis: deterministic and stochastic. Deterministic variables are those for which there are specific causes or determiners. These include trends, business cycles, and seasonal fluctuations. Trends are persistent, underlying directions in which a factor or characteristic is moving in either the short, intermediate, or long term. Most trends are linear rather than cyclic, and grow or shrink steadily over a period of years. An example of a trend would be the increasing tendency to outsource and offshore technical support and customer service within many high tech companies. Not all trends are linear, however. Trends in new industries tend to be curvilinear as the demand for the new product or service grows after its introduction and then declines after the product or service becomes integrated into the economy.

Business cycles, a second type of deterministic factor, are continually recurring variations in total economic activity. Business cycles tend to occur across most sectors of the economy at the same time. For example, several years of a boom economy with expansion of economic activity (e.g., more jobs, higher sales) are often followed by slower growth or even contraction of economic activity. Business cycles may occur not only across one industry or business sector, but also across the economy in general.

Seasonal fluctuations, a third type of deterministic factor, are changes in economic activity that occur in a fairly regular annual pattern and are related to seasons of the year, the calendar, or holidays. For example, office supply stores experience an upsurge in business in August as children receive their school supply lists for the coming year. Similarly, the demand for heating oil is greater during the cool months than it is in the warm months.
Stochastic variables, on the other hand, are those that are caused by randomness or include an element of chance or probability. Stochastic variables include both irregular and random fluctuations in the economy that occur due to unpredictable factors. Examples of irregular variables include natural disasters such as earthquakes or floods, political disturbances such as war or change in the political party in charge, strikes, and other external factors. Other unpredictable or random factors that can affect a business's profitability include situations such as high absenteeism due to an epidemic.
A simple example of a stochastic time series is the random walk process. This is based on an investment theory that claims that market prices follow a random path up and down and are not influenced by past price movements. This theory concludes that it is impossible to predict the direction of the market with any degree of accuracy, particularly in the short term. In the random walk process, each successive change is independently drawn from a probability distribution with a mean of zero. The simplest example of a time series is one that is completely random (i.e., has no recognizable pattern).
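The random walk described above can be sketched in a few lines of Python. The function name and the choice of a Gaussian step distribution are illustrative assumptions; the theory requires only that each change be an independent draw with mean zero.

```python
import random

def random_walk(n_steps, seed=42):
    """Simulate a random walk: each successive change is an independent
    draw from a zero-mean distribution (here, Gaussian with sigma = 1)."""
    rng = random.Random(seed)  # seeded for reproducibility
    level = 0.0
    path = [level]
    for _ in range(n_steps):
        level += rng.gauss(0.0, 1.0)  # zero-mean random step
        path.append(level)
    return path

path = random_walk(250)
print(len(path), path[-1])  # 251 observations; the end level is unpredictable
```

Because each step has mean zero, the best forecast of tomorrow's level is simply today's level, which is why the random walk is so difficult to beat in short-term market prediction.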
Forecasting in the real world typically involves many variables. Although in theory a purely deterministic model is possible, the complexity of real world problems usually results in situations involving both deterministic and stochastic variables. Business and economic problems usually involve unknown variables or uncontrollable factors. As a result, most time series in the business world are stochastic in nature.
Stationarity
Another characteristic of time series is stationarity. This condition exists when the probability distribution of a time series does not change over time. Stationarity is of interest to analysts because when one can assume that the underlying stochastic process is invariant with respect to time (i.e., stationary), then one can mathematically model the process with an equation with fixed coefficients that estimate future values from past history. If the process is assumed to be stationary, the probability of a given fluctuation in the process is assumed to be the same at any given point in time (i.e., invariant with respect to time). If, on the other hand, the process is non-stationary, it is difficult to mathematically model the process using a simple algebraic equation. Unfortunately, few processes of interest in business and economics are truly stationary. However, it is often possible to use a simple mathematical procedure, such as differencing, to transform non-stationary processes into ones that are approximately stationary for purposes of analysis.
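The most common such transformation is differencing, in which each observation is replaced by its change from the previous period. A minimal sketch (the function name is illustrative):

```python
def difference(series, lag=1):
    """First-difference transform: y_t = x_t - x_{t-lag}.
    Differencing removes a linear trend, often leaving an
    approximately stationary series suitable for modeling."""
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

# A steadily trending (non-stationary) series: its first differences
# are constant, i.e., the transformed series no longer drifts.
trend = [2.0 * t for t in range(6)]   # 0, 2, 4, 6, 8, 10
print(difference(trend))              # [2.0, 2.0, 2.0, 2.0, 2.0]
```

A lag of one handles trends; a lag equal to the seasonal period (e.g., 12 for monthly data) can remove an annual seasonal pattern in the same way.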
Applications
Using time series data in econometric model building is part art and part science. Even in the simplest situations, one must determine which variables to include in the model and which variables to exclude. The goal of time series data analysis is to build a model that will allow managers or other decision makers to forecast future needs. There are several steps in developing time series models. First, one must specify the parameters of the model. Decisions to be made at this stage include the degree of homogeneity in the time series and the order of the moving average and autoregressive components of the analysis. After the model has been specified, it must be estimated. This is frequently done using nonlinear regression. The next step is to examine the autocorrelation function using a simple chi-square test to determine whether the residuals are uncorrelated. The parameter estimates should also be checked at this point to determine if they appear to be stationary. The model must next be evaluated to determine whether or not it can be used to make accurate forecasts. This can be done through such methods as historical simulation starting at different points of time. Model building is an iterative process. If the model is not successful at this point, it can be manipulated to better represent the real world situation.
Techniques for Forecasting Stationary Time Series Data
There are a number of techniques available to forecast stationary time series data (i.e., those that show no significant trend, cyclic, or seasonal effects). Different approaches, however, often yield different results. To help determine which forecast better models a given set of data, the forecaster needs to determine the amount of forecasting error produced by each technique. Error is the difference between the forecasted value and the actual value of a variable. Techniques for measuring error include mean error, mean absolute deviation, mean square error, mean percentage error, and mean absolute percentage error.
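These error measures can be computed directly from paired actual and forecasted values. A minimal Python sketch, with an illustrative function name and made-up sample data:

```python
def forecast_errors(actual, forecast):
    """Common error measures for comparing forecasting techniques.
    Error is defined as actual minus forecasted value."""
    errors = [a - f for a, f in zip(actual, forecast)]
    n = len(errors)
    return {
        "mean_error": sum(errors) / n,
        "mean_absolute_deviation": sum(abs(e) for e in errors) / n,
        "mean_square_error": sum(e * e for e in errors) / n,
        "mean_absolute_percentage_error":
            sum(abs(e) / abs(a) for e, a in zip(errors, actual)) / n * 100,
    }

metrics = forecast_errors(actual=[100, 110, 105], forecast=[98, 112, 105])
print(metrics["mean_absolute_deviation"])  # (2 + 2 + 0) / 3, about 1.33
```

Note that mean error can mask poor forecasts when positive and negative errors cancel (here it is zero), which is why absolute and squared measures are usually preferred for comparing techniques.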
Naïve Forecasting Models
Several types of techniques are used to smooth out irregular fluctuation effects in time series data. Naïve forecasting models offer one approach to smoothing. These are simple models that assume that the best predictors of future outcomes are the most recent data in the time series. This assumption means that naïve forecasting models do not consider the possibility of trends, business cycles, or seasonal fluctuations. As a result, naïve forecasting models work better on data that are reported more frequently (e.g., daily or weekly) or in situations without trends or seasonality. For example, if ten gross of widgets were sold last month, a naïve model would conclude that ten gross of widgets will also be sold next month. However, since naïve model forecasts are often based on the observations of a single time period, they can easily become a function of irregular fluctuations in the data (e.g., Acme Corporation made a one-time purchase of widgets to set up its new operations facility, which accounts for the unusually high demand for widgets in the previous month).
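A naïve forecast reduces to returning the latest observation, as in this brief sketch (names and data are illustrative):

```python
def naive_forecast(series):
    """Naïve model: the forecast for the next period is simply the most
    recent observation -- no trend, cycle, or seasonality considered."""
    return series[-1]

monthly_widget_sales = [9, 11, 10]           # gross of widgets per month
print(naive_forecast(monthly_widget_sales))  # 10
```

The weakness is immediate: if last month's value of 10 included a one-time irregular purchase, the naïve model projects that irregularity forward as if it were the norm.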
Averaging Models
Another approach to smoothing time series data uses averaging models. This approach helps neutralize the problem of naïve models, in which the forecast is overly sensitive to irregular fluctuations as illustrated in the previous example. In averaging models, the data from several time periods are taken into account. In the simple average model, the forecast for the upcoming time period is the average of the values for a specified number of previous time periods. For example, the forecast of widget sales for next month might be the average number of widgets sold per month over the past six months. Moving averages, on the other hand, not only use the average value from previous time periods to forecast future time periods, but update this average in each ensuing time period by including the new values not available in the previous average and dropping out the data from the earliest time periods. Although this approach has the advantage of taking into account the most recent data available, it can be difficult to choose the optimal length of time over which to compute the moving average. In addition, moving averages do not take into account the effects of trends, business cycles, and seasonal fluctuations. To help overcome some of the problems inherent in moving averages, the analyst may use a weighted moving average, which gives more weight to some time periods in the series than to others. For example, if three months ago Widget Corporation introduced their redesigned product to the marketplace, the analyst might believe that the past three months reflect the market's reaction to the new design and be better able to forecast the continuing reaction than if s/he did not have this information. In addition to naïve and averaging approaches to smoothing time series data, there are exponential smoothing techniques. These techniques weight data from previous time periods with exponentially decreasing importance. In other words, the new forecast is a weighted combination of the most recent actual value and the previous forecast.
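The simple moving average, weighted moving average, and exponential smoothing techniques described above can be sketched as follows; function names, weights, and sample data are illustrative choices:

```python
def moving_average_forecast(series, window):
    """Forecast the next period as the average of the last `window` values."""
    return sum(series[-window:]) / window

def weighted_moving_average(series, weights):
    """Weighted moving average: weights (most recent last) let the analyst
    emphasize some time periods over others."""
    recent = series[-len(weights):]
    return sum(w * x for w, x in zip(weights, recent)) / sum(weights)

def exponential_smoothing_forecast(series, alpha):
    """Simple exponential smoothing: new forecast =
    alpha * actual + (1 - alpha) * previous forecast, so older
    observations receive exponentially decreasing weight."""
    forecast = series[0]  # initialize with the first observation
    for actual in series[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

sales = [10, 12, 11, 13, 12, 14]
print(moving_average_forecast(sales, window=3))   # (13 + 12 + 14) / 3 = 13.0
print(weighted_moving_average(sales, [1, 2, 3]))  # recent months count more
print(exponential_smoothing_forecast(sales, alpha=0.3))
```

The smoothing constant alpha (between 0 and 1) plays the role the window length plays for moving averages: a larger alpha reacts faster to recent changes but smooths less.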
Determining Trends Using Time Series Data
Although these approaches to time series modeling can be helpful for simple data sets, they do not account well for trends. However, there are several approaches to analyzing time series data to determine the influence of long-term changes in the business climate. Two of the simplest of these approaches are linear regression and regression using quadratic models. In order for these methods to produce accurate forecasts, however, the time series data cannot be influenced by seasonal fluctuations. If it is assumed that there is a seasonal effect influencing the time series data, other techniques must be used. One frequently used technique is decomposition, in which the time series data are broken down into the four component factors of trend, business cycle, seasonal fluctuation, and irregular or random fluctuation.
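For series free of seasonal effects, the linear regression approach amounts to fitting a straight line y_t = a + b*t by ordinary least squares, with the slope b estimating the trend per period. A minimal sketch (function name and data are illustrative):

```python
def fit_linear_trend(series):
    """Ordinary least squares fit of y_t = a + b * t, where t = 0, 1, ...
    Returns (intercept, slope); the slope estimates the trend per period."""
    n = len(series)
    t_mean = (n - 1) / 2                      # mean of t = 0 .. n-1
    y_mean = sum(series) / n
    ss_ty = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    ss_tt = sum((t - t_mean) ** 2 for t in range(n))
    b = ss_ty / ss_tt                         # slope
    a = y_mean - b * t_mean                   # intercept
    return a, b

# A series climbing by roughly 2 units per period.
a, b = fit_linear_trend([5.1, 6.9, 9.2, 10.8, 13.1])
print(round(b, 2))  # slope near 2
```

A quadratic model extends the same least-squares idea with a t-squared term, which is useful for the curvilinear trends noted earlier for new industries.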
Time series models can produce spurious results when the error terms of the model are correlated with each other. This situation is referred to as autocorrelation or serial correlation. Autocorrelation causes problems in the use of regression analysis because regression analysis assumes that the error terms are independent and uncorrelated. When autocorrelation occurs, the estimates of the regression coefficients may be inefficient. In addition, both the variance of the error terms and the true standard deviation may be significantly underestimated. Also, under autocorrelation the confidence intervals and the t and F tests are no longer strictly applicable. There are, however, a number of ways to determine whether or not autocorrelation is present in time series data (e.g., the Durbin-Watson test). Ways to correct for autocorrelated data include adding independent variables and transforming variables.
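The Durbin-Watson statistic mentioned above can be computed directly from a model's residuals; values near 2 suggest no first-order autocorrelation, while values near 0 or 4 suggest positive or negative serial correlation. A sketch with illustrative data:

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic: the sum of squared successive differences
    of the residuals divided by the sum of squared residuals.
    Ranges from 0 to 4; values near 2 indicate no first-order
    autocorrelation."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

# Alternating residuals exhibit negative serial correlation,
# so the statistic lands well above 2.
print(durbin_watson([1, -1, 1, -1, 1, -1]))
```

In practice the statistic is compared against tabulated lower and upper critical bounds that depend on the sample size and number of regressors.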
Auto-Regression Time Series Modeling
Another approach to modeling time series data is auto-regression. This is a multiple regression technique used in forecasting in which future values of the variable are predicted from past values of the variable. Auto-regression takes advantage of the relationship of values to the values of previous time periods. In this approach, the independent variables are time-lagged versions of the dependent variable. In other words, one tries to forecast a future value of a variable from knowledge of that variable's value in previous time periods. This can be done for multiple previous time periods. This approach can be useful for locating both seasonal and cyclic effects.
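An autoregressive model of order one, AR(1), forecasts each value from the immediately preceding one. The sketch below estimates the lag coefficient by least squares on the lagged series; the intercept is omitted for simplicity, and names and data are illustrative:

```python
def fit_ar1(series):
    """Estimate an AR(1) model x_t = phi * x_{t-1} + e_t by least squares:
    regress each value on the previous value (no intercept, for brevity)."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def ar1_forecast(series, phi):
    """One-step-ahead forecast from the fitted lag coefficient."""
    return phi * series[-1]

# A decaying series where each value is roughly half the previous one.
series = [16.0, 8.1, 3.9, 2.1, 1.0]
phi = fit_ar1(series)
print(round(phi, 2), round(ar1_forecast(series, phi), 2))
```

Higher-order models simply add more time-lagged versions of the dependent variable as regressors; lags at 12 months, for example, can pick up the seasonal effects mentioned above.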
Integrated Techniques
In addition to moving average and autoregressive models, time series data can be modeled using mixed or integrated techniques that utilize both approaches. One of these approaches to model fitting is the autoregressive integrated moving average (ARIMA) model (also called the Box-Jenkins model). This is an integrated tool for understanding and forecasting using time series data. An ARIMA model has both an autoregressive and a moving average component. Although ARIMA modeling techniques can be difficult to compute and interpret, they are powerful and frequently result in a better model than either the use of moving averages or autoregressive techniques alone. Specifically, ARIMA can be used to determine the length of the weights (i.e., how much of the past should be used to predict the next observation) and the values of these weights. Random walk, autoregressive models, and exponential models are special cases of ARIMA models.
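The interplay of the ARIMA components can be illustrated with a toy ARIMA(1,1,0)-style forecast: difference the series once (the "integrated" step), fit an autoregressive coefficient to the differences, and integrate back. This is a sketch only; production work would use a full estimation routine such as the ARIMA class in the statsmodels library, which also estimates the moving average terms.

```python
def arima_110_forecast(series):
    """Toy ARIMA(1,1,0)-style one-step forecast:
    1. difference the series once (the integrated, 'I', step);
    2. fit an AR(1) coefficient to the differences by least squares;
    3. forecast the next difference and add it back to the last level."""
    diffs = [series[t] - series[t - 1] for t in range(1, len(series))]
    num = sum(diffs[t] * diffs[t - 1] for t in range(1, len(diffs)))
    den = sum(diffs[t - 1] ** 2 for t in range(1, len(diffs)))
    phi = num / den
    next_diff = phi * diffs[-1]    # AR(1) forecast of the next change
    return series[-1] + next_diff  # integrate back to the original level

# A trending series: differencing handles the trend, AR(1) the dynamics.
print(round(arima_110_forecast([10.0, 12.0, 13.0, 15.0, 16.0]), 2))
```

Setting phi to zero recovers the naïve forecast, and dropping the differencing step recovers the plain AR(1) model, which illustrates why those simpler techniques are special cases of ARIMA.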
Terms & Concepts
Autocorrelation (also called serial correlation): A problem occurring over time in regression analysis when the error terms of the forecasting model are correlated.
Auto-Regression: A multiple regression technique used in forecasting in which future values of the variable are predicted from past values of the variable.
Autoregressive Integrated Moving Average (ARIMA): An integrated tool for understanding and forecasting using time series data. The ARIMA model has both an autoregressive and a moving average component. The ARIMA model is also referred to as the Box-Jenkins model.
Business Cycle: A continually recurring variation in total economic activity. Such expansions or contractions of economic activity tend to occur across most sectors of the economy at the same time.
Data: (sing. datum) In statistics, data are quantifiable observations or measurements that are used as the basis of scientific research.
Deterministic Variables: Variables for which there are specific causes or determiners. These include trends, business cycles, and seasonal fluctuations.
Decomposition: The process of breaking down time series data into the component factors of trends, business cycles, seasonal fluctuations, and irregular or random fluctuations.
Forecasting: In business, forecasting is the science of estimating or predicting future trends. Forecasts are used to support managers in making decisions about many aspects of the business including buying, selling, production, and hiring.
Moving Average: A method used in forecasting in which the average value from previous time periods is used to forecast future time periods. The average is updated in each ensuing time period by including the new values not available in the previous average and dropping out the data from the earliest time periods.
Seasonal Fluctuation: Changes in economic activity that occur in a fairly regular annual pattern. Seasonal fluctuations may be related to seasons of the year, the calendar, or holidays.
Stationarity: The condition of a random process where its statistical properties do not vary with time.
Stochastic: Involving chance or probability. Stochastic variables are random or have an element of chance or probability associated with their occurrence.
Time Series Data: Data gathered on a specific characteristic over a period of time. Time series data are used in business forecasting. To be useful, time series data must be collected at intervals of regular length.
Trend: The persistent, underlying direction in which something is moving in either the short, intermediate, or long term. Identification of a trend allows one to better plan to meet future needs.
Variable: An object in a research study that can have more than one value. Independent variables are stimuli that are manipulated in order to determine their effect on the dependent variables (response). Extraneous variables are variables that affect the response but that are not related to the question under investigation in the study.
Bibliography
Arestis, P., Luintel, A. D., & Luintel, K. B. (2010). Financial structure and economic growth: Evidence from time series analyses. Applied Financial Economics, 20, 1479-1492. Retrieved November 15, 2013, from EBSCO Online Database Business Source Complete. http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=53921305&site=ehost-live
Black, K. (2006). Business statistics for contemporary decision making (4th ed.). New York: John Wiley & Sons.
Keho, Y. (2010). Effect of financial development on economic growth: Does inflation matter? time series evidence from the UEMOA countries. International Economic Journal, 24, 343-355. Retrieved November 15, 2013, from EBSCO Online Database Business Source Complete. http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=53854472&site=ehost-live
Nazem, S. M. (1988). Applied time series analysis for business and economic forecasting. New York: Marcel Dekker.
Pindyck, R. S. & Rubinfeld, D. L. (1998). Econometric models and economic forecasts. Boston: Irwin/McGraw-Hill.
Rodrigues, P. M., Rubia, A., & Valle e Azevedo, J. (2013). Finite sample performance of frequency- and time-domain tests for seasonal fractional integration. Journal of Statistical Computation & Simulation, 83, 1373-1384. Retrieved November 15, 2013, from EBSCO Online Database Business Source Complete. http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=89029063&site=ehost-live
Suggested Reading
Armstrong, J. S. & Collopy, F. (1998). Integration of statistical methods and judgment for time series forecasting: Principles from empirical research. In Wright, G. & Goodwin, P. (Eds.). Forecasting with Judgment. New York: John Wiley & Sons.
Dauten, C. A. & Valentine, L. M. (1978). Business cycles and forecasting (5th ed.). Cincinnati: South-Western Publishing Co.
Di Giacinto, V. (2006). A generalized space-time ARMA model with an application to regional unemployment analysis in Italy. International Regional Science Review, 29 , 159-198. Retrieved May 24, 2007, from EBSCO Online Database Business Source Complete. http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=20711879&site=ehost-live
Makridakis, S. & Wheelwright, S. C. (1982). Introduction to management forecasting: Status and needs. In Makridakis, S. & Wheelwright, S. C. (Eds.).The Handbook of Forecasting: A Manager's Guide. New York: John Wiley & Sons.
Morrell, J. (2001). How to forecast: A guide for business. Burlington, VT: Gower.
Nelson, C. R. (1973). Applied time series analysis for managerial forecasting. San Francisco: Holden-Day.
Wynne, B. E. & Hall, D. A. (1982). Forecasting requirements for operations planning and control. In Makridakis, S. & Wheelwright, S. C. (Eds.).The Handbook of Forecasting: A Manager's Guide. New York: John Wiley & Sons.