From the February 2013 issue of Futures Magazine

Breaking new ground with neural nets

Process overview

Developing a Box-Jenkins model requires four steps:

  1. Model identification
  2. Model estimation
  3. Diagnostic checking
  4. Forecasting

The last three steps are similar to those for linear regression, such as the use of the Pearson correlation coefficient and the t-statistic, so we omit them here on the assumption that they are familiar to most readers (and available for review in any basic statistics text, if necessary). The first step in developing a Box-Jenkins model, however, requires the judicious use of discretion based on domain expertise.
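To make the workflow concrete, here is a minimal sketch of the four steps in Python using the statsmodels library. The library choice and the simulated price series are assumptions made for illustration, not part of the original article.

```python
# A minimal sketch of the four-step Box-Jenkins workflow using statsmodels;
# "prices" is a simulated placeholder series, not real market data.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
prices = pd.Series(100 + np.cumsum(rng.normal(0, 1, 500)))  # placeholder price series
returns = prices.diff().dropna()                            # work with a stationary series

# 1. Identification: inspect the ACF and PACF (see the correlogram sketch below).
# 2. Estimation: fit a candidate model, here an AR(1) chosen only for illustration.
model = ARIMA(returns, order=(1, 0, 0)).fit()
print(model.summary())                                      # t-statistics for each term

# 3. Diagnostic checking: residuals should look like white noise.
print(acorr_ljungbox(model.resid, lags=[10]))

# 4. Forecasting: project the next five values.
print(model.forecast(steps=5))
```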

To develop a model, we must identify the proper form — AR, MA or ARMA — and how many terms are needed. We answer these questions with two functions computed from the series itself: the autocorrelation function (ACF) and the partial autocorrelation function (PACF).

The ACF and PACF are like classic correlation functions, with values ranging from -1.00 to 1.00 if the time series is stationary. In an exception to the classic definition, the PACF uses lagged values of the time series itself as the independent variable.

When the regression includes only one independent variable, lagged one period, the coefficient of that variable is called the first-order partial autocorrelation. If a second term with a two-period lag is added, the coefficient of the second term is called the second-order partial autocorrelation, and so on.
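That definition can be checked directly: regress the series on its own lags and read off the coefficient of the longest lag. The sketch below does this for a simulated AR(1) series; the data and the use of numpy and statsmodels are illustrative assumptions only.

```python
# Rough illustration of the "coefficient of the last lag" definition of the PACF.
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(1, 500):                       # simulate an AR(1) process
    x[t] = 0.6 * x[t - 1] + rng.normal()

def pacf_by_regression(x, k):
    """k-th order partial autocorrelation: the coefficient of the k-period lag
    when x_t is regressed on x_{t-1}, ..., x_{t-k} plus a constant."""
    y = x[k:]
    lags = np.column_stack([x[k - j:len(x) - j] for j in range(1, k + 1)])
    X = np.column_stack([np.ones(len(y)), lags])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[-1]                           # coefficient of the k-period lag

print(pacf_by_regression(x, 1))               # roughly 0.6 for this AR(1) example
print(pacf_by_regression(x, 2))               # near zero
print(pacf(x, nlags=2, method="ols")[1:])     # statsmodels' OLS-based PACF agrees
```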

To identify an appropriate model, we plot the ACF and PACF in a correlogram, which gives a good visual indication of the model. The pattern of decay in the ACF and the number of significant spikes in the PACF tell us how many terms we need. If the ACF fades quickly with increasing lag and the PACF has only one spike, then we use a first-order AR model.
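A correlogram of this kind is easy to produce. The short sketch below uses statsmodels' plotting helpers on simulated AR(1) data, again an assumption made only for illustration, where the ACF decays quickly and the PACF shows a single spike.

```python
# A minimal correlogram sketch; "returns" is a simulated stationary series.
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(2)
returns = np.zeros(500)
for t in range(1, 500):                  # AR(1)-style placeholder data
    returns[t] = 0.5 * returns[t - 1] + rng.normal()

fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(returns, lags=20, ax=axes[0])   # ACF: decays quickly with lag
plot_pacf(returns, lags=20, ax=axes[1])  # PACF: one significant spike suggests AR(1)
plt.tight_layout()
plt.show()
```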

One reason Box-Jenkins is important is that early work on predicting market data used neural networks in place of Box-Jenkins methods. Comparisons showed that neural network models, such as back-propagation networks, and related methods such as kernel regression performed as well as or better than Box-Jenkins. Box-Jenkins uses only the price data itself, but neural networks can include truly independent variables, such as intermarket relationships and fundamental data. The ability to use truly independent series makes neural networks and kernel regression more powerful for forecasting market data.
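As a rough illustration of that point, and not taken from the article, the sketch below fits a small back-propagation network (scikit-learn's MLPRegressor, our choice) to forecast one market's return from its own lag plus a hypothetical intermarket series.

```python
# Hedged sketch: a small neural net using both the target's own lag and an
# independent intermarket input. All data here is simulated for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n = 600
intermarket = rng.normal(size=n)                      # e.g. a related market's returns
target = 0.4 * np.roll(intermarket, 1) + 0.2 * rng.normal(size=n)
target[0] = 0.0                                       # discard the wraparound value

# Features: the target's own one-period lag plus the lagged intermarket series.
X = np.column_stack([np.roll(target, 1)[1:], np.roll(intermarket, 1)[1:]])
y = target[1:]

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
net.fit(X[:500], y[:500])                             # train on the first 500 bars
print("out-of-sample R^2:", net.score(X[500:], y[500:]))
```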

The computer speed and software available today expand how we can use neural networks and kernel regression. We can use them in models that trade at the portfolio level because software such as TradersStudio offers neural network technology within a strong portfolio-based trading platform, and increases in computer speed make these solutions feasible. TradersStudio also offers advanced handling of splits and dividends for equities. As a result, we can use neural networks to trade baskets of stocks, exchange-traded funds and mutual funds, and to trade futures portfolio systems, in addition to developing single-market systems such as those for the S&P 500.
