Neural networks over classical models in time series

Snigdha Cheekoty
Towards Data Science
3 min read · Jul 22, 2019


Why we shouldn’t stop at classical models in time series analysis, and why we should go further and unlock the power of neural networks.

This article discusses the capabilities of various kinds of neural networks in time series modeling.

Classical machine learning models for time series forecasting are much more difficult to implement than standard supervised and unsupervised learning models because of the temporal dependence in the data: we model the data against the same data at earlier time steps. This makes model fitting and model evaluation relatively difficult.
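To make this temporal framing concrete, here is a minimal sketch in pandas of how a series becomes its own lagged input; the column names and toy values are illustrative, not from any real dataset:

```python
import pandas as pd

# A toy series: in time series forecasting, the target and the features
# are the same variable observed at different time steps.
series = pd.Series([112, 118, 132, 129, 121, 135, 148, 148], name="y")

df = pd.DataFrame({
    "y":      series,           # value at time t (the target)
    "y_lag1": series.shift(1),  # value at time t-1 (a feature)
    "y_lag2": series.shift(2),  # value at time t-2 (a feature)
}).dropna()  # the earliest rows have no history, so drop them

print(df)
```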

ARIMA is a very popular tool because it can be applied to almost any kind of time series data, is quite easy to understand, and is effective in practice. There are three main limitations of classical models that can be overcome by deep learning methods, as discussed below.
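As a point of reference, here is a minimal sketch of fitting an ARIMA model with statsmodels; the series is toy data and the order (1, 1, 1) is an arbitrary placeholder, not a recommendation:

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Toy monthly series; any univariate series indexed by time will do.
series = pd.Series([112, 118, 132, 129, 121, 135, 148, 148, 136, 119],
                   index=pd.date_range("2019-01-01", periods=10, freq="MS"))

model = ARIMA(series, order=(1, 1, 1))  # (p, d, q): AR terms, differencing, MA terms
fitted = model.fit()

print(fitted.forecast(steps=3))  # forecast the next three months
```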

Limitations of classical models (like Holt-Winters, ARIMA-based models, and other exponential smoothing models):

  1. Missing values are not supported.
  2. They assume linear relationships in the data, i.e., that the trend component follows an overall downward or upward straight line.
Figure: types of trends shown in Boston marathon data**

We can see that the linear trend line shows a generalized downward trend in the data, whereas the “cubic spline”, a non-linear trend line, captures the more intuitive non-linear pattern present in the data (a fitting sketch follows this list).

  3. These models work on univariate data. Most time series forecasting models don’t support multiple variables as inputs: we focus on only one variable, and the outcome of interest is nothing but the future version of that input variable (at a future time step). Exceptions are extensions like ARIMAX and SARIMAX, which accept exogenous variables when modeling the data (a sketch follows below).
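To make the non-linearity point concrete, here is a minimal sketch contrasting a straight trend line with a cubic smoothing spline; SciPy’s UnivariateSpline stands in for the cubic spline in the figure, and the data is synthetic rather than the marathon series:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

t = np.arange(50, dtype=float)
rng = np.random.default_rng(0)
y = 0.01 * (t - 25) ** 2 + rng.normal(0, 0.5, 50)  # curved trend plus noise

linear = np.polyfit(t, y, deg=1)            # straight trend line: y = a*t + b
spline = UnivariateSpline(t, y, k=3, s=20)  # cubic smoothing spline

# The linear fit misses the curvature that the spline recovers.
print(np.polyval(linear, 40), spline(40))
```

And a minimal sketch of the exogenous-variable exception via statsmodels’ SARIMAX; the order and the exogenous regressor are illustrative placeholders:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
y = pd.Series(rng.normal(size=60)).cumsum()  # toy target series
exog = pd.Series(rng.normal(size=60))        # toy exogenous regressor

model = SARIMAX(y, exog=exog, order=(1, 0, 0))
result = model.fit(disp=False)

print(result.params)  # includes a coefficient for the exogenous variable
```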

Deep learning neural networks

Deep learning networks like multi-layer perceptrons (MLPs), recurrent neural networks (RNNs) and convolutional neural networks (CNNs) have their own set of advantages and functionalities for time series forecasting.

Multi-layer perceptrons: Can handle missing values, model complex relationships (like non-linear trends) and support multiple inputs.

But MLPs have the disadvantage of requiring a fixed number of inputs to produce a fixed number of outputs: the temporal dependence must be specified upfront in the design of the model. These feedforward networks learn a fixed mapping function between inputs and outputs, which poses a problem when a sequence of inputs is provided to the model.
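Here is a minimal Keras sketch of this fixed mapping, assuming TensorFlow is installed: the window size (three lags in, one step out) is frozen into the architecture up front, and all layer sizes are illustrative.

```python
import numpy as np
from tensorflow import keras

def make_windows(series, n_lags=3):
    """Frame a series as supervised learning: n_lags past values -> next value."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

series = np.sin(np.linspace(0, 20, 200))  # toy data
X, y = make_windows(series)

# The input width (3) and output width (1) are baked into the layers.
mlp = keras.Sequential([
    keras.Input(shape=(3,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),
])
mlp.compile(optimizer="adam", loss="mse")
mlp.fit(X, y, epochs=5, verbose=0)
```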

Convolutional neural networks: Their main advantage is that feature engineering comes largely for free: they automatically extract relevant features from the raw input provided to the model.

CNNs also support multivariate input, model complex non-linear relationships, and are robust to noise (and missing values). But they have an upper hand over plain feedforward networks because they needn’t learn directly from lag observations: instead, they learn a representation of a large input sequence that is most relevant to the prediction problem.
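A comparable Conv1D sketch under the same assumptions (TensorFlow installed, toy data, arbitrary filter count and kernel size); the convolution learns local patterns over the lags instead of treating each lag as an independent input:

```python
import numpy as np
from tensorflow import keras

series = np.sin(np.linspace(0, 20, 200))  # toy data
X = np.array([series[i:i + 3] for i in range(len(series) - 3)])
y = series[3:]
X = X.reshape((X.shape[0], 3, 1))  # Conv1D expects (samples, timesteps, channels)

cnn = keras.Sequential([
    keras.Input(shape=(3, 1)),
    keras.layers.Conv1D(filters=8, kernel_size=2, activation="relu"),
    keras.layers.Flatten(),
    keras.layers.Dense(1),
])
cnn.compile(optimizer="adam", loss="mse")
cnn.fit(X, y, epochs=5, verbose=0)
```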

Recurrent neural networks, specifically LSTMs: Multivariate input, robustness to noise, multivariate output, automatic feature extraction and modeling of complex relationships in the data are all provided by LSTMs as well. They can also read an input sequence into the model one time step at a time, each step as a separate input vector.

In addition to the functionality offered by MLPs and CNNs, LSTMs can learn the mapping function from inputs to outputs over time: the mapping function is no longer fixed or static during model learning.
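A minimal LSTM sketch in the same setting (TensorFlow installed, toy data, illustrative sizes): the network reads the window one time step at a time, so the learned input-to-output mapping depends on the sequence rather than on a fixed lag layout.

```python
import numpy as np
from tensorflow import keras

series = np.sin(np.linspace(0, 20, 200))  # toy data
X = np.array([series[i:i + 3] for i in range(len(series) - 3)])
y = series[3:]
X = X.reshape((X.shape[0], 3, 1))  # (samples, timesteps, features)

lstm = keras.Sequential([
    keras.Input(shape=(3, 1)),
    keras.layers.LSTM(16),  # reads the sequence step by step, carrying state
    keras.layers.Dense(1),
])
lstm.compile(optimizer="adam", loss="mse")
lstm.fit(X, y, epochs=5, verbose=0)

print(lstm.predict(X[-1:], verbose=0))  # one-step-ahead forecast from the last window
```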

Stay tuned for more detailed articles on deep learning practices in time series analysis!

References:

** Picture taken from: Forecasting: Principles and Practice by Rob J. Hyndman and George Athanasopoulos. Online text: https://otexts.com/fpp2/nonlinear-regression.html
