On Strange Memory Effects in Long–term Forecasts using Regularised Recurrent Neural Networks

Authors

  • Arthur Lerke
  • Hermann Heßling

DOI:

https://doi.org/10.47839/ijc.21.1.2513

Keywords:

Time series, Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), long-term forecasting, autoregressive model, stock market

Abstract

Recurrent neural networks (RNNs) based on long short-term memory (LSTM) are used to predict the future from a given set of time series data. Usually, only one future time step is predicted. In this article, the capability of LSTM networks to look far into the future is explored. The time series data are taken from the evolution of share prices in stock trading. As expected, the further the forecast extends into the future, the stronger the deviations between prediction and reality. However, strange memory effects are observed. They range from periodic predictions (with periods of the order of one month) to predictions that are an exact copy of a long sequence from far earlier data. The mechanisms that trigger the recall of memory in LSTM networks seem to be rather independent of the behaviour of the time series data within the last “sliding window” or “batch”. Similar periodic predictions are also observed for GRU networks and when the number of trainable parameters is reduced drastically. A better understanding of the influence of regularisation details on RNNs may help improve their predictive power.
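
As a minimal sketch of the setup described above, the following TensorFlow/Keras code (TensorFlow is the framework cited in the references) trains an LSTM to predict one step ahead from a sliding window and then produces a long-term forecast by feeding each prediction back into the window. All concrete choices here, the random-walk stand-in data, the window length, the layer size and the training settings, are assumptions for illustration, not the authors' configuration.

    import numpy as np
    import tensorflow as tf

    WINDOW = 30  # sliding-window length (illustrative choice)

    def make_windows(series, window):
        # Turn a 1-D series into (window, next-value) training pairs.
        X = np.array([series[i:i + window] for i in range(len(series) - window)])
        return X[..., np.newaxis], series[window:]

    # Stand-in for real (normalised) adjusted closing prices: a random walk.
    prices = np.cumsum(np.random.randn(1000)).astype(np.float32)
    X, y = make_windows(prices, WINDOW)

    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(32, input_shape=(WINDOW, 1)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=20, batch_size=32)

    # Autoregressive roll-out: each one-step prediction is appended to the
    # window and fed back in, so deviations accumulate over the horizon.
    history = list(prices[-WINDOW:])
    forecast = []
    for _ in range(100):  # look 100 steps into the future
        x = np.asarray(history[-WINDOW:], dtype=np.float32).reshape(1, WINDOW, 1)
        next_val = float(model.predict(x, verbose=0)[0, 0])
        forecast.append(next_val)
        history.append(next_val)

It is in this roll-out regime, where predictions become inputs, that the periodic outputs and verbatim recalls of earlier training data reported in the article appear.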

References

P. J. Werbos, “Generalization of backpropagation with application to a recurrent gas market model,” Neural Networks, vol. 1, pp. 339–356, 1988.

S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Computation, vol. 9, pp. 1735–1780, 1997.

F. A. Gers and J. Schmidhuber, “LSTM recurrent networks learn simple context free and context sensitive languages,” IEEE Transactions on Neural Networks, vol. 12, no. 6, pp. 1333–1340, 2001.

K. Greff, R. K. Srivastava, J. Koutnik, B. R. Steunebrink, and J. Schmidhuber, “LSTM: A search space odyssey,” IEEE Transactions on Neural Networks and Learning Systems, vol. 28, no. 10, pp. 2222–2232, 2017.

K. Cho, B. van Merrienboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio, “Learning phrase representations using RNN encoder-decoder for statistical machine translation,” arXiv:1406.1078, 2014.

(2021) Time series forecasting. [Online]. Available: https://www.tensorflow.org/tutorials/structured_data/time_series#advanced_autoregressive_model

E. F. Fama, “Efficient capital markets: A review of theory and empirical work,” The Journal of Finance, vol. 25, no. 2, pp. 383–417, 1970. [Online]. Available: http://www.jstor.org/stable/2325486

R. Zanc, T. Cioara, and I. Anghel, “Forecasting financial markets using deep learning,” Proceedings of the IEEE 15th International Conference on Intelligent Computer Communication and Processing (ICCP), 2019, pp. 459–466.

H. Chung and K. Shin, “Genetic algorithm-optimized long short-term memory network for stock market prediction,” Sustainability, vol. 10, p. 3765, 2018.

A. Ganti. (2020) Adjusted closing price. [Online]. Available: https://www.investopedia.com/terms/a/adjusted_closing_price.asp

J. Brownlee. (2020) How to develop LSTM models for time series forecasting. [Online]. Available: https://machinelearningmastery.com/how-to-develop-lstm-models-for-time-series-forecasting/

R. C. Staudemeyer and E. R. Morris. (2019) Understanding LSTM – a tutorial into long short-term memory recurrent neural networks. [Online]. Available: https://www.researchgate.net/publication/335975993_Understanding_LSTM_--_a_tutorial_into_Long_Short-Term_Memory_Recurrent_Neural_Networks

D. P. Kingma and J. Ba, “Adam: A method for stochastic optimization,” arXiv:1412.6980, 2015.

(2021) TensorFlow software. [Online]. Available: https://www.tensorflow.org/

(2021) Training & evaluation with the built-in methods. [Online]. Available: https://keras.io/guides/training_with_built_in_methods/

(2021) TensorFlow API docs: CuDNNLSTM. [Online]. Available: https://www.tensorflow.org/api_docs/python/tf/compat/v1/keras/layers/CuDNNLSTM

(2021) NVIDIA T4. [Online]. Available: https://www.nvidia.com/en-us/data-center/tesla-t4/

(2021) FTSE All-World Index. [Online]. Available: http://www.ftse.com/Analytics/FactSheets/Home/DownloadSingleIssue/GAE?issueName=AWORLDS

M. Habiba and B. A. Pearlmutter, “Neural ordinary differential equation based recurrent neural network model,” Proceedings of the 31st Irish Signals and Systems Conference (ISSC), IEEE, 2020, pp. 1–6.

Published

2022-03-30

How to Cite

Lerke, A., & Heßling, H. (2022). On Strange Memory Effects in Long–term Forecasts using Regularised Recurrent Neural Networks. International Journal of Computing, 21(1), 19–24. https://doi.org/10.47839/ijc.21.1.2513

Issue

Vol. 21 No. 1 (2022)

Section

Articles