
Spatiotemporal adaptive neural network for long-term forecasting of financial time series. (English) Zbl 07414877

Summary: Optimal decision-making in social settings is often based on forecasts from time series (TS) data. Recently, several approaches using deep neural networks (DNNs) such as recurrent neural networks (RNNs) have been introduced for TS forecasting and have shown promising results. However, the applicability of these approaches is being questioned for TS settings where quality training data are scarce and where the TS to forecast exhibit complex behaviors. Examples of such settings include financial TS forecasting, where producing accurate and consistent long-term forecasts is notoriously difficult. In this work, we investigate whether DNN-based models can be used to forecast these TS conjointly by learning a joint representation of the series instead of computing the forecast from the raw time-series representations. To this end, we make use of the dynamic factor graph (DFG) to build a multivariate autoregressive model. We investigate a common limitation of RNNs that rely on the DFG framework and propose a novel variable-length attention-based mechanism (ACTM) to address it. With ACTM, it is possible to vary the autoregressive order of a TS model over time and to model a larger set of probability distributions than with previous approaches. Using this mechanism, we propose a self-supervised DNN architecture for multivariate TS forecasting that learns and exploits the relationships between the series. We test our model on two datasets covering 19 years of investment fund activities. Our experimental results show that the proposed approach significantly outperforms typical DNN-based and statistical models at forecasting the 21-day price trajectory. We point out how improving forecasting accuracy and knowing which forecaster to use can improve the excess return of autonomous trading strategies.
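The core idea of varying the autoregressive order over time can be illustrated with a toy sketch. The snippet below is not the paper's ACTM implementation: it is a minimal, hypothetical illustration in which softmax attention over recent lags produces a one-step forecast, so that the effective number of lags contributing to the prediction can change from one time step to the next. The scoring function and the parameters `query_w` and `key_w` are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_ar_forecast(history, query_w, key_w, max_order=5):
    """One-step forecast of a univariate series as an attention-weighted
    combination of its last `max_order` values. The softmax attention
    decides, per time step, how much each lag contributes, so the
    effective autoregressive order can shrink or grow over time --
    the flavor of the variable-order mechanism described above."""
    lags = history[-max_order:]            # most recent observations
    scores = query_w * (key_w * lags)      # toy lag-scoring function
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # softmax over the lags
    return float(weights @ lags)           # convex combination of lags

# Usage: forecast the next point of a noisy random-walk toy series.
series = np.cumsum(rng.normal(size=100)) * 0.1
pred = attention_ar_forecast(series, query_w=1.0, key_w=0.5)
```

Because the attention weights form a convex combination, the forecast always lies within the range of the attended lags; a learned model would instead produce the scores from hidden states and combine latent representations rather than raw values.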

MSC:

68T37 Reasoning under uncertainty in the context of artificial intelligence

References:

[1] Makridakis, S.; Hyndman, R. J.; Petropoulos, F., Forecasting in social settings: the state of the art, Int. J. Forecast., 36, 15-28 (2020)
[2] Mirowski, P.; LeCun, Y., Dynamic factor graphs for time series modeling, (Joint European Conference on Machine Learning and Knowledge Discovery in Databases (2009), Springer), 128-143
[3] Mirowski, P., Time series modeling with hidden variables and gradient-based algorithms (2011), Department of Computer Science, Courant Institute of Mathematical Sciences, New York University, Ph.D. Dissertation
[4] Koller, D.; Friedman, N., Probabilistic Graphical Models: Principles and Techniques (2009), MIT Press
[5] Bengio, Y., Learning deep architectures for AI, Found. Trends Mach. Learn., 2, 1-127 (2009) · Zbl 1192.68503
[6] Hyndman, R. J.; Athanasopoulos, G., Forecasting: Principles and Practice (2018), OTexts
[7] Yang, H.-F.; Dillon, T. S.; Chen, Y.-P. P., Optimized structure of the traffic flow forecasting model with a deep learning approach, IEEE Trans. Neural Netw. Learn. Syst., 28, 2371-2381 (2016)
[8] Olagoke, M. D.; Ayeni, A.; Hambali, M. A., Short term electric load forecasting using neural network and genetic algorithm, Int. J. Appl. Inf. Syst., 10, 22-28 (2016)
[9] Makridakis, S.; Spiliotis, E.; Assimakopoulos, V., Statistical and machine learning forecasting methods: concerns and ways forward, PLoS ONE, 13, Article e0194889 pp. (2018)
[10] Sezer, O. B.; Gudelek, M. U.; Ozbayoglu, A. M., Financial time series forecasting with deep learning: a systematic literature review: 2005-2019, Appl. Soft Comput., 90, Article 106181 pp. (2020)
[11] Žliobaitė, I., Learning under concept drift: an overview (2010), preprint
[12] Qin, Y.; Song, D.; Chen, H.; Cheng, W.; Jiang, G.; Cottrell, G., A dual-stage attention-based recurrent neural network for time series prediction, (International Joint Conference on Artificial Intelligence (2017))
[13] Bruche, M.; González-Aguado, C., Recovery rates, default probabilities, and the credit cycle, J. Bank. Finance, 34, 754-764 (2010)
[14] Marshall, A., Principles of Economics: Unabridged (2009), Cosimo, Inc.
[15] Bengio, Y.; Simard, P.; Frasconi, P., Learning long-term dependencies with gradient descent is difficult, IEEE Trans. Neural Netw., 5, 157-166 (1994)
[16] Rubanova, Y.; Chen, T. Q.; Duvenaud, D. K., Latent ordinary differential equations for irregularly-sampled time series, (Advances in Neural Information Processing Systems (2019)), 5321-5331
[17] Cho, K.; van Merrienboer, B.; Gulcehre, C.; Bahdanau, D.; Bougares, F.; Schwenk, H.; Bengio, Y., Learning phrase representations using RNN encoder-decoder for statistical machine translation, (Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, EMNLP (2014)), 1724-1734
[18] Borovykh, A.; Bohte, S.; Oosterlee, C. W., Dilated convolutional neural networks for time series forecasting, J. Comput. Finance (2018), forthcoming; available at SSRN (October 25, 2018)
[19] Chatfield, C., Apples, oranges and mean square error, Int. J. Forecast., 4, 515-518 (1988)
[20] Hyndman, R. J.; Koehler, A. B., Another look at measures of forecast accuracy, Int. J. Forecast., 22, 679-688 (2006)
[21] Rangapuram, S. S.; Seeger, M. W.; Gasthaus, J.; Stella, L.; Wang, Y.; Januschowski, T., Deep state space models for time series forecasting, (Advances in Neural Information Processing Systems (2018)), 7785-7794
[22] Smyl, S., A hybrid method of exponential smoothing and recurrent neural networks for time series forecasting, Int. J. Forecast., 36, 75-85 (2020)
[23] Makridakis, S.; Spiliotis, E.; Assimakopoulos, V., The M4 competition: results, findings, conclusion and way forward, Int. J. Forecast., 34, 802-808 (2018)
[24] Oreshkin, B. N.; Carpov, D.; Chapados, N.; Bengio, Y., N-beats: neural basis expansion analysis for interpretable time series forecasting, (International Conference on Learning Representations (2019))
[25] Godfrey, L. B.; Gashler, M. S., Neural decomposition of time-series data for effective generalization, IEEE Trans. Neural Netw. Learn. Syst., 29, 2973-2985 (2017)
[26] Hansen, J.; Nelson, R., Forecasting and recombining time-series components by using neural networks, J. Oper. Res. Soc., 54, 307-317 (2003) · Zbl 1171.91369
[27] Keener, J. P., The Perron-Frobenius theorem and the ranking of football teams, SIAM Rev., 35, 80-93 (1993) · Zbl 0788.62064
[28] Glynn, P. W.; Desai, P. Y., A probabilistic proof of the Perron-Frobenius theorem (2018), preprint
[29] Hochreiter, S.; Schmidhuber, J., Long short-term memory, Neural Comput., 9, 1735-1780 (1997)
[30] Graves, A., Adaptive computation time for recurrent neural networks (2016), preprint
[31] Olah, C.; Carter, S., Attention and augmented recurrent neural networks, Distill (2016)
[32] Tesauro, G., Temporal difference learning and TD-Gammon, Commun. ACM, 38, 58-68 (1995)
[33] Schwendener, A., The estimation of financial markets by means of a regime-switching model (2010), University of St. Gallen, Ph.D. thesis
[34] Ziat, A.; Delasalles, E.; Denoyer, L.; Gallinari, P., Spatio-temporal neural networks for space-time series forecasting and relations discovery, (2017 IEEE ICDM (2017), IEEE), 705-714
[35] Kingma, D. P.; Ba, J., Adam: a method for stochastic optimization, (Bengio, Y.; LeCun, Y., 3rd International Conference on Learning Representations. 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings (2015))
[36] Loshchilov, I.; Hutter, F., SGDR: stochastic gradient descent with warm restarts, (International Conference on Learning Representations (ICLR) 2017 Conference Track (2017))
[37] Balandat, M.; Karrer, B.; Jiang, D. R.; Daulton, S.; Letham, B.; Wilson, A. G.; Bakshy, E., BoTorch: programmable Bayesian optimization in PyTorch (2019), preprint
[38] Smith, T. G., pmdarima: ARIMA estimators for Python (2017), Online
[39] Van Den Oord, A.; Dieleman, S.; Zen, H.; Simonyan, K.; Vinyals, O.; Graves, A.; Kalchbrenner, N.; Senior, A. W.; Kavukcuoglu, K., Wavenet: a generative model for raw audio, SSW, 125 (2016)
[40] Hamilton, J. D., A new approach to the economic analysis of nonstationary time series and the business cycle, Econometrica, 57, 357-384 (1989) · Zbl 0685.62092
[41] Roll, R., Ambiguity when performance is measured by the securities market line, J. Finance, 33, 1051-1069 (1978)
[42] Zhang, Z.; Zohren, S.; Roberts, S., Deep reinforcement learning for trading, J. Financ. Data Sci., 2, 2, 25-40 (2020)
[43] Rubinstein, M., Markowitz’s “portfolio selection”: a fifty-year retrospective, J. Finance, 57, 1041-1045 (2002)
[44] Connor, G., Sensible return forecasting for portfolio management, Financ. Anal. J., 53, 44-51 (1997)
[45] Bao, W.; Yue, J.; Rao, Y., A deep learning framework for financial time series using stacked autoencoders and long-short term memory, PLoS ONE, 12, Article e0180944 pp. (2017)
[46] Passalis, N.; Tefas, A.; Kanniainen, J.; Gabbouj, M.; Iosifidis, A., Deep adaptive input normalization for time series forecasting, IEEE Trans. Neural Netw. Learn. Syst. (2019)
[47] Tsantekidis, A.; Passalis, N.; Tefas, A.; Kanniainen, J.; Gabbouj, M.; Iosifidis, A., Forecasting stock prices from the limit order book using convolutional neural networks, (2017 IEEE 19th Conference on Business Informatics, vol. 1. 2017 IEEE 19th Conference on Business Informatics, vol. 1, (CBI) (2017), IEEE), 7-12
[48] Zhang, Z.; Zohren, S.; Roberts, S., DeepLOB: deep convolutional neural networks for limit order books, IEEE Trans. Signal Process., 67, 3001-3012 (2019) · Zbl 07123269
[49] Deng, Y.; Bao, F.; Kong, Y.; Ren, Z.; Dai, Q., Deep direct reinforcement learning for financial signal representation and trading, IEEE Trans. Neural Netw. Learn. Syst., 28, 3, 653-664 (2016)
[50] Oreshkin, B. N.; Carpov, D.; Chapados, N.; Bengio, Y., Meta-learning framework with applications to zero-shot time-series forecasting (2020), arXiv preprint
[51] Black, F.; Litterman, R., Asset allocation: combining investor views with market equilibrium, Goldman Sachs Fixed Income Res., 115 (1990)
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.