

GluonTS: probabilistic and neural time series modeling in Python. (English) Zbl 07255147
Summary: We introduce the Gluon Time Series Toolkit (GluonTS), a Python library for deep learning-based time series modeling for ubiquitous tasks such as forecasting and anomaly detection. GluonTS simplifies the time series modeling pipeline by providing the necessary components and tools for quick model development, efficient experimentation, and evaluation. In addition, it contains reference implementations of state-of-the-art time series models that enable simple benchmarking of new algorithms.
MSC:
68T05 Learning and adaptive systems in artificial intelligence
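As a rough illustration of the workflow described in the summary above, the sketch below trains a probabilistic forecaster on a built-in benchmark dataset and computes backtest metrics. This is a minimal sketch, not the authors' reference experiment; it assumes a GluonTS release with the MXNet backend (as in the original paper), and module paths such as gluonts.model.deepar and gluonts.mx.trainer vary across versions, so the imports may need adjusting.

# Minimal GluonTS sketch: fit DeepAR and evaluate it on a held-out test split.
# Assumption: an MXNet-backed GluonTS release; import paths differ between versions.
from gluonts.dataset.repository.datasets import get_dataset
from gluonts.model.deepar import DeepAREstimator
from gluonts.mx.trainer import Trainer
from gluonts.evaluation import Evaluator
from gluonts.evaluation.backtest import make_evaluation_predictions

dataset = get_dataset("electricity")           # built-in benchmark dataset
meta = dataset.metadata

estimator = DeepAREstimator(
    freq=meta.freq,                            # time series frequency, e.g. "H"
    prediction_length=meta.prediction_length,  # forecast horizon
    trainer=Trainer(epochs=5),                 # short training run for illustration
)
predictor = estimator.train(dataset.train)     # fit the model on the training split

# Draw sample-path forecasts on the test split and summarize accuracy.
forecast_it, ts_it = make_evaluation_predictions(
    dataset.test, predictor=predictor, num_samples=100
)
evaluator = Evaluator(quantiles=[0.1, 0.5, 0.9])
agg_metrics, item_metrics = evaluator(ts_it, forecast_it)
print(agg_metrics["mean_wQuantileLoss"])       # aggregate weighted quantile loss

Swapping DeepAREstimator for another bundled estimator leaves the rest of the pipeline unchanged, which is the kind of like-for-like benchmarking the summary refers to.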