Localization of Global Forecasting Models with a Clustering Approach

Document Type: Original Article

Authors

Faculty of Information Technology and Computer Engineering, Azarbaijan Shahid Madani University, Tabriz, Iran

10.22091/jemsc.2025.11595.1218

Abstract

With the increasing generation of time series data, forecasting models trained on a set of time series, known as global forecasting models, outperform univariate forecasting models trained on individual series. However, the performance of global models may degrade when faced with heterogeneous datasets of time series of different lengths. In this study, a new method for clustering-based localization of global forecasting models is presented. The main steps of the proposed method are: (1) extracting relevant features from each time series; (2) clustering the time series on the extracted features using the K-Medoids and spectral clustering algorithms; and (3) implementing a global forecasting model based on a Temporal Convolutional Network and training it for each cluster. To evaluate the prediction accuracy of the proposed approach, experiments were conducted on the M3 dataset, which contains 1426 time series of unequal length. The experimental results show that the proposed clustering-based models outperform both the baseline and benchmark models, reducing the error by 0.57 in terms of the SMAPE metric.
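The pipeline summarized above can be illustrated with a minimal, self-contained Python sketch. This is not the paper's implementation: the hand-rolled features stand in for richer tsfeatures-style features, a plain alternating K-Medoids replaces the PAM-style algorithm, and a pooled linear autoregression stands in for the per-cluster Temporal Convolutional Network; the names extract_features, k_medoids, fit_ar_forecaster, and smape are illustrative.

```python
import numpy as np

def extract_features(series):
    """Map a (possibly short) series to a fixed-length feature vector (illustrative features)."""
    x = np.asarray(series, dtype=float)
    diffs = np.diff(x)
    acf1 = np.corrcoef(x[:-1], x[1:])[0, 1] if len(x) > 2 else 0.0   # lag-1 autocorrelation
    trend = np.polyfit(np.arange(len(x)), x, 1)[0] if len(x) > 1 else 0.0  # linear slope
    return np.array([x.mean(), x.std(), diffs.std() if len(diffs) else 0.0, acf1, trend])

def k_medoids(X, k, n_iter=50, seed=0):
    """Plain alternating K-Medoids on Euclidean distances (illustrative, not full PAM)."""
    rng = np.random.default_rng(seed)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distance matrix
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)                # assign each point to nearest medoid
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members):
                # pick the cluster member minimising total within-cluster distance
                new_medoids[c] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return labels

def fit_ar_forecaster(cluster_series, lags=4):
    """Stand-in for the per-cluster global TCN: one linear AR model pooled over the cluster."""
    rows, targets = [], []
    for s in cluster_series:
        s = np.asarray(s, dtype=float)
        for t in range(lags, len(s)):
            rows.append(s[t - lags:t])
            targets.append(s[t])
    coef, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(targets), rcond=None)
    return coef

def smape(actual, forecast):
    """Symmetric MAPE in percent."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(2.0 * np.abs(f - a) / (np.abs(a) + np.abs(f)))

# Toy usage: three unequal-length series stand in for the M3 data.
series = [np.sin(np.arange(n) / 3.0) + i for i, n in enumerate([40, 55, 70])]
features = np.vstack([extract_features(s) for s in series])
features = (features - features.mean(axis=0)) / (features.std(axis=0) + 1e-8)  # scale features
labels = k_medoids(features, k=2)
for c in np.unique(labels):
    members = [s for s, lab in zip(series, labels) if lab == c]
    coef = fit_ar_forecaster(members)                             # one "global" model per cluster
    target, window = members[0][-1], members[0][-5:-1]            # forecast the last observation
    print(f"cluster {c}: one-step SMAPE = {smape([target], [window @ coef]):.2f}")
```

In the paper's setting, each cluster would presumably receive its own Temporal Convolutional Network trained across all member series, and SMAPE would be averaged over the M3 test horizons rather than a single toy one-step forecast.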

Keywords

Main Subjects


Bai, S., Kolter, J. Z., & Koltun, V. (2018). An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. arXiv preprint arXiv:1803.01271. https://doi.org/10.48550/arXiv.1803.01271
 
Bandara, K., Bergmeir, C., & Smyl, S. (2020). Forecasting across time series databases using recurrent neural networks on groups of similar series: A clustering approach. Expert Systems with Applications, 140, 112896. https://doi.org/10.1016/j.eswa.2019.112896
 
Bandara, K., Hewamalage, H., Liu, Y. H., Kang, Y., & Bergmeir, C. (2021). Improving the accuracy of global forecasting models using time series data augmentation. Pattern Recognition, 120, 108148. https://doi.org/10.1016/j.patcog.2021.108148
 
Bandara, K. (2023). Forecasting with big data using global forecasting models. In M. Hamoudia, S. Makridakis, & E. Spiliotis (Eds.), Forecasting with Artificial Intelligence: Theory and Applications (pp. 107-122). Cham: Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-35879-1
 
Box, G. E., Jenkins, G. M., Reinsel, G. C., & Ljung, G. M. (2015). Time series analysis: forecasting and control. John Wiley & Sons. https://doi.org/10.1111/jtsa.12194
 
Cleveland, R. B., Cleveland, W. S., McRae, J. E., & Terpenning, I. (1990). STL: A seasonal-trend decomposition. Journal of Official Statistics, 6(1), 3-73.
 
Godahewa, R., Bandara, K., Webb, G. I., Smyl, S., & Bergmeir, C. (2021). Ensembles of localised models for time series forecasting. Knowledge-Based Systems, 233, 107518. https://doi.org/10.1016/j.knosys.2021.107518
 
Hewamalage, H., Bergmeir, C., & Bandara, K. (2021). Recurrent neural networks for time series forecasting: Current status and future directions. International Journal of Forecasting, 37(1), 388-427. https://doi.org/10.1016/j.ijforecast.2020.06.008
 
Hewamalage, H., Bergmeir, C., & Bandara, K. (2022). Global models for time series forecasting: A simulation study. Pattern Recognition, 124, 108441. https://doi.org/10.1016/j.patcog.2021.108441
 
Hyndman, R. J., Koehler, A. B., Snyder, R. D., & Grose, S. (2002). A state space framework for automatic forecasting using exponential smoothing methods. International Journal of Forecasting, 18(3), 439-454. https://doi.org/10.1016/S0169-2070(01)00110-8
 
Hyndman, R., Koehler, A. B., Ord, J. K., & Snyder, R. D. (2008). Forecasting with exponential smoothing: the state space approach. Springer Science & Business Media. https://doi.org/10.1007/978-3-540-71918-2
 
Hyndman, R., Kang, Y., Montero-Manso, P., Talagala, T., Wang, E., Yang, Y., & O’Hara-Wild, M. (2019). tsfeatures: Time series feature extraction. R package version 1.0.
 
Januschowski, T., Gasthaus, J., Wang, Y., Salinas, D., Flunkert, V., Bohlke-Schneider, M., & Callot, L. (2020). Criteria for classifying forecasting methods. International Journal of Forecasting, 36(1), 167-177. https://doi.org/10.1016/j.ijforecast.2019.05.008
Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
 
Laptev, N., Yosinski, J., Li, L. E., & Smyl, S. (2017, August). Time-series extreme event forecasting with neural networks at Uber. In International Conference on Machine Learning (Vol. 34, pp. 1-5).
 
Makridakis, S., Spiliotis, E., & Assimakopoulos, V. (2018). The M4 Competition: Results, findings, conclusion and way forward. International Journal of Forecasting, 34(4), 802-808. https://doi.org/10.1016/j.ijforecast.2018.06.001
Makridakis, S., Spiliotis, E., & Assimakopoulos, V. (2022). M5 accuracy competition: Results, findings, and conclusions. International Journal of Forecasting, 38(4), 1346-1364. https://doi.org/10.1016/j.ijforecast.2021.11.013
 
Martínez, F., Frías, M. P., Pérez-Godoy, M. D., & Rivera, A. J. (2018). Dealing with seasonality by narrowing the training set in time series forecasting with kNN. Expert Systems with Applications, 103, 38-48. https://doi.org/10.1016/j.eswa.2018.03.005
 
Montero-Manso, P., Athanasopoulos, G., Hyndman, R. J., & Talagala, T. S. (2020). FFORMA: Feature-based forecast model averaging. International Journal of Forecasting, 36(1), 86-92. https://doi.org/10.1016/j.ijforecast.2019.02.011
 
Montero-Manso, P., & Hyndman, R. J. (2021). Principles and algorithms for forecasting groups of time series: Locality and globality. International Journal of Forecasting, 37(4), 1632-1653. https://doi.org/10.1016/j.ijforecast.2021.03.004
 
Oreshkin, B. N., Carpov, D., Chapados, N., & Bengio, Y. (2019). N-BEATS: Neural basis expansion analysis for interpretable time series forecasting. arXiv preprint arXiv:1905.10437. https://doi.org/10.48550/arXiv.1905.10437
 
Parmezan, A. R. S., Souza, V. M., & Batista, G. E. (2019). Evaluation of statistical and machine learning models for time series prediction: Identifying the state-of-the-art and the best conditions for the use of each model. Information Sciences, 484, 302-337. https://doi.org/10.1016/j.ins.2019.01.076
 
Prokhorenkova, L., Gusev, G., Vorobev, A., Dorogush, A. V., & Gulin, A. (2018). CatBoost: Unbiased boosting with categorical features. Advances in Neural Information Processing Systems, 31. https://doi.org/10.48550/arXiv.1706.09516
Tavakkoli-Moghaddam, R., Akbari, A. H., Tanhaeean, M., Moghdani, R., Gholian-Jouybari, F., & Hajiaghaei-Keshteli, M. (2024). Multi-objective boxing match algorithm for multi-objective optimization problems. Expert Systems with Applications, 239, 122394. https://doi.org/10.1016/j.eswa.2023.122394
Yavari, M., Marvi, M., & Akbari, A. H. (2020). Semi-permutation-based genetic algorithm for order acceptance and scheduling in two-stage assembly problem. Neural Computing and Applications, 32, 2989-3003. https://doi.org/10.1007/s00521-019-04027-w
Tanhaeean, M., Tavakkoli-Moghaddam, R., & Akbari, A. H. (2022). Boxing match algorithm: A new meta-heuristic algorithm. Soft Computing, 26(24), 13277-13299. https://doi.org/10.1007/s00500-022-07518-6
Salinas, D., Flunkert, V., Gasthaus, J., & Januschowski, T. (2020). DeepAR: Probabilistic forecasting with autoregressive recurrent networks. International Journal of Forecasting, 36(3), 1181-1191. https://doi.org/10.1016/j.ijforecast.2019.07.001
 
Schubert, E., & Rousseeuw, P. J. (2021). Fast and eager k-medoids clustering: O(k) runtime improvement of the PAM, CLARA, and CLARANS algorithms. Information Systems, 101, 101804. https://doi.org/10.1016/j.is.2021.101804
Jabbari, M., Rezaeenour, J., & Akbari, A. H. (2023). A Feature Selection Method Based on Information Theory and Genetic Algorithm. Sciences and Techniques of Information Management, 9(3), 32-7.
Smyl, S., & Kuber, K. (2016, June). Data preprocessing and augmentation for multiple short time series forecasting with recurrent neural networks. In 36th International Symposium on Forecasting.
 
Smyl, S. (2020). A hybrid method of exponential smoothing and recurrent neural networks for time series forecasting. International Journal of Forecasting, 36(1), 75-85. https://doi.org/10.1016/j.ijforecast.2019.03.017
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. In Proceedings of the 31st International Conference on Neural Information Processing Systems (pp. 6000-6010). Curran Associates Inc.
Akbari, A. H., & Jafari, M. (2025). Development of a Deep Reinforcement Learning Algorithm in a Dynamic Cellular Manufacturing System Considering Order Rejection, Case Study: Stone Paper Factory. Engineering Management and Soft Computing, 10(2), 204-222.
Jafari, M., & Akbari, A. H. (2025). Efficient Algorithms for Dynamic Cellular Manufacturing Systems by Considering Blockchain-Enabled (Case Study: Stone Paper Factory). Journal of Advanced Manufacturing Systems.
Ye, X., & Sakurai, T. (2016). Robust similarity measure for spectral clustering based on shared neighbors. ETRI Journal, 38(3), 540-550. https://doi.org/10.4218/etrij.16.0115.0517