Research on Optimization of Electricity Market Forecasting Model Based on Ensemble Learning and Deep Learning
DOI: https://doi.org/10.62381/ACS.DIMI2025.12
Author(s)
Yuchen Wang*
Affiliation(s)
Xi’an Jiaotong-Liverpool University, Suzhou, China
*Corresponding Author
Abstract
The electricity market plays a crucial role in global economies, making accurate trend forecasting essential for mitigating risks to livelihoods and commerce. Traditional methods such as moving averages suffer from prediction lag, sensitivity to outliers, and difficulty in modeling non-linear and seasonal variation. To overcome these limitations, this study introduces an ensemble learning approach that combines multiple forecasting models to improve accuracy and robustness. Deep learning techniques, including Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks, are integrated to better capture complex patterns and seasonal fluctuations in electricity market data. Experimental results demonstrate that the proposed method outperforms conventional techniques, delivering more reliable and precise predictions. The research provides an efficient solution for electricity market forecasting and offers useful insights for policymakers, energy providers, and market participants. By enhancing predictive capability, the study supports more informed decision-making, ultimately contributing to market stability and economic resilience.
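To make the combination of ensemble learning and LSTM forecasting concrete, the following is a minimal sketch, not the paper's actual implementation: it trains a gradient-boosted tree model and a small LSTM on the same lagged windows of a synthetic hourly price series, then averages their forecasts as a simple equal-weight ensemble. The data, window length, network size, and equal weighting are all illustrative assumptions.

import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
t = np.arange(2000, dtype=np.float32)
# Synthetic "price" series with daily seasonality plus noise (stand-in data).
series = 50 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size).astype(np.float32)

WINDOW = 24  # one day of hourly lags used as input features (assumed)

def make_windows(x, w):
    """Slice a 1-D series into (lag-window, next-value) supervised pairs."""
    X = np.stack([x[i : i + w] for i in range(len(x) - w)])
    return X, x[w:]

X, y = make_windows(series, WINDOW)
split = int(0.8 * len(X))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

# Base learner 1: gradient-boosted trees on the flat lag features.
gbr = GradientBoostingRegressor(n_estimators=200, max_depth=3)
gbr.fit(X_tr, y_tr)

# Base learner 2: a small LSTM reading the same window as a sequence.
class LSTMForecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):  # x: (batch, WINDOW, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)  # last step -> next value

model = LSTMForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
Xt, yt = torch.from_numpy(X_tr).unsqueeze(-1), torch.from_numpy(y_tr)
for epoch in range(30):  # short full-batch demo training loop
    opt.zero_grad()
    loss = loss_fn(model(Xt), yt)
    loss.backward()
    opt.step()

# Ensemble: average the two models' test-set forecasts.
with torch.no_grad():
    lstm_pred = model(torch.from_numpy(X_te).unsqueeze(-1)).numpy()
gbr_pred = gbr.predict(X_te)
ens_pred = 0.5 * (lstm_pred + gbr_pred)  # equal-weight combination (assumed)

for name, p in [("GBR", gbr_pred), ("LSTM", lstm_pred), ("Ensemble", ens_pred)]:
    rmse = float(np.sqrt(np.mean((p - y_te) ** 2)))
    print(f"{name:8s} RMSE: {rmse:.3f}")

Equal-weight averaging is the simplest combination rule; a stacked generalization layer that learns the combination weights on held-out predictions is a common refinement of this scheme.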
Keywords
Electricity Market; Forecasting; Ensemble Learning; RNN; LSTM