Deep Reinforcement Learning-Based Real-Time Energy Management for an Integrated Electric–Thermal Energy System
Qiang Shuai, Yue Yin, Shan Huang, Chao Chen

Renewable energy plays a crucial role in achieving sustainable development and has the potential to meet humanity's long-term energy requirements. Integrated electric–thermal energy systems are an important means of accommodating a high proportion of renewable energy. However, the intermittency and volatility inherent in such systems make their energy management optimization problems difficult to solve. Thus, this paper proposes an energy management optimization method for an integrated electric–thermal energy system based on an improved proximal policy optimization algorithm, which mitigates the low accuracy and low solving efficiency of traditional heuristic algorithms and mathematical programming methods. The proposed algorithm also improves both the convergence speed and overall performance relative to the standard proximal policy optimization algorithm. This paper first establishes a mathematical model for the energy management of an integrated electric–thermal energy system. The model is then formulated as a Markov decision process, and a reward mechanism is designed to guide the agent in learning, from historical data, the uncertainty characteristics of renewable energy output and load consumption in the system. Finally, in the case study section, the proposed algorithm reduces the average running cost by 2.32% compared to the other algorithms discussed in this paper, demonstrating its effectiveness and cost-efficiency.
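For context, a minimal statement of the clipped surrogate objective maximized by standard proximal policy optimization is sketched below; the abstract does not specify the authors' particular improvement, so this shows only the baseline form, where \( r_t(\theta) \) is the probability ratio between the new and old policies and \( \hat{A}_t \) is an advantage estimate:

\[
L^{\mathrm{CLIP}}(\theta) = \mathbb{E}_t\!\left[\min\!\left(r_t(\theta)\,\hat{A}_t,\ \operatorname{clip}\!\left(r_t(\theta),\,1-\epsilon,\,1+\epsilon\right)\hat{A}_t\right)\right],
\qquad
r_t(\theta) = \frac{\pi_\theta(a_t \mid s_t)}{\pi_{\theta_{\mathrm{old}}}(a_t \mid s_t)},
\]

where \( \epsilon \) is a small clipping parameter (commonly around 0.1 to 0.2) that bounds how far each policy update can move from the previous policy.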