Energy-Efficient Virtual Network Embedding: A Deep Reinforcement Learning Approach Based on Graph Convolutional Networks
Peiying Zhang, Enqi Wang, Zhihu Luo, Yanxian Bi, Kai Liu, Jian Wang

Network virtualization (NV) technology is a cornerstone of modern network architectures, offering significant advantages in resource utilization, flexibility, security, and streamlined management. By enabling multiple virtual network requests (VNRs) to be deployed on a single substrate network through virtual network embedding (VNE), NV can substantially reduce operational costs and energy consumption. However, existing energy-efficient VNE algorithms have limitations: heuristic routing policies require manual tuning, traditional intelligent algorithms extract features inefficiently, and periodic traffic fluctuations are not taken into account. To address these limitations, this paper introduces a novel approach that leverages deep reinforcement learning (DRL) to enhance the efficiency of traditional methods. We employ graph convolutional networks (GCNs) for feature extraction, capturing the nuances of network graph structures, and integrate periodic traffic fluctuations as a key constraint in our model. This allows for predictive embedding of VNRs that is both energy-efficient and responsive to dynamic network conditions. Our research aims to develop an energy-efficient VNE algorithm that dynamically adapts to network traffic patterns, thereby optimizing resource allocation and reducing energy consumption. Extensive simulation experiments demonstrate that, compared with other algorithms, our proposed algorithm achieves an average reduction of 22.4% in energy consumption and 41.0% in active substrate nodes, along with a 23.4% improvement in the acceptance rate.
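To illustrate the GCN feature-extraction step mentioned above, the sketch below applies the standard per-layer GCN propagation rule over a toy substrate topology. The node features, graph, weights, and dimensions are illustrative assumptions for exposition only, not the paper's actual model.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: symmetric normalization with self-loops, then ReLU."""
    A_hat = A + np.eye(A.shape[0])                      # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    D_inv_sqrt = np.diag(d_inv_sqrt)
    # H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy substrate network: 4 nodes with assumed features [CPU, bandwidth].
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
H = np.array([[0.8, 0.5],
              [0.6, 0.9],
              [0.4, 0.3],
              [0.9, 0.7]])
W = np.random.default_rng(0).normal(size=(2, 4))        # 2 -> 4 dims

emb = gcn_layer(A, H, W)                                # per-node embeddings
print(emb.shape)                                        # (4, 4)
```

In a DRL-based VNE pipeline, per-node embeddings like `emb` would feed the agent's policy network, which scores substrate nodes as candidates for hosting each virtual node.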