Abstract: Effective integration of the available resources within edge nodes is essential to improve the performance of vehicular edge computing (VEC), which must support randomly offloaded tasks with limited computing capacity and constrained energy. This paper presents an intelligent adaptive resource integration strategy for VEC with energy harvesting. Service caching, task migration, and resource allocation are jointly employed to accommodate temporally and spatially varying computing demands. The problem of minimizing the long-term average task execution time under energy constraints is formulated as a Markov decision process and solved with a parameterized deep Q-network based learning algorithm. The algorithm adopts a centralized-training, distributed-execution framework in which a parameter network and an action network handle the continuous and discrete decisions, respectively, effectively addressing the hybrid action space of the problem. Simulation results demonstrate that the proposed algorithm not only converges faster but also significantly improves system performance compared with benchmark schemes.
Index Terms: mobile edge computing, adaptive resource management, deep reinforcement learning
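The abstract's hybrid action space (a discrete decision such as where to cache or migrate, paired with continuous decisions such as resource-allocation fractions) can be illustrated with a minimal sketch of parameterized-DQN-style action selection. This is not the paper's implementation; the dimensions, the linear "networks" `W_param` and `W_q`, and the function `select_action` are all hypothetical stand-ins for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration (not from the paper).
STATE_DIM, N_DISCRETE, PARAM_DIM = 4, 3, 2

# Stand-ins for the two networks: the parameter network maps a state to one
# continuous parameter vector per discrete action; the action (Q) network
# scores each (state, parameters) pair.
W_param = rng.normal(size=(N_DISCRETE, PARAM_DIM, STATE_DIM))
W_q = rng.normal(size=(N_DISCRETE, STATE_DIM + PARAM_DIM))

def select_action(state):
    """P-DQN-style hybrid action selection (sketch).

    1. The parameter network proposes continuous parameters x_k for every
       discrete action k (e.g. a resource-allocation fraction).
    2. The action network evaluates Q(s, k, x_k) for each k.
    3. The discrete action with the highest Q is taken together with its
       continuous parameters.
    """
    params = np.tanh(W_param @ state)          # shape (N_DISCRETE, PARAM_DIM)
    q_values = np.array([W_q[k] @ np.concatenate([state, params[k]])
                         for k in range(N_DISCRETE)])
    k = int(np.argmax(q_values))
    return k, params[k]

state = rng.normal(size=STATE_DIM)
k, x = select_action(state)
print("discrete action:", k, "continuous parameters:", x)
```

In the paper's framework, the parameter network plays the role of the first step (continuous decisions) and the action network plays the role of the second and third (scoring and discrete selection), which is how the learner avoids discretizing the continuous variables.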