Authors: Y. Dai, K. Zhang, S. Maharjan and Y. Zhang
Title: Deep Reinforcement Learning for Stochastic Computation Offloading in Digital Twin Networks
Affiliation: Communication Systems
Project(s): The Center for Resilient Networks and Applications, Simula Metropolitan Center for Digital Engineering
Status: Published
Publication Type: Journal Article
Year of Publication: 2021
Journal: IEEE Transactions on Industrial Informatics
Volume: 17
Issue: 7
Date Published: 07/2021
Publisher: IEEE
Abstract

The rapid development of the industrial Internet of Things (IIoT) is driving industrial production toward digitalization to improve network efficiency. Digital twin is a promising technology to empower the digital transformation of IIoT by creating virtual models of physical objects. However, achieving network efficiency in IIoT is very challenging due to resource-constrained devices, stochastic tasks, and resource heterogeneity. Distributed resources in IIoT networks can be efficiently exploited through computation offloading to reduce energy consumption while enhancing data processing efficiency. In this article, we first propose a new paradigm, the digital twin network, to build the network topology and the stochastic task arrival model in IIoT systems. We then formulate the stochastic computation offloading and resource allocation problem to minimize the long-term energy efficiency. As the formulated problem is a stochastic programming problem, we leverage the Lyapunov optimization technique to transform the original problem into a deterministic per-time-slot problem. Finally, we present an asynchronous actor-critic algorithm to find the optimal stochastic computation offloading policy. Illustrative results demonstrate that our proposed scheme significantly outperforms the benchmarks.
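The core idea named in the abstract — using Lyapunov optimization to turn a stochastic long-term objective into a deterministic per-slot decision — can be illustrated with a minimal drift-plus-penalty sketch. This is not the paper's actual model: the action set, the arrival distribution, the energy/service numbers, and the greedy per-slot rule below are all illustrative assumptions.

```python
import random

def drift_plus_penalty_action(Q, actions, V):
    # Standard drift-plus-penalty rule: per slot, pick the action that
    # minimizes V * (energy penalty) - Q * (service rate). A larger queue
    # backlog Q pushes the choice toward the faster, costlier action.
    return min(actions, key=lambda a: V * a["energy"] - Q * a["service"])

def simulate(T=200, V=10.0, seed=0):
    rng = random.Random(seed)
    # Hypothetical per-slot choices: compute locally (cheap, slow) or
    # offload to an edge server (costly, fast). Numbers are made up.
    actions = [
        {"name": "local",   "energy": 1.0, "service": 2.0},
        {"name": "offload", "energy": 3.0, "service": 6.0},
    ]
    Q, total_energy = 0.0, 0.0
    for _ in range(T):
        arrival = rng.uniform(0.0, 5.0)           # stochastic task arrivals
        a = drift_plus_penalty_action(Q, actions, V)
        Q = max(Q + arrival - a["service"], 0.0)  # task-queue backlog update
        total_energy += a["energy"]
    return Q, total_energy / T

Q_final, avg_energy = simulate()
```

With these numbers the rule offloads only when the backlog exceeds V·(3−1)/(6−2) = 5, so the queue stays bounded while energy is spent only when needed; the paper replaces this hand-written per-slot rule with a learned asynchronous actor-critic policy.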

Citation Key: 28076