ZTE Communications ›› 2020, Vol. 18 ›› Issue (2): 40-48.DOI: 10.12142/ZTECOM.202002006

• Special Topic •

Knowledge Distillation for Mobile Edge Computation Offloading

CHEN Haowei, ZENG Liekang, YU Shuai, CHEN Xu

  1. School of Data and Computer Science, Sun Yat-sen University, Guangzhou, Guangdong 510006, China
  • Received: 2019-12-01  Online: 2020-06-25  Published: 2020-08-07
  • About the authors:
    CHEN Haowei received the B.S. degree in computer science from the School of Data and Computer Science, Sun Yat-sen University (SYSU), China in 2020. He is working towards the master's degree in the School of Data and Computer Science, SYSU. His research interests include mobile deep computing, edge intelligence, and deep learning.
    ZENG Liekang received the B.S. degree in computer science from the School of Data and Computer Science, Sun Yat-sen University, China in 2018. He is currently pursuing the master's degree with the School of Data and Computer Science, Sun Yat-sen University. His research interests include mobile edge computing, deep learning, and distributed computing.
    YU Shuai received the Ph.D. degree from Pierre and Marie Curie University (now Sorbonne Université), France, in 2018, the M.S. degree from Beijing University of Posts and Telecommunications, China, in 2014, and the B.S. degree from Nanjing University of Posts and Telecommunications, China, in 2009. He is now a post-doctoral research fellow at the School of Data and Computer Science, Sun Yat-sen University. His research interests include wireless communications, mobile computing, and machine learning.
    CHEN Xu (chenxu35@mail.sysu.edu.cn) is a full professor at Sun Yat-sen University, China, and the vice director of the National and Local Joint Engineering Laboratory of Digital Home Interactive Applications. He received the Ph.D. degree in information engineering from The Chinese University of Hong Kong in 2012, worked as a post-doctoral research associate at Arizona State University, USA from 2012 to 2014, and was a Humboldt Scholar Fellow at the Institute of Computer Science, University of Goettingen, Germany from 2014 to 2016. He is currently an area editor of IEEE Open Journal of the Communications Society, and an associate editor of IEEE Transactions on Wireless Communications, IEEE Internet of Things Journal, and the IEEE Journal on Selected Areas in Communications (JSAC) Series on Network Softwarization and Enablers.
  • Supported by:
    the National Natural Science Foundation of China (61972432) and the Program for Guangdong Introducing Innovative and Entrepreneurial Teams (2017ZT07X355)

Abstract:

Edge computation offloading allows mobile end devices to execute compute-intensive tasks on edge servers. An end device can decide, in an online manner, whether a task is offloaded to an edge server, offloaded to a cloud server, or executed locally, according to the current network conditions and the device's profile. In this paper, we propose an edge computation offloading framework based on deep imitation learning (DIL) and knowledge distillation (KD), which helps end devices quickly make fine-grained offloading decisions that minimize the delay of computation tasks online. We formalize the computation offloading problem as a multi-label classification problem. Training samples for our DIL model are generated in an offline manner. After the model is trained, we leverage KD to obtain a lightweight DIL model, which further reduces the model's inference delay. Numerical experiments show that the offloading decisions made by our model not only outperform those of other related policies in terms of task latency, but also incur the shortest inference delay among all policies.
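To make the KD step concrete, the sketch below shows one distillation update in PyTorch under the abstract's multi-label formulation: a large teacher DIL model produces softened per-task offloading decisions, and a lightweight student is trained against both these soft targets and the hard labels. The layer sizes, feature/label dimensions, temperature T, and loss weighting alpha are illustrative assumptions, not the paper's configuration.

```python
# Minimal knowledge-distillation sketch for a multi-label offloading model.
# All dimensions and hyperparameters here are assumed for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_FEATURES = 16   # assumed size of the network-condition/device-profile vector
NUM_TASKS = 8       # assumed number of sub-tasks, each with a binary offload label

teacher = nn.Sequential(                  # large DIL model, trained offline
    nn.Linear(NUM_FEATURES, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, NUM_TASKS))
student = nn.Sequential(                  # lightweight model for fast online inference
    nn.Linear(NUM_FEATURES, 32), nn.ReLU(),
    nn.Linear(32, NUM_TASKS))

def distill_step(x, hard_labels, T=2.0, alpha=0.5):
    """One KD update: match the teacher's softened outputs and the true labels."""
    with torch.no_grad():
        soft_targets = torch.sigmoid(teacher(x) / T)   # teacher's soft decisions
    logits = student(x)
    kd_loss = F.binary_cross_entropy_with_logits(logits / T, soft_targets)
    hard_loss = F.binary_cross_entropy_with_logits(logits, hard_labels)
    return alpha * kd_loss + (1.0 - alpha) * hard_loss

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(64, NUM_FEATURES)                 # dummy batch of device/network states
y = torch.randint(0, 2, (64, NUM_TASKS)).float()  # dummy offload-or-not labels
optimizer.zero_grad()
loss = distill_step(x, y)
loss.backward()
optimizer.step()
```

After training, only the small student runs on the end device, which is how the framework trades a one-time offline distillation cost for a lower per-decision inference delay.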

Key words: mobile edge computation offloading, deep imitation learning, knowledge distillation