ZTE Communications ›› 2023, Vol. 21 ›› Issue (2): 40-52. DOI: 10.12142/ZTECOM.202302007

• Special Topic •

Machine Learning Driven Latency Optimization for Internet of Things Applications in Edge Computing

AWADA Uchechukwu1, ZHANG Jiankang2, CHEN Sheng3,4, LI Shuangzhi1, YANG Shouyi1

  1. Zhengzhou University, Zhengzhou 450001, China
  2. Bournemouth University, Poole BH12 5BB, UK
  3. University of Southampton, Southampton SO17 1BJ, UK
  4. Ocean University of China, Qingdao 266100, China
  • Received: 2023-03-11 Online: 2023-06-13 Published: 2023-06-13
  • About the authors:
    Uchechukwu AWADA is currently working toward a PhD degree at the School of Information Engineering, Zhengzhou University, China. His current research interests include edge computing, cloud computing, aerial computing, distributed systems, IoT, IoV and wireless communications. He is a student member of the ACM.
    ZHANG Jiankang (jzhang3@bournemouth.ac.uk) is a senior lecturer at Bournemouth University, UK. Prior to joining Bournemouth University, he was a senior research fellow at the University of Southampton, UK. Dr. ZHANG was a lecturer from 2012 to 2013 and then an associate professor from 2013 to 2014 at Zhengzhou University, China. His research interests are in the areas of aeronautical communications, aeronautical networks, evolutionary algorithms and edge computing.
    CHEN Sheng received his BE degree from the East China Petroleum Institute, China in 1982 and his PhD degree from City, University of London, UK in 1986, both in control engineering. In 2005, he was awarded the higher doctoral degree, Doctor of Sciences (DSc), by the University of Southampton, UK. From 1986 to 1999, he held research and academic appointments at the Universities of Sheffield, Edinburgh and Portsmouth, all in the UK. Since 1999, he has been with the School of Electronics and Computer Science, the University of Southampton, where he holds the post of Professor in Intelligent Systems and Signal Processing. His research interests include adaptive signal processing, wireless communications, modeling and identification of nonlinear systems, neural networks and machine learning, intelligent control system design, and evolutionary computation methods and optimization. He has published over 600 research papers, with 18 500+ Web of Science citations (h-index of 59) and 36 700+ Google Scholar citations (h-index of 81). Dr. CHEN is a Fellow of the United Kingdom Royal Academy of Engineering, a Fellow of the Asia-Pacific Artificial Intelligence Association, and a Fellow of IET. He is one of the original ISI highly cited researchers in engineering (March 2004) and was named a 2023 Electronics and Electrical Engineering Leader in the UK by Research.com.
    LI Shuangzhi received his BS and PhD degrees from the School of Information Engineering, Zhengzhou University, China in 2012 and 2018, respectively. From 2015 to 2017, he was a visiting student with the Department of Electrical and Computer Engineering, McMaster University, Canada. He is currently a lecturer with the School of Information Engineering, Zhengzhou University, China. His research interests include noncoherent space-time coding and ultra-reliable low-latency communications.
    YANG Shouyi received his PhD degree from the Beijing Institute of Technology, China in 2002. He is currently a full professor with the School of Information Engineering, Zhengzhou University, China. He has authored or co-authored various articles in the field of signal processing and wireless communications. His current research interests include signal processing in communications systems, wireless communications, and cognitive radio.
  • Supported by:
    the National Natural Science Foundation of China (61571401, 61901416); the China Postdoctoral Science Foundation (2021TQ0304); the Innovative Talent Colleges and the University of Henan Province (18HASTIT021)

Abstract:

Emerging Internet of Things (IoT) applications require fast execution and response times to achieve optimal performance. However, most IoT devices have limited or no computing capability to meet such stringent application requirements. To this end, computation offloading in edge computing has been used for IoT systems to achieve the desired performance. Nevertheless, randomly offloading applications to any available edge without considering their resource demands, inter-application dependencies and edge resource availability may eventually result in execution delay and performance degradation. In this paper, we introduce Edge-IoT, a machine-learning-enabled orchestration framework that uses the states of edge resources and the resource requirements of applications to facilitate a resource-aware offloading scheme that minimizes the average latency. We further propose a variant bin-packing optimization model that tightly co-locates applications on edge resources to fully utilize the available resources. Extensive experiments show the effectiveness and resource efficiency of the proposed approach.
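To illustrate the co-location idea described in the abstract, the sketch below packs applications onto edge nodes with a simple first-fit-decreasing bin-packing heuristic driven by CPU and memory demands. It is a minimal sketch under assumed names (App, EdgeNode, place_apps), not the actual Edge-IoT framework or its machine-learning-driven scheduler.

# Minimal sketch (not the Edge-IoT implementation): resource-aware placement of
# IoT applications onto edge nodes via a first-fit-decreasing bin-packing
# heuristic. The names App, EdgeNode and place_apps are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class App:
    name: str
    cpu: float   # required CPU cores
    mem: float   # required memory in GB


@dataclass
class EdgeNode:
    name: str
    cpu_free: float  # remaining CPU cores
    mem_free: float  # remaining memory in GB
    apps: List[str] = field(default_factory=list)

    def fits(self, app: App) -> bool:
        return app.cpu <= self.cpu_free and app.mem <= self.mem_free

    def host(self, app: App) -> None:
        # Co-locate the application and deduct its resource demand.
        self.cpu_free -= app.cpu
        self.mem_free -= app.mem
        self.apps.append(app.name)


def place_apps(apps: List[App], nodes: List[EdgeNode]) -> Dict[str, Optional[str]]:
    """First-fit-decreasing placement: sort applications by demand, then pack
    each one onto the first edge node that still has room, so applications are
    co-located tightly rather than scattered across nodes at random."""
    placement: Dict[str, Optional[str]] = {}
    for app in sorted(apps, key=lambda a: (a.cpu, a.mem), reverse=True):
        node = next((n for n in nodes if n.fits(app)), None)
        if node is not None:
            node.host(app)
            placement[app.name] = node.name
        else:
            placement[app.name] = None  # no edge node can currently host it
    return placement


if __name__ == "__main__":
    edge_nodes = [EdgeNode("edge-1", cpu_free=4.0, mem_free=8.0),
                  EdgeNode("edge-2", cpu_free=2.0, mem_free=4.0)]
    iot_apps = [App("video-analytics", 2.0, 4.0),
                App("sensor-aggregation", 0.5, 1.0),
                App("anomaly-detection", 1.0, 2.0)]
    print(place_apps(iot_apps, edge_nodes))

In the paper's setting, the greedy fit test would additionally account for inter-application dependencies and for edge-state predictions produced by the learning component; the sketch only captures the packing step.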

Key words: edge computing, execution time, IoT, machine learning, resource efficiency