
ZTE Communications, 2023, Vol. 21, Issue 4: 54-59. DOI: 10.12142/ZTECOM.202304007


Local Scenario Perception and Web AR Navigation

SHI Wenzhe1,2, LIU Yanbin1,2, ZHOU Qinfen1

  1. State Key Laboratory of Mobile Network and Mobile Multimedia Technology, Shenzhen 518055, China
    2. ZTE Corporation, Shenzhen 518057, China
  • Received: 2023-06-20 Online: 2023-12-07 Published: 2023-12-07
  • About author:SHI Wenzhe (shi.wenzhe@zte.com.cn) is a strategy planner and engineer for XRExplore Platform product planning at ZTE Corporation. He is also a member of the National Key Laboratory for Mobile Network and Mobile Multimedia Technology. His research interests include indoor visual AR navigation, SFM 3D reconstruction, visual SLAM, real-time cloud rendering, VR, and spatial perception.
    LIU Yanbin is a strategy planner and product manager for XRExplore Platform product planning at ZTE Corporation. He is also a member of the National Key Laboratory for Mobile Network and Mobile Multimedia Technology. His research interests include real-time remote rendering, visual SLAM, and computer vision.
    ZHOU Qinfen is the XR product director for the new media industry and a senior architect at ZTE Corporation. She has more than 20 years of experience in the communication industry and media business. She has held the positions of product manager of the short message center, product manager of the cloud desktop, product line cost director, and XR product director at ZTE Corporation. She has a thorough understanding of products and related standards, including Short Message Center, Cloud Desktop GPU Virtualization, and XR. As a member of the Shenzhen 8K UHD Video Industry Cooperation Alliance (SUCA) and the Virtual Display Professional Committee of the Jiangsu Communication Association, she leads a team responsible for research on the latest video technologies and related standards.
  • Supported by:
    ZTE Industry‐University‐Institute Cooperation Funds

Abstract:

This paper proposes a Web augmented reality (AR) indoor navigation system based on a local point cloud map. By delivering the local point cloud map to the web front end, real-time positioning can be performed using only the computing power of the web front end. To keep positioning fast and accurate, an optimization scheme for the local point cloud map is proposed, including descriptor de-duplication and outlier removal, which improves the quality of the point cloud. In addition, interpolation and smoothing are applied to the local-map positioning results, which stabilizes the anchoring effect and makes the user experience smoother. In small-scale indoor scenarios, the positioning frequency on an iPhone 13 reaches 30 fps, and the positioning precision is within 50 cm. Compared with existing mainstream visual positioning approaches for AR navigation, the proposed solution does not rely on any additional sensor or cloud computing device, thereby greatly saving computing resources. It takes very little time to meet real-time requirements and provides users with a smooth positioning experience.
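The abstract mentions applying interpolation and smoothing to per-frame positioning results so that anchored content does not jitter. The sketch below is an illustrative TypeScript example of one common way to do this on a web front end; it is not the authors' code, and the class name, blend factor, and quaternion/vector helpers are assumptions made for demonstration.

```typescript
// Illustrative sketch (assumed, not the paper's implementation): blend each
// raw localization result toward the previous smoothed pose before updating
// AR anchors, which damps jitter while still tracking motion at ~30 fps.

type Vec3 = [number, number, number];
type Quat = [number, number, number, number]; // x, y, z, w

// Linear interpolation between two positions.
function lerp(a: Vec3, b: Vec3, t: number): Vec3 {
  return [
    a[0] + (b[0] - a[0]) * t,
    a[1] + (b[1] - a[1]) * t,
    a[2] + (b[2] - a[2]) * t,
  ];
}

// Spherical linear interpolation between two unit quaternions.
function slerp(a: Quat, b: Quat, t: number): Quat {
  let dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2] + a[3] * b[3];
  const sign = dot < 0 ? -1 : 1; // take the shorter arc
  dot *= sign;
  if (dot > 0.9995) {
    // Nearly identical rotations: normalized linear blend is sufficient.
    const q: Quat = [
      a[0] + (sign * b[0] - a[0]) * t,
      a[1] + (sign * b[1] - a[1]) * t,
      a[2] + (sign * b[2] - a[2]) * t,
      a[3] + (sign * b[3] - a[3]) * t,
    ];
    const n = Math.hypot(q[0], q[1], q[2], q[3]);
    return [q[0] / n, q[1] / n, q[2] / n, q[3] / n];
  }
  const theta = Math.acos(dot);
  const s = Math.sin(theta);
  const wa = Math.sin((1 - t) * theta) / s;
  const wb = Math.sin(t * theta) / s;
  return [
    wa * a[0] + sign * wb * b[0],
    wa * a[1] + sign * wb * b[1],
    wa * a[2] + sign * wb * b[2],
    wa * a[3] + sign * wb * b[3],
  ];
}

// Exponential smoothing of successive camera poses (hypothetical helper).
class PoseSmoother {
  private position?: Vec3;
  private rotation?: Quat;
  constructor(private readonly alpha = 0.3) {} // per-frame blend factor (assumed value)

  update(rawPosition: Vec3, rawRotation: Quat): { position: Vec3; rotation: Quat } {
    if (!this.position || !this.rotation) {
      this.position = rawPosition;
      this.rotation = rawRotation;
    } else {
      this.position = lerp(this.position, rawPosition, this.alpha);
      this.rotation = slerp(this.rotation, rawRotation, this.alpha);
    }
    return { position: this.position, rotation: this.rotation };
  }
}

// Usage: feed each per-frame localization result through the smoother before
// updating the AR camera or anchor transforms in the renderer.
const smoother = new PoseSmoother(0.3);
const smoothed = smoother.update([0.12, 1.45, -0.8], [0, 0, 0, 1]);
console.log(smoothed);
```

A lower blend factor gives smoother but laggier anchors, while a higher one follows raw positioning more closely; the actual trade-off chosen in the paper's system is not specified in the abstract.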

Key words: Web AR, three-dimensional reconstruction, navigation, positioning