Content for TR 26.998, Word version: 18.0.0



9  Conclusions

AR/MR experiences augment visual and auditory content onto the real world to improve the user's experience with better immersiveness, unlike VR, which provides an entirely virtual world. To realize such experiences, glass-type AR/MR devices may be a good candidate device, as they easily combine light from the real world with light from the display, without the need to hold a device in one's hand.
In this study, the generic findings for eXtended Reality (XR) in TR 26.928 have been further analysed with a specific focus on Augmented Reality (AR) experiences and, in particular, on a new device type, AR glasses. Device-centric functions of AR glasses are defined, along with different device types. Of particular relevance are 5G STandalone AR (STAR) UEs, i.e. devices with sufficient capabilities to render rich AR experiences on the device itself, and 5G EDGe-Dependent AR (EDGAR) UEs, for which edge-based rendering support is required to provide rich AR experiences. Three basic functions are introduced: the AR Runtime, the Scene Manager and the 5G Media Access Function. Basic AR processes are defined, a comprehensive summary of AR-related media formats is provided, and the relevant work in external organizations is summarized.
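The three functional blocks above can be illustrated with a minimal, non-normative sketch of one frame of a STAR-type UE pipeline. All class and method names here are hypothetical stand-ins for the AR Runtime, Scene Manager and 5G Media Access Function defined in this report, not actual APIs:

```python
# Illustrative sketch (not normative) of the three functional blocks of an
# AR glass UE, wired into one pass of a per-frame loop. All names are
# hypothetical; a real UE would use OpenXR-style runtime calls and 5GMS
# delivery protocols.
from dataclasses import dataclass


@dataclass
class Pose:
    position: tuple       # (x, y, z) in metres
    orientation: tuple    # quaternion (x, y, z, w)


class MediaAccessFunction:
    """Fetches scene and media content over the 5G System (stubbed here)."""

    def fetch_scene(self, url: str) -> dict:
        # A real implementation would stream an entry point (e.g. a glTF
        # scene description); here a stub scene with one anchored node.
        return {"url": url, "nodes": ["anchor_0"]}


class SceneManager:
    """Composes and renders the scene graph built from fetched media."""

    def __init__(self):
        self.scene = None

    def load(self, scene: dict) -> None:
        self.scene = scene

    def render(self, pose: Pose) -> str:
        return f"rendered {len(self.scene['nodes'])} node(s) at {pose.position}"


class ARRuntime:
    """Tracks the device pose and composites frames to the display."""

    def current_pose(self) -> Pose:
        # Stub: identity pose at the origin.
        return Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0))


def frame_loop_once(runtime: ARRuntime, scene_mgr: SceneManager,
                    maf: MediaAccessFunction, url: str) -> str:
    """One frame: fetch scene, compose it, render for the current pose."""
    scene_mgr.load(maf.fetch_scene(url))
    return scene_mgr.render(runtime.current_pose())
```

In an EDGAR-type UE the Scene Manager's rendering step would instead be delegated to the edge, with only composition and pose correction remaining on the device.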
Based on core use cases, different scenarios are mapped to the 5G System architecture, namely (i) immersive media downlink streaming, (ii) interactive immersive services, (iii) 5G cognitive/spatial computing immersive services, and (iv) AR conversational services. Potential normative work is identified and summarized in clause 8.
Based on the details in the report, the following next steps are proposed.
In the short-term:
  • Document the relevant 5G generic architecture for real-time media delivery based on the 5GMS architecture as addressed in clause 8.2.
  • Establish the concept of 5G media service enablers as introduced in clause 8.3 and make use of the concept to define relevant AR media service enablers.
  • Define a 5G real-time communication media service enabler to support different low-latency streaming and conversational AR related services based on the considerations in clause 8.4.
  • Define media capabilities for AR glasses in a service-independent manner based on the considerations in clause 8.5. The outcomes may affect the other items, especially the 5G real-time communication media service enabler and the IMS-based conversational services.
  • Based on the work above, define a split rendering media service enabler to support EDGAR devices, as addressed in clause 8.6.
  • Study options for smartly tethering AR glasses based on the discussion in clause 8.7.
  • Develop the extension of IMS-based AR conversational services and shared AR experiences, including an extended MTSI terminal architecture, as addressed in clause 8.8.
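The split-rendering item above (supporting EDGAR devices) can be sketched as a single network round trip: the glass sends a predicted pose uplink, the edge renders a frame for that pose, and the glass warps out the residual pose error before display. This is a reduced, one-dimensional illustration under stated assumptions; every function name is hypothetical and the report does not prescribe this design:

```python
# Hypothetical one-round-trip sketch of split rendering for an EDGAR UE.
# The pose is reduced to a single scalar for illustration; a real system
# predicts a full 6DoF pose and applies late-stage reprojection on the glass.

def predict_pose(pose: float, velocity: float, latency_s: float) -> float:
    """Extrapolate the pose along the motion vector over the expected
    motion-to-render-to-photon latency."""
    return pose + velocity * latency_s


def edge_render(predicted_pose: float) -> dict:
    """Edge side: render the view for the predicted pose and return it
    together with that pose, so the device can reproject against it."""
    return {"frame": f"view@{predicted_pose:.3f}", "render_pose": predicted_pose}


def reproject(frame: dict, actual_pose: float) -> float:
    """Device side: the residual pose error the display warp must remove."""
    return actual_pose - frame["render_pose"]


# One round trip with an assumed 100 ms latency budget and a head moving
# at 0.5 units/s. If the prediction is exact, the residual error is zero.
pose, velocity, latency = 1.0, 0.5, 0.100
frame = edge_render(predict_pose(pose, velocity, latency))
actual = pose + velocity * latency   # where the head actually ended up
residual = reproject(frame, actual)  # 0.0 when prediction matches motion
```

The design point this illustrates is why pose prediction and device-side correction belong in a split-rendering enabler: the network round trip makes the edge-rendered pose stale by the time the frame reaches the display.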
In the mid-term:
  • Add issues around semantic perception and spatial mapping to an AI/ML study, taking into account the findings in clauses 4.2.3 and 4.2.5 as well as TR 22.874.
All work topics would benefit from being carried out in close coordination with other 3GPP groups on 5G System and radio-related matters, edge computing and rendering, as well as in communication with experts in MPEG on the MPEG-I project and with Khronos on their work on OpenXR, glTF and Vulkan/OpenGL. A follow-up workshop based on the information in clause 4.6.9 may be conducted in order to explore additional synergies and complementary work in different organizations in the XR/AR domain.
