
TR 26.928
Extended Reality (XR) in 5G

V18.0.0 (PDF)  2023/03  133 p.
V17.0.0  2022/03  133 p.
V16.1.0  2020/12  133 p.
Rapporteur: Dr. Thomas Stockhammer, Qualcomm CDMA Technologies

Table of Contents for TR 26.928, Word version 18.0.0


List of Figures and Tables

Figure 4.1-1: Different Types of Realities and some applications
Figure 4.1-2: Different degrees of freedom for a user in extended realities
Figure 4.1-3: Different degrees of freedom
Figure 4.1-4: Right-Handed Coordinate system
Figure 4.1-5: Simplified Illustration of XR Spaces
Figure 4.2.1-1: Environmental Awareness in XR Applications
Table 4.2.2-1: Interaction delay tolerance in traditional gaming (from [19])
Figure 4.3.2-1: 5G-XR functions integrated in 5G System
Figure 4.3.2-2: 5G-XR Interfaces and Architecture
Table 4.3.3-1: Standardized 5QI to QoS characteristics mapping (identical to Table 5.7.4.1-1 in 3GPP TS 23.501 [10])
Figure 4.3.5-1: Cloud and Edge Processing
Figure 4.4.1-1: XR engine and ecosystem landscape today and in the future as seen by OpenXR © Khronos
Figure 4.4.1-2: CPU and GPU operations for XR applications
Figure 4.4.2-1: Rasterized (left) and ray-tracing based (right) rendering
Table 4.5-1: Expected Video coding standards performance and bitrate target
Figure 4.6.2-1: Examples of Spherical to 2D mappings
Figure 4.6.2-2: Video Signal Representation
Figure 4.6.3-1: Elements necessary for mesh representations © Wikipedia (Mesh_overview.jpg: the original uploader was Rchoetzlein at English Wikipedia; derivative work: Lobsterbake [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)])
Figure 4.6.7-1: Example of a capture and production facility for Point Cloud/3D meshes
Figure 4.6.7-2: Example of the 32 images captured simultaneously at one time instance in a studio
Figure 4.6.7-3: 3D mesh generation and production workflow
Figure 4.6.7-4: Example of a dense depth map calculated per frame for each stereo camera pair
Figure 4.6.7-5: Example of resulting point cloud (left), and 3D models such as meshing, simplification and texturing (from second left to right)
Figure 4.8-1: XR Form Factors
Figure 4.8-2: Temperature rise vs. power density
Table 4.8-1: XR Device Types
Table 4.10: Overview of Use cases as documented in Annex A
Table 5.1-1: Core use case mapping to Annex A
Figure 5.2-1: Offline Sharing 3D Objects and MR Scenes
Table 5.2-1: Overview of potential normative work linked to different offline sharing use-cases in Annex A
Figure 5.3-1: Real-time sharing of XR content
Figure 5.4-1: XR Multimedia Streaming
Figure 5.5-1: Online XR Gaming
Figure 5.6-1: XR Critical Mission
Figure 5.7-1: XR Conference
Table 5.7-1: Overview of potential normative work linked to different conversational/conferencing use-cases in Annex A
Figure 5.8-1: Spatial Audio Multiparty Call
Figure 6.2.2-1: Viewport Independent Delivery
Figure 6.2.3-2: Viewport-dependent Streaming
Figure 6.2.4-1: Viewport rendering in Network
Figure 6.2.5-1: Split Rendering with Asynchronous Time Warping (ATW) Correction
Figure 6.2.6-1: VR Split Rendering with XR Viewport Rendering in Device
Figure 6.2.7-1: XR Distributed Computing Architecture
Figure 6.2.8-1: General architecture for XR conversational and conference services
Table 6.3-1: Initial Traffic Characteristics for different architectures
Table A.1-1: Proposed Use Case Collection Template
Table A.1-2: Overview of Use cases
Figure A.13-1: Example image of a photo-realistic 360-degree communication experience
Figure A.13-2: Functional blocks of end-to-end communication
Figure A.14-1: Example image of a virtual 3D experience with photo-realistic user representations
Figure A.14-2: Functional blocks of end-to-end communication
Figure A.15-1: Physical scenario
Figure A.15-2: Virtual scenario
Figure A.16-1
Figure A.17-1
Figure A.24-1
