Figure 4.1-1 | Different Types of Realities and some applications |
Figure 4.1-2 | Different degrees of freedom for a user in extended realities |
Figure 4.1-3 | Different degrees of freedom |
Figure 4.1-4 | Right-Handed Coordinate System
Figure 4.1-5 | Simplified Illustration of XR Spaces |
Figure 4.2.1-1 | Environmental Awareness in XR Applications |
Table 4.2.2-1 | Interaction delay tolerance in traditional gaming (from [19])
Figure 4.3.2-1 | 5G-XR functions integrated in 5G System |
Figure 4.3.2-2 | 5G-XR Interfaces and Architecture |
Table 4.3.3-1 | Standardized 5QI to QoS characteristics mapping (identical to Table 5.7.4.1-1 in 3GPP TS 23.501 [10]) |
Figure 4.3.5-1 | Cloud and Edge Processing |
Figure 4.4.1-1 | XR engine and ecosystem landscape today and in the future, as seen by OpenXR (© Khronos)
Figure 4.4.1-2 | CPU and GPU operations for XR applications |
Figure 4.4.2-1 | Rasterized (left) and ray-tracing based (right) rendering |
Table 4.5-1 | Expected performance and bitrate targets of video coding standards
Figure 4.6.2-1 | Examples of Spherical to 2D mappings |
Figure 4.6.2-2 | Video Signal Representation |
Figure 4.6.3-1 | Elements necessary for mesh representations © Wikipedia (Mesh_overview.jpg: the original uploader was Rchoetzlein at English Wikipedia; derivative work: Lobsterbake [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0)])
Figure 4.6.7-1 | Example of a capture and production facility for Point Cloud/3D meshes
Figure 4.6.7-2 | Example of the 32 images captured simultaneously at one time instance in a studio |
Figure 4.6.7-3 | 3D mesh generation and production workflow |
Figure 4.6.7-4 | Example of a dense depth map calculated per frame for each stereo camera pair |
Figure 4.6.7-5 | Example of the resulting point cloud (left) and of 3D model processing steps such as meshing, simplification and texturing (second from left to right)
Figure 4.8-1 | XR Form Factors |
Figure 4.8-2 | Temperature rise vs. power density |
Table 4.8-1 | XR Device Types |
Table 4.10-1 | Overview of Use cases as documented in Annex A
Table 5.1-1 | Core use case mapping to Annex A |
Figure 5.2-1 | Offline Sharing 3D Objects and MR Scenes |
Table 5.2-1 | Overview of potential normative work linked to different offline sharing use cases in Annex A
Figure 5.3-1 | Real-time sharing of XR content |
Figure 5.4-1 | XR Multimedia Streaming |
Figure 5.5-1 | Online XR Gaming |
Figure 5.6-1 | XR Critical Mission |
Figure 5.7-1 | XR Conference |
Table 5.7-1 | Overview of potential normative work linked to different conversational/conferencing use cases in Annex A
Figure 5.8-1 | Spatial Audio Multiparty Call |
Figure 6.2.2-1 | Viewport Independent Delivery |
Figure 6.2.3-2 | Viewport-dependent Streaming |
Figure 6.2.4-1 | Viewport rendering in Network |
Figure 6.2.5-1 | Split Rendering with Asynchronous Time Warping (ATW) Correction |
Figure 6.2.6-1 | VR Split Rendering with XR Viewport Rendering in Device |
Figure 6.2.7-1 | XR Distributed Computing Architecture |
Figure 6.2.8-1 | General architecture for XR conversational and conference services |
Table 6.3-1 | Initial Traffic Characteristics for different architectures |
Table A.1-1 | Proposed Use Case Collection Template |
Table A.1-2 | Overview of Use cases |
Figure A.13-1 | Example image of a photo-realistic 360-degree communication experience
Figure A.13-2 | Functional blocks of end-to-end communication |
Figure A.14-1 | Example image of a virtual 3D experience with photo-realistic user representations
Figure A.14-2 | Functional blocks of end-to-end communication |
Figure A.15-1 | Physical scenario |
Figure A.15-2 | Virtual scenario |
Figure A.16-1 | |
Figure A.17-1 | |
Figure A.24-1 | |