TR 26.918 (SA4)
Virtual Reality (VR)
Media Services over 3GPP

use "3GPP‑Page" to get the Word version
use "ETSI‑search" to get the PDF version
For a better overview, the Table of Contents (ToC) is reproduced below.
V16.0.0 (Wzip)  2018/12  129 p.
V15.2.0 (PDF)  2018/03  118 p.

Rapporteur:  Mr. Teniou, Gilles

Virtual Reality (VR) is the ability to be virtually present in a space created by the rendering of natural and/or synthetic image and sound, correlated with the movements of the immersed user and allowing interaction with that world.
The immersive multimedia experience has been an exploration topic for several years. With the recent progress made in rendering devices, such as head-mounted displays (HMDs), a significant quality of experience can now be offered.
Before any possible standardization, it is necessary to study the field:
  • to understand how the equipment used for creating such an immersive experience works, e.g. by collecting information on the optical systems and audio rendering processes;
  • to evaluate the relevance of Virtual Reality for 3GPP;
  • to identify the possible points of interoperability, and hence potential standardization.
Use cases for Virtual Reality need to be listed and, where applicable, mapped to existing 3GPP services.
The media formats required to provide the immersive experience need to be identified and potentially evaluated subjectively, so as to extract requirements on minimum device and network capabilities.
The scope of the present document is to investigate the relevance of Virtual Reality in the context of 3GPP. Virtual Reality is the ability to be virtually present in a non-physical world created by the rendering of natural and/or synthetic image and sound, correlated with the movements of the immersed user and allowing interaction with that world. With recent progress made in rendering devices, such as head-mounted displays (HMDs), a significant quality of experience can be offered. By collecting comprehensive information on VR use cases, existing technologies and subjective quality, the report attempts to identify potential gaps and relevant interoperability points that may require further work and potential standardization in 3GPP in order to support VR use cases and experiences in 3GPP user services. The report primarily focuses on 360-degree video and associated audio, which support three degrees of freedom (3DOF).
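The 3DOF model referred to above means that the rendered viewport follows the user's head rotation (yaw, pitch, roll) around a fixed viewing position. As a purely illustrative sketch, not part of TR 26.918, the following Python fragment shows one way a yaw/pitch orientation could be mapped onto an equirectangular 360-degree frame to locate the viewport centre and its angular extent; the function names, the 90-degree field-of-view defaults and the 4K frame size are assumptions made for this example.

# Minimal illustrative sketch, not taken from TR 26.918: how a 3DOF head
# orientation (yaw/pitch; roll ignored for simplicity) selects a viewport
# in an equirectangular (ERP) 360-degree frame. All names, the 90-degree
# field-of-view defaults and the 4K frame size are assumptions for this example.

def viewport_center_pixel(yaw_deg, pitch_deg, frame_width, frame_height):
    """Map a viewing direction to the pixel it lands on in an
    equirectangular projection (longitude -> x, latitude -> y)."""
    lon = ((yaw_deg + 180.0) % 360.0) - 180.0          # wrap yaw to [-180, 180)
    x = int((lon + 180.0) / 360.0 * frame_width) % frame_width
    lat = max(-90.0, min(90.0, pitch_deg))             # clamp pitch to [-90, 90]
    y = int((90.0 - lat) / 180.0 * (frame_height - 1))
    return x, y

def viewport_bounds_deg(yaw_deg, pitch_deg, hfov_deg=90.0, vfov_deg=90.0):
    """Angular extent of the viewport around the viewing direction."""
    return {
        "yaw_min": yaw_deg - hfov_deg / 2.0,
        "yaw_max": yaw_deg + hfov_deg / 2.0,
        "pitch_min": max(-90.0, pitch_deg - vfov_deg / 2.0),
        "pitch_max": min(90.0, pitch_deg + vfov_deg / 2.0),
    }

if __name__ == "__main__":
    # User looks 30 degrees to the right and 10 degrees up in a 4K ERP frame.
    print(viewport_center_pixel(30.0, 10.0, 3840, 1920))   # -> (2240, 852)
    print(viewport_bounds_deg(30.0, 10.0))

In a viewport-dependent scheme such as the one evaluated in clause 7.3, an angular extent of this kind is roughly what a client would use to decide which regions of the 360-degree content to request at high quality.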

Full Table of Contents for TR 26.918, Word version 16.0.0

 


 

1  Scope
2  References
3  Definitions and abbreviations
4  Introduction to Virtual Reality
4.1  Definition
4.2  Video systems
4.3  Audio systems
4.4  Example service architectures
5  Use cases for Virtual Reality
5.1  General overview
5.2  Event broadcast/multicast use cases
5.3  VR streaming
5.4  Distributing 360 A/V content library in 3GPP
5.5  Live services consumed on HMD
5.6  Social TV and VR
5.7  Cinematic VR use cases
5.8  Learning application use cases
5.9  VR calls use cases
5.10  User generated VR use cases
5.11  Virtual world communication
5.12  HMD-based legacy content consumption
5.13  Use cases for Highlight Region(s) in VR video
6  Audio quality evaluation
6.1  Audio quality evaluation of scene-based formats
6.1.1  Introduction
6.1.2  Report of one ITU-R BS.1534-3 binaural listening test for basic audio quality of encoded scene-based audio content with non-individualized HRTF and non-equalized headphones
6.1.3  Report of one ITU-R BS.1534-3 binaural listening test for Localization quality of synthetic scene-based audio content with non-individualized HRTF and non-equalized headphones
6.1.4  Test of the ISO/IEC 23008-3 MPEG-H 3D Audio scene-based coding scheme
6.1.5  Listening test for synthetic scene-based audio content with loudspeaker rendering assessing overall and localization quality with written audio scene descriptions as reference
6.1.6  Report of one test on encoding First-Order Ambisonics with 3GPP enhanced AAC+ with loudspeakers and with non-individualized HRTF and non-equalized headphones
6.1.7  Listening test for coding of First-Order Ambisonics using the EVS codec with loudspeaker rendering
6.2  Audio quality evaluation of object-based formats
6.3  Audio quality evaluation of channel-based formats
7  Video quality evaluation
7.1  Similarity ring metric
7.2  Subjective evaluation of Viewport-independent omnidirectional video streaming
7.3  Subjective evaluation of Viewport-dependent omnidirectional video streaming
7.4  QoE Assessment of Simulator Sickness in VR [R16]
8  Latency and synchronization aspects
9  Gap Analysis, Recommended Objectives and Candidate solutions for VR Use Cases
10  Conclusion
A  Encoding configuration parameters for viewport-independent video quality evaluation
B  Test instructions for viewport-independent video quality evaluation
C  Simulator Sickness Questionnaire (SSQ) [R16]
D  Change history
