
TR 26.861
Investigations on Test Methodologies
for Immersive Audio Systems

V15.0.0  (2018/09)  48 p.
Rapporteur:
Mr. Schevciw, Andre
Qualcomm UK Ltd

Full Table of Contents for TR 26.861 (Word version 15.0.0)

0  Introduction  p. 5
1  Scope  p. 6
2  References  p. 6
3  Definitions  p. 6
4  Perceptual Audio Quality Attributes for Immersive Audio  p. 7
4.1  Overview  p. 7
4.2  Considerations on quality attributes for assessing immersive audio systems  p. 7
4.2.1  Input provided in S4-170836 - Test Methodology for the Assessment of Audio Systems  p. 7
4.2.1.1  Proposal of relevant perceptual quality attributes  p. 7
4.2.1.2  Listening Test Methodology  p. 7
4.2.1.3  Experiments with Absolute Category Rating  p. 7
4.2.1.4  Possibilities with Comparative Listening Tests  p. 7
4.2.1.5  Conclusions  p. 9
4.2.1.6  References  p. 9
4.2.2  Input provided in S4-170914 - On Audio Quality Attributes  p. 9
4.2.2.1  Introduction  p. 9
4.2.2.2  Audio Quality Attribute Elicitation  p. 9
4.2.2.3  The Sound Quality Wheel  p. 10
4.2.2.4  Proposals on Quality Attributes for Audio Capture Tests  p. 11
4.2.2.5  Quality Dimensions for Audio Coding and Transmission Tests  p. 11
4.2.2.6  References  p. 12
4.2.2.7  Appendix A  p. 13
4.2.3  Input provided in S4-171225 - On Spatial Audio Quality Assessment  p. 13
4.2.3.1  Introduction  p. 13
4.2.3.2  On audio attributes and categories  p. 13
4.2.3.2.1  Audio attributes in ITU-R recommendations  p. 13
4.2.3.2.2  Audio attributes and categories from a lexical study  p. 14
4.2.3.2.3  Attributes related to binaural rendering  p. 15
4.2.3.3  On subjective test methodologies  p. 15
4.2.3.3.1  Choice of listening instrument: loudspeakers vs. headphones  p. 15
4.2.3.3.2  Methodology used to evaluate the binaural rendering of different audio capture systems  p. 16
4.2.3.3.3  Extended MUSHRA methodology with anchors to cover multiple degradations/categories  p. 16
4.2.3.4  Conclusion  p. 17
4.2.3.5  References  p. 17
4.2.4  Input provided in S4-180125 - On Auditory Assessment of Audio Systems  p. 17
4.2.4.1  Introduction  p. 17
4.2.4.2  Listening Test Material  p. 17
4.2.4.3  Listening Test Methodology and Environment  p. 18
4.2.4.4  Results  p. 18
4.2.4.4.0  General  p. 18
4.2.4.4.1  Postprocessing of Votes  p. 18
4.2.4.4.2  Mapping to absolute scale  p. 20
4.2.4.4.3  Predictability of overall quality  p. 21
4.2.4.5  Conclusions  p. 21
4.2.4.6  References  p. 22
4.2.5  Input provided in S4-180144 - High Dimensional Assessment of Spatial Audio Quality  p. 22
4.2.5.1  Abstract  p. 22
4.2.5.2  Perceptual attributes of spatial audio  p. 22
4.2.5.3  Adaptive audio (ADA) test methodology  p. 23
4.2.5.4  Suitability analysis of ADA methodology  p. 25
4.2.5.4.1  ADA in free-field listening environments  p. 25
4.2.5.4.2  Analysis of free-field listening environments  p. 27
4.2.5.5  Applicability of ADA  p. 28
4.2.5.5.1  ADA assessment for binaural sound systems  p. 28
4.2.5.5.2  Trajectory Analysis  p. 29
4.2.5.6  References  p. 30
4.3  Quality attributes of relevance for 3GPP immersive audio systems  p. 30
5  Considerations on Test Methodologies for Immersive Audio Systems  p. 30
5.1  Input provided in S4-180464 - On the validity of the CIBR baseline testing for VR Stream  p. 30
5.1.1  Summary  p. 30
5.1.2  Introduction  p. 31
5.1.3  Comparison Category Rating Tests for Reference Renderer  p. 31
5.1.3.1  Test design  p. 31
5.1.3.1.1  Test Material  p. 31
5.1.3.1.2  HRTF Selection  p. 33
5.1.3.1.3  Listeners  p. 33
5.1.3.1.4  Test Description, Interface and Randomization  p. 33
5.1.3.2  Results  p. 34
5.1.3.3  Conclusion and Proposal  p. 35
5.1.3.4  References  p. 36
5.2  Input provided in S4-180520 - On the impact of individualized HRTF and HpTF  p. 36
5.2.1  Summary  p. 36
5.2.2  Introduction  p. 36
5.2.3  Test Methodology  p. 36
5.2.3.1  Goal of the Experiment  p. 36
5.2.3.2  Listening Experiment Paradigm  p. 36
5.2.3.3  Test Environment  p. 37
5.2.3.4  HpIR Measurement & Compensation Filter Generation  p. 38
5.2.3.5  BRIR Measurement  p. 39
5.2.3.6  Test and Reference Conditions generation  p. 39
5.2.3.6.1  Source Material  p. 39
5.2.3.6.2  Test Conditions  p. 40
5.2.3.6.3  Reference Conditions  p. 40
5.2.3.6.4  Test and Reference Conditions Naming Convention  p. 40
5.2.3.7  Training Session  p. 41
5.2.3.8  Listening Session  p. 42
5.2.4  Participants  p. 43
5.2.5  Results  p. 43
5.2.6  Analysis  p. 43
5.2.7  Conclusion  p. 43
5.3  Input provided in S4-180472 - On ITU-R BS.1534 (MUSHRA)  p. 44
5.3.1  Introduction  p. 44
5.3.2  Description  p. 44
5.3.2.1  General  p. 44
5.3.3  Conclusion  p. 44
5.3.4  References  p. 44
5.4  Input provided in S4-180805 - Verification of CIBR configuration for FOA  p. 44
5.4.1  Summary  p. 44
5.4.2  FOA rendering in CIBR  p. 45
5.4.2.1  Test Design  p. 45
5.4.2.2  Test Material  p. 45
5.4.2.3  Listening Environment  p. 45
5.4.2.4  Listening Panel  p. 46
5.4.2.5  Results  p. 46
5.4.3  Conclusions  p. 47
5.4.4  References  p. 47
Change history  p. 48
