For the purposes of the present document, the terms and definitions given in TR 21.905 and the following apply:
360-degree video:
A real-world visual scene captured by a set of cameras or a camera device with multiple lenses and sensors covering the sphere in all directions around the centre point of the camera set or camera device. The term 360-degree video may also be used to include limited 360-degree video.
Limited 360-degree video:
A 360-degree video in which the visual scene does not cover the entire sphere around the centre point of the camera set or camera device but only a part of it. A limited 360-degree video may be limited: i) in the horizontal field, to less than 360 degrees; ii) in the vertical field, to less than 180 degrees; or iii) in both the vertical and horizontal fields.
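As a hedged, non-normative sketch of the classification above (the function name is my own, not from the specification):

```python
# Non-normative sketch of the definition above: a 360-degree video is
# "limited" when its captured field does not cover the full sphere,
# i.e. less than 360 degrees horizontally and/or less than 180 vertically.
def is_limited_360(horizontal_deg: float, vertical_deg: float) -> bool:
    return horizontal_deg < 360.0 or vertical_deg < 180.0
```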
AMR or AMR-NB:
Both names refer to the AMR codec (TS 26.071) and are used interchangeably in this specification.
A bitstream that conforms to a video or audio encoding format.
A sequence of bits that forms the representation of one or more coded video or audio sequences.
CHEM:
The Coverage and Handoff Enhancements using Multimedia error robustness feature.
Codec mode:
Used for the AMR and AMR-WB codecs to identify one specific bitrate. For example, AMR includes 8 codec modes (excluding SID), each with a different bitrate.
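As an illustrative, non-normative sketch, the eight AMR codec modes and their bit-rates (per TS 26.071) can be modelled as a simple lookup; the names here are my own:

```python
# Non-normative sketch: the 8 AMR codec modes (excluding SID) and their
# bit-rates in kbit/s; each codec mode identifies one specific bit-rate.
AMR_MODES_KBPS = {
    0: 4.75, 1: 5.15, 2: 5.90, 3: 6.70,
    4: 7.40, 5: 7.95, 6: 10.20, 7: 12.20,
}

def bitrate_bps(mode: int) -> int:
    """Bit-rate of an AMR codec mode in bit/s."""
    return round(AMR_MODES_KBPS[mode] * 1000)
```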
Constrained terminal:
UE that is (i) operating in radio access capability category series "M" capable of supporting conversational services, and/or (ii) a wearable device which is constrained in size, weight or power consumption (e.g. connected watches), excluding smartphones and feature phones.
DCMTSI client:
A data channel capable MTSI client supporting data channel media as defined in clause 6.2.10.
DCMTSI client in terminal:
A DCMTSI client that is implemented in a terminal or UE. The term "DCMTSI client in terminal" is used in this document when entities such as MRFP, MRFC or media gateways are excluded.
Dual-mono:
A variant of 2-channel stereo encoding where two instances of a mono codec are used to encode a 2-channel stereo signal.
E-UTRAN:
Evolved UTRAN is an evolution of the 3G UMTS radio-access network towards a high-data-rate, low-latency and packet-optimized radio-access network.
EVS codec:
The EVS codec includes two operational modes: EVS Primary operational mode ('EVS Primary mode') and EVS AMR-WB Inter-Operable mode ('EVS AMR-WB IO mode'). When using EVS AMR-WB IO mode, the speech frames are bitstream interoperable with the AMR-WB codec. Frames generated by an EVS AMR-WB IO mode encoder can be decoded by an AMR-WB decoder, without the need for transcoding. Likewise, frames generated by an AMR-WB encoder can be decoded by an EVS AMR-WB IO mode decoder, without the need for transcoding.
EVS Primary mode:
Includes 11 bit-rates for fixed-rate or multi-rate operation; 1 average bit-rate for variable bit-rate operation; and 1 bit-rate for SID (TS 26.441). The EVS Primary mode can encode narrowband, wideband, super-wideband and fullband signals. None of these bit-rates are interoperable with the AMR-WB codec.
EVS AMR-WB IO mode:
Includes 9 codec modes and SID. All are bitstream interoperable with the AMR-WB codec (TS 26.171).
Field of View:
The extent of the visible area, expressed with vertical and horizontal angles, in degrees, in the 3GPP 3DOF reference system as defined in TS 26.118.
Fisheye video:
Video captured by a wide-angle camera lens that usually captures an approximately hemispherical field of view and projects it as a circular image.
Frame Loss Rate (FLR):
The percentage of speech frames not delivered to the decoder. FLR includes speech frames that are not received in time to be used for decoding.
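A minimal sketch of how FLR could be computed under this definition, counting late frames as not delivered (function and parameter names are my own):

```python
# Minimal sketch of the FLR definition: frames lost in transit and frames
# that arrive too late to be used for decoding both count as "not delivered".
def frame_loss_rate(frames_sent: int, frames_decodable: int) -> float:
    """FLR as a percentage of speech frames not delivered to the decoder."""
    if frames_sent == 0:
        return 0.0
    return 100.0 * (frames_sent - frames_decodable) / frames_sent
```

For example, 1000 frames sent with 970 available to the decoder in time gives an FLR of 3.0 %.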
ITT4RT client:
MTSI client supporting the Immersive Teleconferencing and Telepresence for Remote Terminals (ITT4RT) feature, as defined in Annex X.
ITT4RT-Tx client:
ITT4RT client only capable of sending immersive video.
ITT4RT-Rx client:
ITT4RT client only capable of receiving immersive video.
ITT4RT MRF:
An ITT4RT client implemented by functionality included in the MRFC and the MRFP.
ITT4RT client in terminal:
An ITT4RT client that is implemented in a terminal or UE. The term "ITT4RT client in terminal" is used in this document when entities such as the ITT4RT MRF are excluded.
Mode-set:
Used for the AMR and AMR-WB codecs to identify the codec modes that can be used in a session. A mode-set can include one or more codec modes.
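For AMR and AMR-WB, the mode-set is negotiated in SDP through the "mode-set" fmtp parameter of the RTP payload format (RFC 4867). A hedged parsing sketch (the helper name is my own):

```python
# Hedged sketch: extracting the mode-set from an AMR/AMR-WB fmtp line
# (RFC 4867). When mode-set is absent, no restriction is signalled and
# all codec modes are allowed (AMR: modes 0..7).
def parse_mode_set(fmtp_line: str) -> list[int]:
    params = fmtp_line.split(maxsplit=1)[-1]
    for param in params.split(";"):
        name, _, value = param.strip().partition("=")
        if name == "mode-set":
            return sorted(int(m) for m in value.split(","))
    return list(range(8))
```

For example, parse_mode_set("a=fmtp:97 mode-set=0,2,5,7; mode-change-period=2") yields [0, 2, 5, 7].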
MSMTSI client:
A multi-stream capable MTSI client supporting multiple streams as defined in Annex S. An MTSI client may support multiple streams, even of the same media type, without being an MSMTSI client. Such an MTSI client may, for example, add a second video to an ongoing video telephony session as shown in Annex A.11. In that case, the MTSI client is an MSMTSI client only if it is fully compliant with Annex S.
MSMTSI MRF:
An MSMTSI client implemented by functionality included in the MRFC and the MRFP.
MSMTSI client in terminal:
An MSMTSI client that is implemented in a terminal or UE. The term "MSMTSI client in terminal" is used in this document when entities such as MRFP, MRFC or media gateways are excluded.
MTSI client:
A function in a terminal or in a network entity (e.g. an MRFP) that supports MTSI.
MTSI client in terminal:
An MTSI client that is implemented in a terminal or UE. The term "MTSI client in terminal" is used in this document when entities such as MRFP, MRFC or media gateways are excluded.
MTSI media gateway (or MTSI MGW):
A media gateway that provides interworking between an MTSI client and a non-MTSI client, e.g. a CS UE. The term MTSI media gateway is used in a broad sense, as it is outside the scope of the present specification to determine whether certain functionality should be implemented in the MGW or in the MGCF.
Omnidirectional media:
Media such as image or video and its associated audio that enable rendering according to the user's viewing orientation, if consumed with a head-mounted device, or according to the user's desired viewport otherwise, as if the user were in the spot where and when the media was captured.
Operational mode:
Used for the EVS codec to distinguish between EVS Primary mode and EVS AMR-WB IO mode.
Overlay:
A piece of visual media, rendered over omnidirectional video or image, or a viewport.
Pose:
Position and rotation information associated with a viewport.
Projected picture:
Picture that has a representation format specified by an omnidirectional video projection format.
Projection:
Inverse of the process by which the samples of a projected picture are mapped to a set of positions identified by a set of azimuth and elevation coordinates on a unit sphere.
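The azimuth/elevation-to-unit-sphere mapping referred to above can be sketched as follows; an OMAF-style coordinate convention is assumed here, not quoted from this document:

```python
import math

# Assumed OMAF-style convention: azimuth rotates in the horizontal plane,
# elevation is measured from the horizontal plane; both in degrees.
def sphere_point(azimuth_deg: float, elevation_deg: float) -> tuple:
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (math.cos(el) * math.cos(az),   # x: toward azimuth 0, elevation 0
            math.cos(el) * math.sin(az),   # y
            math.sin(el))                  # z: toward the zenith
```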
Simulcast:
Simultaneously sending different encoded representations (simulcast formats) of a single media source (e.g. originating from a single microphone or camera) in different simulcast streams.
Simulcast format:
The encoded format used by a single simulcast stream, typically represented by an SDP format and all SDP attributes that apply to that particular SDP format, indicated in RTP by the RTP header payload type field.
Simulcast stream:
The RTP stream carrying a single simulcast format in a simulcast.
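As a hedged illustration of how these three terms relate (all values and names below are my own examples, not from the specification):

```python
# Hedged illustration: one camera source offered as two simulcast streams,
# each carrying its own simulcast format identified by an RTP payload type
# (the SDP format). Codecs and resolutions are example values only.
SIMULCAST_FORMATS = {
    97: {"codec": "H.265", "resolution": (1280, 720)},  # main encoding
    98: {"codec": "H.265", "resolution": (320, 180)},   # thumbnail encoding
}

def simulcast_format(payload_type: int) -> dict:
    """The RTP header payload type field identifies the simulcast format."""
    return SIMULCAST_FORMATS[payload_type]
```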
Viewport:
Region of omnidirectional image or video suitable for display and viewing by the user.
For the purposes of the present document, the abbreviations given in TR 21.905 and the following apply:
3DOF	3 Degrees of freedom
5GC	5G Core Network
AL-SDU	Application Layer - Service Data Unit
AMR-NB	Adaptive Multi-Rate - NarrowBand
AMR-WB	Adaptive Multi-Rate - WideBand
AMR-WB IO	Adaptive Multi-Rate - WideBand Inter-operable Mode, included in the EVS codec
ANBR	Access Network Bitrate Recommendation
ANBRQ	Access Network Bitrate Recommendation Query
APP	APPlication-defined RTCP packet
ARQ	Automatic repeat ReQuest
ATCF	Access Transfer Control Function
ATGW	Access Transfer GateWay
AVC	Advanced Video Coding
BFCP	Binary Floor Control Protocol
CCM	Codec Control Messages
CDF	Cumulative Distribution Function
C-DRX	Connected Mode DRX
CHEM	Coverage and Handoff Enhancements using Multimedia error robustness feature
CMR	Codec Mode Request
cps	characters per second
CSCF	Call Session Control Function
CTM	Cellular Text telephone Modem
CVO	Coordination of Video Orientation
DBI	Delay Budget Information
DRB	Data Radio Bearer
DTLS	Datagram Transport Layer Security
DTMF	Dual Tone Multi-Frequency
ECN	Explicit Congestion Notification
ECN-CE	ECN Congestion Experienced
ECT	ECN Capable Transport
eNodeB	E-UTRAN Node B
EVS	Enhanced Voice Services
FECC	Far End Camera Control
FIR	Full Intra Request
FLR	Frame Loss Rate
FoIP	Facsimile over IP
FOV	Field Of View
GIP	Generic IP access
GOB	Group Of Blocks
HARQ	Hybrid - ARQ
HEVC	High Efficiency Video Coding
HMD	Head Mounted Display
HSPA	High Speed Packet Access
ICM	Initial Codec Mode
IDR	Instantaneous Decoding Refresh
IFP	Internet Facsimile Protocol
IFT	Internet Facsimile Transfer
IMS	IP Multimedia Subsystem
IPv4	Internet Protocol version 4
IRAP	Intra Random Access Point
ITT4RT	Immersive Teleconferencing and Telepresence for Remote Terminals
ITU-T	International Telecommunications Union - Telecommunications
JBM	Jitter Buffer Management
MGCF	Media Gateway Control Function
MIME	Multipurpose Internet Mail Extensions
MPEG	Moving Picture Experts Group
MRFC	Media Resource Function Controller
MRFP	Media Resource Function Processor
MSMTSI	Multi-Stream Multimedia Telephony Service for IMS
MSRP	Message Session Relay Protocol
MTSI	Multimedia Telephony Service for IMS
MTU	Maximum Transfer Unit
NTP	Network Time Protocol
OMAF	Omnidirectional MediA Format
PCM	Pulse Code Modulation
PDCP	Packet Data Convergence Protocol
PDP	Packet Data Protocol
PLI	Picture Loss Indication
PLR	Packet Loss Ratio
POI	Point Of Interconnect
PSTN	Public Switched Telephone Network
PTZF	Pan, Tilt, Zoom and Focus
QCI	QoS Class Identifier
QMC	QoE Measurement Collection
QoE	Quality of Experience
QoS	Quality of Service
ROI	Region of Interest
RTCP	RTP Control Protocol
RTP	Real-time Transport Protocol
SB-ADPCM	Sub-Band Adaptive Differential PCM
SC-VBR	Source Controlled VBR
SCTP	Stream Control Transmission Protocol
SDAP	Service Data Adaptation Protocol
SDP	Session Description Protocol
SDPCapNeg	SDP Capability Negotiation
SEI	Supplemental Enhancement Information
SIP	Session Initiation Protocol
SRVCC	Single Radio Voice Call Continuity
TISPAN	Telecoms and Internet converged Services and Protocols for Advanced Network
TMMBN	Temporary Maximum Media Bit-rate Notification
TMMBR	Temporary Maximum Media Bit-rate Request
UDP	User Datagram Protocol
UDPTL	Facsimile UDP Transport Layer (protocol)
VDP	Viewport Dependent Processing
VoIP	Voice over IP
VOP	Video Object Plane
WebRTC	Web Real-Time Communication