This document describes the service and performance requirements for the operation of professional video, audio and imaging via a 5G system, including a UE, NG-RAN and 5G Core network.
The aspects addressed in this document include:
- Network service requirements specific to the operation of professional video, imaging and audio for PLMNs and non-public networks (NPNs)
- New key performance indicators (KPIs) for PLMNs and NPNs
- KPIs for Multicast and Broadcast Services
- Network exposure requirements
- Application-specific requirements for video, imaging and audio
The following documents contain provisions which, through reference in this text, constitute provisions of the present document.
References are either specific (identified by date of publication, edition number, version number, etc.) or non-specific.
For a specific reference, subsequent revisions do not apply.
For a non-specific reference, the latest version applies. In the case of a reference to a 3GPP document (including a GSM document), a non-specific reference implicitly refers to the latest version of that document in the same Release as the present document.
For the purposes of the present document, the terms and definitions given in TR 21.905 and the following apply. A term defined in the present document takes precedence over the definition of the same term, if any, in TR 21.905.
the process by which audio and video content are combined in order to produce media content. This could be for live events, media production, conferences or other professional applications.
a means of reducing video file or stream sizes to suit various applications. Different applications apply different compression methodologies.
Mezzanine compression: low-latency, low-complexity compression applied to a video signal in order to retain the maximum amount of information while reducing the stream size to fit the available bandwidth.
Visually lossless compression: the maximum amount of compression that can be applied to a video signal before visible compression artefacts appear.
Highly compressed: use of compression to distribute content over very low-bandwidth connections, where delivering the content is more important than the quality of the image.
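The three compression categories above trade quality against bitrate. The following sketch illustrates the idea; the compression ratios and the video format used here are assumptions for illustration only, not values taken from this specification.

```python
# Illustrative only: the compression ratios below are assumed example values,
# not normative figures from this specification.

def uncompressed_bitrate_bps(width, height, fps, bits_per_pixel):
    """Raw video bitrate in bits per second."""
    return width * height * fps * bits_per_pixel

def compressed_bitrate_bps(raw_bps, ratio):
    """Bitrate after applying a given compression ratio (e.g. 4 means 4:1)."""
    return raw_bps / ratio

# Assumed source format: 1080p50 with an average of 20 bits per pixel.
raw = uncompressed_bitrate_bps(1920, 1080, 50, 20)

# Assumed example ratios for the three categories defined above.
for name, ratio in [("mezzanine", 4),
                    ("visually lossless", 10),
                    ("highly compressed", 100)]:
    print(f"{name}: {compressed_bitrate_bps(raw, ratio) / 1e6:.1f} Mbit/s")
```

The point of the sketch is the ordering: mezzanine compression keeps the stream closest to the raw rate, while a highly compressed stream sacrifices quality to fit very low-bandwidth links.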
The time characteristic of an event or signal that recurs at known, periodic time intervals.
Imaging System Latency:
The time it takes to generate an image from a source, apply a certain amount of processing, transfer it to a destination and render the resulting image on a suitable display device, measured from the moment a specific event happens to the moment that same event is displayed on a screen.
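The definition above describes an end-to-end sum of per-stage delays, from capture of an event to its display. A minimal sketch follows; the stage names and delay values are illustrative assumptions, not normative figures from this specification.

```python
# Imaging system latency modelled as the sum of per-stage delays along the
# capture -> processing -> transfer -> render chain. All values are assumed
# example figures, not requirements.

def imaging_system_latency_ms(stages):
    """Total latency (milliseconds) as the sum of per-stage delays."""
    return sum(stages.values())

stages = {
    "capture": 10.0,      # sensor exposure and readout at the source
    "processing": 8.0,    # e.g. colour correction, compression
    "transfer": 12.0,     # transport to the destination
    "render": 10.0,       # decode and display on the output device
}
print(imaging_system_latency_ms(stages))  # 40.0
```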
A specialist type of earphone, usually worn by a performer, in which an audio signal is fed to a wireless receiver and an attached earphone.
Media clocks are used to control the flow (timing and period) of audio/video data acquisition, processing and playback. Typically, media clocks are generated locally in every mobile or stationary device and synchronized to a master clock derived from an externally sourced grandmaster clock.
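A media clock of this kind can be pictured as mapping sample indices to timestamps at a nominal rate, with an offset aligning the local clock to the external grandmaster reference. The sketch below assumes a 48 kHz audio sample clock; the rate, function name and offset are illustrative assumptions, not taken from this specification.

```python
# A minimal sketch of a media clock, assuming a nominal 48 kHz sample rate.
# The offset represents alignment of the local clock to an external
# grandmaster reference; all values are assumed for illustration.

def media_clock_time_s(sample_index, sample_rate_hz=48_000, offset_s=0.0):
    """Timestamp (seconds) of a given sample under a nominal media clock
    rate, shifted by an offset aligning to a grandmaster reference."""
    return sample_index / sample_rate_hz + offset_s

print(media_clock_time_s(48_000))                 # 1.0
print(media_clock_time_s(24_000, offset_s=0.25))  # 0.75
```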
The maximum end-to-end latency between the analogue input at the audio source (e.g. a wireless microphone) and the analogue output at the audio sink (e.g. an IEM). It includes the audio application, application interfacing and the time delay introduced by the wireless transmission path.
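This latency can be checked against a budget by summing the component delays named in the definition. The sketch below is a hedged illustration of a wireless-microphone-to-IEM path; the component values and the budget figure are assumptions, not requirements from this specification.

```python
# Hedged sketch: checking a mouth-to-ear latency budget for a wireless
# microphone -> IEM path. The component delays and the budget value are
# assumed example figures, not requirements from this specification.

def mouth_to_ear_latency_ms(components):
    """End-to-end latency from analogue input to analogue output (ms)."""
    return sum(components.values())

components = {
    "audio_application": 1.0,        # A/D conversion and audio processing
    "application_interfacing": 0.5,  # handover between application and modem
    "wireless_transmission": 2.0,    # delay on the wireless transmission path
}

budget_ms = 4.0  # assumed target budget for a live performance
total = mouth_to_ear_latency_ms(components)
print(total, total <= budget_ms)  # 3.5 True
```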
For the purposes of the present document, the abbreviations given in TR 21.905 and the following apply. An abbreviation defined in the present document takes precedence over the definition of the same abbreviation, if any, in TR 21.905.