
TR 22.827 (SA1)
Study on Audio-Visual Service Production

use "3GPP‑Page" to get the Word version
for a better overview, the Table of Contents (ToC) is reproduced
V17.0.0  (2019/09)  76 pages

Rapporteur:  Dr. Cauduro Dias de Paiva, Rafael

This Technical Report describes relevant use cases and proposes potential service requirements for the 5G system to support the production of audio-visual (AV) content and services.
Previous work assessed certain aspects for local applications (e.g. ultra-reliable low-latency communication and time-synchronization demands). This study addresses the implications for 3GPP of wide-area media production and of additional local applications. Topics studied include demanding locally distributed production scenarios and ad-hoc deployments of high-bandwidth networks providing increased mobility, wider coverage and very low streaming latencies.
Section 4 "Overview" is reproduced below:
The 3GPP system already plays an important role in the distribution of audio-visual (AV) media content and services. Release 14 contains substantial enhancements to deliver TV services of various kinds, from linear TV programmes for mass audiences to custom-tailored on-demand services for mobile consumption. However, 3GPP systems are also expected to become an important tool in the domain of AV content and service production, a market sector with steadily growing global revenues. There are several areas in which 3GPP networks may help to produce audio-visual content and services in a cost-efficient and flexible manner.
AV content and service production can be broadly categorized. The most obvious distinction is between production within a fixed production environment and production at a location outside the premises of a production company. Furthermore, live and non-live productions may come with very different requirements. Mobile 3G and 4G networks are already used quite frequently today: several mobile devices are employed simultaneously in order to achieve the required data rates and to guarantee stable communication. In the broadcasting world this is called bonded cellular contribution.
Newsgathering is an AV production category which is vital for broadcasting companies around the world. Their job is to offer news covering any kind of event or incident which may be of interest to the public. This refers to events which cannot be planned for, as they just happen. Incidents in politics and the economy or natural catastrophes often come out of the blue, and production companies need to react swiftly. The time to set up equipment, for example a local communication network, is a crucial factor. As soon as an important incident becomes known, a newsgathering team is sent to the location to cover what is happening. Reporters may capture audio and video which need to be sent to the home-base production facilities. This requires fast and efficient communication links. In newsgathering, high levels of data compression may be acceptable if otherwise no communication is possible. For HD video feeds, 5 to 10 Mbit/s are needed at a minimum.
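As a rough illustration of what bonded cellular contribution implies for the figures above, the sketch below estimates how many bonded uplinks would be needed to carry one HD news feed. The per-link rate and the overhead margin are assumptions for illustration, not values taken from the TR.

    # Illustrative sketch (not from the TR): number of bonded cellular uplinks
    # needed for one HD feed, given an assumed sustained rate per modem.
    import math

    def links_needed(feed_mbps: float, per_link_mbps: float, overhead: float = 0.15) -> int:
        """Bonded uplinks required, leaving a margin for bonding/FEC overhead (assumed)."""
        effective = per_link_mbps * (1.0 - overhead)   # usable payload rate per link
        return math.ceil(feed_mbps / effective)

    # HD feed at the 5-10 Mbit/s range mentioned in the overview, with an
    # assumed 4 Mbit/s sustained uplink per modem:
    for feed in (5, 10):
        print(feed, "Mbit/s feed ->", links_needed(feed, per_link_mbps=4.0), "links")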
In a typical newsgathering setting more than one camera is used. Depending on the circumstances, a single camera is fed back to a central production facility, and sometimes multiple single cameras are fed back simultaneously. Quite often, however, multiple cameras are fed into a local vision mixer/switcher before being sent as a single stream back to the production facility; the latter is called a multi-camera feed. Crew communication at the location of the event or incident needs to be established as well. Furthermore, all devices such as cameras are operated by the production crew at the location of the incident. Newsgathering may take place outdoors or indoors.
One use case that often occurs in production is the transfer of file-based AV content or other assets to and from the broadcasting facility: for example, clips that are pre-produced at the event location and need to be made available in the broadcaster's playout system to illustrate a live contribution, or, if the mixing of the live signal is done locally at the event location, archive clips or video overlays that need to be available on location. The difference to the live feed transmission is that this material is sent between the two locations, usually as a file, but not necessarily in real time. This means that two-way file transfer capabilities need to be available to upload or download files on location. These files are usually extremely large (> 1 Gbit per file) and transfer speeds need to be capable of delivering them within a reasonable, though not necessarily real-time, timescale. Support for growing files is also useful, so that an editor on location can start work on a clip without waiting for the whole file to be delivered.
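A minimal sketch of the timescale involved: the calculation below estimates how long a file of around 1 Gbit takes to transfer at a few assumed uplink rates. The link rates and the efficiency factor are illustrative assumptions.

    # Illustrative sketch (not from the TR): transfer time for a large file-based
    # asset as a function of link rate. The 1 Gbit file size comes from the text;
    # the rates and efficiency are assumptions.

    def transfer_seconds(file_gbit: float, link_mbps: float, efficiency: float = 0.8) -> float:
        """Seconds to move `file_gbit` gigabits over a link with the given nominal
        rate, assuming a fraction `efficiency` of it is usable goodput."""
        return (file_gbit * 1000.0) / (link_mbps * efficiency)

    for mbps in (10, 50, 200):
        print(f"1 Gbit file over {mbps} Mbit/s: {transfer_seconds(1.0, mbps):.0f} s")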
Another important category of AV production is called "outside broadcast" (OB). In contrast to newsgathering, the date of an event is known, sometimes long before it actually takes place. Examples are elections or sports events such as football championships or the Olympic Games. Apart from that, OB productions are quite similar to newsgathering in terms of setting; however, the scale of the event is usually larger. That means more equipment, more people and very likely a longer period of time. Usually, a large number of wireless audio links (e.g. 100 or more) and several wireless video cameras (e.g. 20 or more) are employed in a single regular event. They have to be carefully synchronized in time at the moment of recording and capture, in particular in live production, and transmitted with the associated timestamp or delta to a master clock. Large-scale events could also utilize several hundred remote microphones and cameras, with these devices limited to just capture, processing, compression, optional encryption and transmission, incorporating mobile as well as stationary equipment.
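Purely as an assumption-laden example, the sketch below shows what transmitting capture times with an associated timestamp or delta to a master clock can look like on the receiving side: each frame carries a local capture time plus its device's measured clock offset, and feeds are merged on the common timebase. The data structure and field names are invented for illustration and are not defined in the TR.

    # Illustrative sketch (not from the TR): aligning feeds from many wireless
    # cameras/microphones on a shared master clock.
    from dataclasses import dataclass

    @dataclass
    class CapturedFrame:
        device_id: str
        local_capture_ns: int        # capture instant on the device's own clock
        local_minus_master_ns: int   # measured offset: device clock minus master clock
        payload: bytes

    def master_time_ns(frame: CapturedFrame) -> int:
        """Capture instant mapped onto the shared master timebase."""
        return frame.local_capture_ns - frame.local_minus_master_ns

    frames = [
        CapturedFrame("cam-01", 1_000_040_000, 25_000, b"..."),
        CapturedFrame("mic-17", 1_000_010_000, -3_000, b"..."),
    ]
    # Merge feeds from different devices in true capture order:
    for f in sorted(frames, key=master_time_ns):
        print(f.device_id, master_time_ns(f))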
The equipment, devices and communication infrastructure are carried to the location of the event in large vans. These OB vans act as a communication hub for the event. They are potentially capable of supporting many cameras, microphones, mixers, etc. On location, reliable and scalable wireless communication links between directors, technicians and other staff are needed, in particular audio links.
Satellite connections are typically established for OB productions to send audio and video content back to the base production facilities. More recently, there is an increasingly important trend to control equipment such as cameras remotely from the central home-base production facilities rather than on location. Remotely operated equipment requires reliable telemetry and control communications. The quality of audio and video in OB productions is high, calling for potent communication links in terms of data rates and data capacity.
Most OB productions take place at a fixed location. However, there are also events which are not stationary. This requires the production team to follow the event, carrying the whole equipment along the way. Coverage of cycling events is a typical example; there, the OB vans acting as a communication hub are often replaced by helicopters and planes. These kinds of events also come with the requirement to cope with very high velocities. In Formula 1 races, the cameras mounted on the vehicles need to be operated at speeds of up to 400 km/h.
Even though today audio and video material is sent back to the home-base production facility for post-processing in order to prepare the final TV or radio services, there is already a trend to carry out post-processing remotely. This requires the ability to access resources at the home-base production facility as well as to utilize cloud services, be it computational power or storage.
In addition to production outside the premises of production companies, studio-based production is of paramount importance. To date, most studios mainly use wired, purpose-built communication infrastructure. Depending on the circumstances this is costly and inflexible. Many production studios still utilize fixed-line connections between, for example, cameras, mixers and galleries; in order to become more flexible and agile, fully wireless workflows would be appreciated. Studio productions, however, are typically those where the highest quality and communication requirements are encountered. While under mobile or nomadic conditions concessions need to be made regarding the maximum available data rate, for studio productions it would be much appreciated if uncompressed or at least lossless data transmission could be utilized. Uncompressed TV signals can exceed 12 Gbit/s for a high-resolution, high-frame-rate video.
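To make the 12 Gbit/s figure concrete, the sketch below computes the raw bit rate of an uncompressed video signal from assumed UHD parameters (resolution, frame rate, bit depth, chroma sampling). The specific parameter set is an illustrative assumption, not a requirement from the TR.

    # Illustrative sketch (not from the TR): raw bit rate of an uncompressed
    # video signal, active picture only (no blanking or embedded audio).

    def raw_gbps(width: int, height: int, fps: float, bit_depth: int, samples_per_pixel: float) -> float:
        """Uncompressed video rate in Gbit/s."""
        return width * height * fps * bit_depth * samples_per_pixel / 1e9

    # Assumed example: UHD (3840x2160), 100 fps, 10-bit, 4:2:2 chroma
    # subsampling (2 samples per pixel on average):
    print(f"{raw_gbps(3840, 2160, 100, 10, 2):.1f} Gbit/s")   # ~16.6 Gbit/s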
Covering an event which takes place on a stage in a theatre or a concert hall lies somewhere between pure OB and studio production. Quite often there is infrastructure available at the location of the event which could be used by production companies under certain conditions. Then the question of seamless cooperation between different infrastructures arises, under the condition that the production requirements can still be met.
Capturing a stage event requires dealing with many wireless microphones, in-ear monitors, and a variety of service links. In a typical professional live-performance scenario, performers on stage use wireless microphones while hearing themselves via the wireless in-ear monitor system. The audio signals coming from the microphones are streamed to a mixing console, where the different incoming audio streams are mixed into several outgoing streams, such as the Public Address (PA), the in-ear monitoring mixes or recording mixes. Typical setups come with stringent requirements in terms of end-to-end latency, jitter, synchronicity, communication service availability, communication service reliability and number of wireless links per site. For complex stage productions the number of simultaneous links might be very high, i.e. more than 100 in the same location.
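As an illustration of how such latency requirements are reasoned about (the TR treats the actual budget in Annex A), the sketch below sums assumed per-stage delays for a microphone-to-in-ear-monitor chain and checks them against an assumed end-to-end target. All numbers are placeholders, not values from the TR.

    # Illustrative sketch (not from the TR): checking an end-to-end audio latency
    # budget for a live-performance chain (microphone -> mixing console -> IEM).
    budget_ms = {
        "ADC / capture":         0.5,
        "uplink (mic -> mix)":   1.0,
        "mixing / processing":   1.0,
        "downlink (mix -> IEM)": 1.0,
        "DAC / playout":         0.5,
    }

    total = sum(budget_ms.values())
    target_ms = 4.0   # assumed end-to-end target for illustration
    print(f"total {total:.1f} ms vs target {target_ms:.1f} ms ->",
          "OK" if total <= target_ms else "over budget")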
Traditionally, broadcast signals have been carried over dedicated networks and infrastructure. In recent years, broadcast centres have been moving to an increasingly IP-based workflow. This has several benefits but has meant significant work on the definition of IP streams that carry audio, video and data. The standards bodies that have defined these systems are actively looking at how these protocols may be carried by a wireless network. It is desirable that the 3GPP contribution be compatible with these best-practice architectures so as to make interfacing and adoption as simple as possible.

Full Table of Contents for TR 22.827 (Word version 17.0.0)

 

1  Scope
2  References
3  Definitions, symbols and abbreviations
4  Overview
5  Use cases
5.1  On-site Live Audio Presentation
5.2  Audio Streaming in Live Performances
5.3  Live production with integrated audience services
5.4  Intercom system for large live events
5.5  Single-Source uncompressed Outside Broadcast Contribution
5.6  Single-source compressed Outside Broadcast Contribution
5.7  Professional TV Production Contributions from an Off-Site, Remotely-Produced, Multi-Camera Outside Broadcast
5.8  Simple Live Sports Commentary
5.9  Video streaming of live events using an airborne relay
5.10  Live Immersive Media Service
5.11  Video Streaming in Professional Coverage of Live Performances
5.12  Authentication of devices on a shared non-public network
5.13  Onboarding of audio-visual IoT devices onto a non-public network
6  Security Aspects
7  Additional considerations
8  Consolidated potential requirements
A  Real-time audio-streaming latency budget
B  Overview of AV system structure using point to multipoint
C  Change history
