
Content for  TR 22.856  Word version:  19.2.0


5.7  Use Case on AR Enabled Immersive Experience

5.7.1  Description

In addition to watching movies at the cinema, people also watch movies on their mobile phones, laptops or TVs when they don't have time to go to the cinema, e.g. when travelling or at home. However, viewing on these terminals can cause neck or cervical-spine discomfort, because users keep their heads tilted down for long periods. Moreover, the screens are relatively small; for users who want realistic picture detail or an immersive 3D movie, these terminals are not adequate.
Figure 5.7.1-1: AR Enabled Immersive Experience (image source: www.indiegogo.com)
In this use case, users get an immersive, location agnostic service experience of watching movies in everyday circumstances, such as at home or on the train. They can even invite friends to watch a movie simultaneously from different places by wearing a wearable device, such as AR glasses. A large screen like that of a movie theatre is presented in the field of vision (FOV) of the wearable device. This not only provides an immersive viewing experience like a private cinema but also places very low demands on the environment and space of the user's location; 3D cinematic effects can also be easily rendered in the device.
To achieve an immersive, location agnostic service experience through AR glasses, the 5G system is required to provide reliable transmission of uplink and downlink data and a way for users to synchronize their experience and interact together. In mobility scenarios, e.g. when a user is travelling on a train, the continuity of data transmission also needs to be guaranteed. Moreover, when AR glasses are wireless, the power supply relies on the battery integrated in the AR glasses. This use case investigates how the 5G system (through direct device connection or NG-RAN) can support the UE in establishing a data connection with the mobile metaverse server, and how the 5G system can provide services to AR glasses so as to minimize energy consumption in the overall system.
The service dataflows and requirements may differ depending on whether the AR glasses are accessing the service through a direct device connection or NG-RAN.

5.7.2  Pre-conditions

Bob wants to watch an AR movie with friends to relax while travelling by train. He therefore wears wearable equipment, such as AR glasses, that can access the 5G network (through direct device connection or NG-RAN) in order to reach metaverse services.
Bob has subscribed to an immersive movie service as a mobile metaverse service that he can access via AR glasses. The service gives Bob access to a large movie catalog (2D/3D). The battery capacity of the AR glasses is enough to watch a two-hour movie. Before starting the movie, Bob can invite some friends to join him. If people join Bob, their avatars are also placed into his FOV and Bob can interact with them (speech or text).
Considering wearing comfort, mainstream AR glasses should not be too heavy (normally less than 150 grams). The maximum capacity of the battery (weighing about 50 grams) is about 1000 mAh. Usually, 25 % of the battery capacity of AR glasses is allocated to the mobile termination module.
When a user watches a 4K movie, some video compression techniques are usually used to reduce the amount of data transmitted while maintaining the image quality.
In general, a large compression ratio increases delay. Considering delay and energy consumption together, using AR glasses with a direct device connection would call for a low compression ratio, for instance 3:1 [50]. However, some advanced AR glasses SoCs embed hardware video decoders (e.g., AVC, HEVC and VVC) and can render the viewport efficiently.
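To illustrate the bit-rate consequence of such a low compression ratio, a back-of-the-envelope estimate can be made from resolution, frame rate and bit depth; the 4K/30 fps/24-bit figures below are illustrative assumptions, not values taken from this report:

```python
# Back-of-the-envelope link-rate estimate for lightly compressed viewport
# streaming over a direct device connection. All parameters are
# illustrative assumptions, not values defined in TR 22.856.

def required_bitrate_mbps(width, height, bits_per_pixel, fps, compression_ratio):
    """Raw video bitrate divided by the compression ratio, in Mbit/s."""
    raw_bps = width * height * bits_per_pixel * fps
    return raw_bps / compression_ratio / 1e6

# 4K frame, 24-bit colour, 30 fps, 3:1 compression as suggested above.
rate = required_bitrate_mbps(3840, 2160, 24, 30, 3)
print(f"{rate:.0f} Mbit/s")  # ~1991 Mbit/s, near the top of the [200-2000] Mbit/s KPI range
```

Under these assumptions the result lands near the upper bound of the direct-connection viewport-streaming KPI, which is why a low compression ratio and a high service bit rate go hand in hand here.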
A study on the energy consumption of hardware video decoders [59] shows that a typical HEVC hardware decoder embedded in an Android device consumes about 40-50 mAh per hour of video playback (decoding only). The use of hardware decoders therefore seems reasonable, given the 1000 mAh battery capacity assumed above. For the NG-RAN case, it is reasonable to assume that AR glasses can decode and render efficiently with low energy consumption.
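Combining the assumptions above (1000 mAh battery, 25 % allocated to the mobile termination module, 40-50 mAh per hour for hardware decoding), a rough two-hour battery budget can be sketched; how the remainder is spent (display, rendering, sensors) is left open, as the description does not fix it:

```python
# Battery budget sketch for the AR-glasses assumptions in this clause:
# 1000 mAh battery, 25 % reserved for the mobile termination (5G) module,
# HEVC hardware decoding at roughly 40-50 mAh per hour of playback [59].
BATTERY_MAH = 1000
MT_SHARE = 0.25            # share allocated to the mobile termination module
DECODE_MAH_PER_HOUR = 50   # upper end of the hardware-decoder figure
MOVIE_HOURS = 2

mt_budget = BATTERY_MAH * MT_SHARE                  # budget for 5G communication
decode_drain = DECODE_MAH_PER_HOUR * MOVIE_HOURS    # drain for decoding the movie
remaining = BATTERY_MAH - mt_budget - decode_drain  # left for display, rendering, etc.
print(mt_budget, decode_drain, remaining)  # 250.0 100 650.0
```

The 250 mAh communication budget over two hours is what motivates the energy-efficiency requirement in clause 5.7.6.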

5.7.3  Service Flows

  1. Data connections are set up between the AR glasses and the mobile metaverse server, which provides the immersive, location agnostic movie service. The 5G module can be connected to the 5G network either via direct device connection or via NG-RAN.
  2. The mobile metaverse server provides movie access to the client AR glasses through the downlink data stream.
  3. The mobile metaverse server manages communications between clients (friends), e.g., including video, avatar, speech, and text.
  4. The mobile metaverse server manages synchronization between the clients (e.g., the various AR glasses) of the friends.
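As an illustration of step 4, synchronization can be sketched as a server that owns a single authoritative playback clock against which every client aligns; this is a hypothetical sketch (class and method names are invented for illustration), not a mechanism specified in this report:

```python
import time

class MetaverseMovieSession:
    """Illustrative shared-playback session: the server owns one playback
    clock so all friends' AR glasses render the same movie position."""

    def __init__(self):
        self.start_wallclock = None  # server time at which playback started
        self.clients = {}            # client_id -> display name / avatar

    def join(self, client_id, name):
        # A joining friend's avatar becomes visible in the other clients' FOV.
        self.clients[client_id] = name

    def start(self, now=None):
        self.start_wallclock = now if now is not None else time.time()

    def playback_position(self, now=None):
        """Position (seconds) every client should be showing right now."""
        now = now if now is not None else time.time()
        return 0.0 if self.start_wallclock is None else now - self.start_wallclock

session = MetaverseMovieSession()
session.join("bob", "Bob")
session.join("alice", "Alice")
session.start(now=100.0)
print(session.playback_position(now=103.5))  # 3.5
```

A late-joining client simply queries the shared position instead of keeping its own clock, which is one simple way to meet the synchronization post-condition below.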

5.7.4  Post-conditions

Bob is able to watch an immersive movie with friends while travelling, obtaining a good user experience. The battery capacity of the AR glasses is enough to watch a two-hour movie.
Bob is able to communicate with his friends in a synchronized manner, and their avatars are visible in his FOV.
The 5G system is able to support communication for immersive location agnostic AR services, providing a reliable transmission, a continuity of service and a synchronized experience across users sharing a viewing experience.

5.7.5  Existing features partly or fully covering the use case functionality

The performance requirements for high data rate AR services have been captured in clause 7.6 of TS 22.261. The performance requirements for UE to network relaying in 5G systems have been captured in clause 7.7 of TS 22.261. The functional and performance requirements for tactile and multi-modal communication services have been captured in clauses 6.43 and 7.11 of TS 22.261, respectively.
However, the existing requirements do not yet address the power consumption of the 5G UE on board AR terminals.

5.7.6  Potential New Requirements needed to support the use case

[PR 5.7.6-1] Subject to operator policy, the 5G system shall support a means to provide high data rate transmission to a UE during an extended period of time, including when in high-speed mobility.
[PR 5.7.6-2] Subject to operator policy, the 5G system shall support a mechanism that enables flexible adjustment of communication services based on the type of device (e.g., wearables), such that the services can be operated with reduced energy utilization.
[PR 5.7.6-3] Subject to operator policy, the 5G system shall support a means to enable interactive immersive multiparty communications in the metaverse service.
| Characteristic parameter (KPI) | Max allowed end-to-end latency | Service bit rate: user-experienced data rate | Reliability | # of UEs | UE speed | Service area |
|---|---|---|---|---|---|---|
| Viewports streaming from rendering device to AR glasses through direct device connection (tethered/relaying case) (note 1) | 10 ms (i.e., UL+DL between AR glasses display and the rendering UE) (note 2) | [200-2000] Mbit/s | 99,9 % (note 2) | 1-2 | Stationary or pedestrian | - |
| Pose information from AR glasses to rendering device through direct device connection (tethered/relaying case) (note 1) | 5 ms (note 2) | [100-400] Kbit/s (note 2) | 99,99 % (note 2) | 1-2 | Stationary or pedestrian | - |

NOTE 1: These KPIs are only valid for cases where the viewport rendering is done in the tethered device and streamed down to the AR glasses. In the case of rendering-capable AR glasses, these KPIs are not valid.
NOTE 2: These values are aligned with the tactile and multi-modal communication KPI table in clause 7.11 of TS 22.261.
| Characteristic parameter (KPI) | Max allowed end-to-end latency | Service bit rate: user-experienced data rate | Reliability | # of UEs | UE speed | Service area |
|---|---|---|---|---|---|---|
| Movie streaming from metaverse server to the rendering device (note 2) | Only relevant for live streaming: [1-5] s in case of live streaming | [0,1-50] Mbit/s (i.e., covering a complete OTT ladder from low resolution to 3D-8K) (note 1) | 99,9 % | 1 to [10] | [up to 500 km/h] | - |
| Avatar information streaming between remote UEs (end to end) (note 3) | 20 ms (i.e., UL between UE and the interface to metaverse server + DL back to the other UE) | [0,1-30] Mbit/s | 99,9 % | 1 to [10] | [up to 500 km/h] | - |
| Interactive data exchange: voice and text between remote UEs (end to end) (note 4) | 20 ms (i.e., UL between UE and the interface to metaverse server + DL back to the other UE) | [0,1-0,5] Mbit/s | 99,9 % | 1 to [10] | [up to 500 km/h] | - |

NOTE 1: These values are aligned with the "high-speed train" DL KPI from clause 7.1 of TS 22.261.
NOTE 2: To leverage existing streaming assets and the delivery ecosystem, it is assumed that legacy streaming data are delivered to the rendering device, which embeds them in the virtual screen prior to rendering. For a live streaming event, the user-experienced end-to-end latency is expected to be competitive with traditional live TV services, typically [1-5] seconds.
NOTE 3: For example, the glTF format [60] can be used to deliver avatar representation and animation metadata in a standardized manner. Based on this format, the required bit rate for transmitting such data depends strongly on the avatar's complexity (e.g., basic model versus photorealistic).
NOTE 4: These values are aligned with the "immersive multi-modal VR" KPIs in clause 7.11 of TS 22.261. End-to-end latency in this table is calculated as twice the DL "immersive multi-modal VR" latency in clause 7.11 of TS 22.261.
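To make the avatar-streaming note concrete, the bit rate of a skeletal-animation stream can be estimated from joint count, per-joint pose size and update rate; all figures below are illustrative assumptions about a glTF-style avatar, not values from this report:

```python
# Rough bitrate estimate for streaming skeletal-avatar animation updates.
# The joint count, pose encoding and update rate are illustrative
# assumptions; actual glTF avatar bitrates vary widely with model
# complexity, as noted in this clause.

def avatar_bitrate_mbps(joints, floats_per_joint, bits_per_float, updates_per_s):
    """Uncompressed animation-update bitrate in Mbit/s."""
    return joints * floats_per_joint * bits_per_float * updates_per_s / 1e6

# 60 joints, rotation quaternion (4 floats) + 3D translation (3 floats),
# 32-bit floats, 30 updates per second.
rate = avatar_bitrate_mbps(60, 7, 32, 30)
print(f"{rate:.2f} Mbit/s")  # ~0.40 Mbit/s
```

A basic pose-update stream of this kind sits at the low end of the [0,1-30] Mbit/s range; photorealistic avatars with blend shapes or textured geometry updates would occupy the upper end.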