
Content for  TR 22.856  Word version:  19.2.0


7.2  Consolidated potential KPIs

The 5G system shall support various mobile metaverse services with the following KPIs.
For each use case, the characteristic parameters (KPI) are the maximum allowed end-to-end latency, the service bit rate (user-experienced data rate), the reliability, and the area traffic capacity; the influence quantities are the message size (byte), the transfer interval, the position accuracy, the UE speed, and the service area.

Use case: 5G-enabled Traffic Flow Simulation and Situational Awareness (NOTE 2)
- Max allowed end-to-end latency: [5-20] ms (NOTE 1)
- Service bit rate (user-experienced data rate): [10~100] Mbit/s [25] (NOTE 6)
- Reliability: > 99.9 %
- Area traffic capacity: [~39.6] Tbit/s/km2 (NOTE 5)
- Message size (byte): -
- Transfer interval: 20~100 ms (NOTE 3)
- Position accuracy: -
- UE speed: < 250 km/h
- Service area: city- or country-wide (NOTE 4)
- Remarks: UL

Use case: Collaborative and concurrent engineering

- Max allowed end-to-end latency: [≤10] ms [14] (NOTE 7)
- Service bit rate: [1-100] Mbit/s [14]
- Reliability: [> 99.9 %] [14]
- Area traffic capacity: [1.55] Tbit/s/km2 (NOTE 8)
- Message size (byte): video: 1500; audio: 100 [14]
- Transfer interval: -
- Position accuracy: -
- UE speed: stationary or pedestrian
- Service area: typically < 100 km2 (NOTE 9)
- Remarks: UL and DL audio/video

- Max allowed end-to-end latency: [5] ms UL, [1-50] ms DL [14] (NOTE 7)
- Service bit rate: [< 1] Mbit/s [14]
- Reliability: [> 99.9 %] (without compression); [> 99.999 %] (with compression, NOTE 10) [26]
- Area traffic capacity: [2.25] Tbit/s/km2 (NOTE 8)
- Message size (byte): 1 DoF: 2-8; 3 DoFs: 6-24; 6 DoFs: 12-48 [14]
- Transfer interval: 0.25-10 ms [14]
- Remarks: UL and DL haptic feedback

Use case: Metaverse-based Tele-Operated Driving (NOTE 16)

- Max allowed end-to-end latency: [100] ms [25] (NOTE 11)
- Service bit rate: [10~50] Mbit/s [25]
- Reliability: 99 % [25]
- Area traffic capacity: [~360] Mbit/s/km2 (NOTE 14)
- Message size (byte): -
- Transfer interval: 20~100 ms [25] (NOTE 12)
- Position accuracy: [10] cm [25]
- UE speed: [10-50] km/h (vehicle) [25]; stationary/pedestrian (user)
- Service area: up to 10 km radius [25] (NOTE 13)
- Remarks: UL real-time vehicle data (video streaming and/or sensor data) [25]

- Max allowed end-to-end latency: [20] ms [25]
- Service bit rate: [0.1~0.4] Mbit/s [25]
- Reliability: 99.999 % [25]
- Area traffic capacity: [~4] Mbit/s/km2 (NOTE 14)
- Message size (byte): up to 8 Kbyte [25]
- Transfer interval: 20 ms [25] (NOTE 12)
- Position accuracy: [10] cm [25]
- UE speed: [10-50] km/h (vehicle) [25]; stationary/pedestrian (user)
- Service area: up to 10 km radius [25] (NOTE 13)
- Remarks: DL control traffic (commands from the remote driver) [25]

- Max allowed end-to-end latency: 1-20 ms (NOTE 15)
- Service bit rate: 16 kbit/s - 2 Mbit/s (without haptic compression encoding); 0.8-200 kbit/s (with haptic compression encoding) (NOTE 15)
- Reliability: 99.999 % (NOTE 15)
- Area traffic capacity: [~20] Mbit/s/km2 (NOTE 14)
- Message size (byte): 2-8 (1 DoF) (NOTE 15)
- UE speed: stationary/pedestrian (user)
- Service area: up to 10 km radius [25] (NOTE 13)
- Remarks: haptic feedback

Use case: Viewport streaming from the rendering device to AR glasses over a direct device connection (tethered/relaying case) (NOTE 17)
- Max allowed end-to-end latency: 10 ms (i.e. UL + DL between the AR glasses display and the rendering UE) (NOTE 18)
- Service bit rate: [200-2000] Mbit/s
- Reliability: 99.9 % (NOTE 18)
- Area traffic capacity: -
- Message size (byte): -
- Transfer interval: -
- Position accuracy: -
- UE speed: stationary or pedestrian (between rendering device and AR glasses)
- Service area: up to direct device connection range
- Remarks: immersive AR interactive experience: tethered link

Use case: Pose information from AR glasses to the rendering device over a direct device connection (tethered/relaying case) (NOTE 17)
- Max allowed end-to-end latency: 5 ms (NOTE 18)
- Service bit rate: [100-400] kbit/s (NOTE 18)
- Reliability: 99.9 % (NOTE 18)
- Area traffic capacity: -
- Message size (byte): -
- Transfer interval: -
- Position accuracy: -
- UE speed: stationary or pedestrian (between rendering device and AR glasses)
- Service area: up to direct device connection range

Use case: Movie streaming from the metaverse server to the rendering device (NOTE 20)
- Max allowed end-to-end latency: [1-5] s (only relevant for live streaming)
- Service bit rate: [0.1-50] Mbit/s (i.e. covering a complete OTT ladder from low resolution to 3D-8K) (NOTE 19)
- Reliability: 99.9 %
- Area traffic capacity: -
- Message size (byte): -
- Transfer interval: -
- Position accuracy: -
- UE speed: [up to 500 km/h]
- Service area: -
- Remarks: immersive AR interactive experience: NG-RAN multimodal communication link

Use case: Avatar information streaming between remote UEs (end to end)
- Max allowed end-to-end latency: 10 ms (i.e. 20 ms between both UEs, excluding metaverse server processing time) (NOTE 22)
- Service bit rate: [0.1-30] Mbit/s (NOTE 21)
- Reliability: 99.9 %
- Area traffic capacity: -
- Message size (byte): -
- Transfer interval: -
- Position accuracy: -
- UE speed: [up to 500 km/h]
- Service area: -

Use case: Interactive data exchange: voice and text between remote UEs (end to end) (NOTE 22)
- Max allowed end-to-end latency: 10 ms (i.e. 20 ms between both UEs, excluding metaverse server processing time)
- Service bit rate: [0.1-0.5] Mbit/s
- Reliability: 99.9 %
- Area traffic capacity: -
- Message size (byte): -
- Transfer interval: -
- Position accuracy: -
- UE speed: [up to 500 km/h]
- Service area: -
NOTE 1:
The mobile metaverse server receives the data from the various sensors, performs data processing and rendering, and provides feedback to the vehicles and users.
NOTE 2:
Examples of typical data volumes include: 1) camera: 10 Mbit/s per sensor (unstructured); 2) LiDAR: 90 Mbit/s per sensor (unstructured); 3) radar: 10 Mbit/s per sensor (unstructured); and 4) real-time status information, including telemetry data: [< 50 kbit/s] per sensor/vehicle/VRU (structured). To support at least 80 vehicles and 1600 users present at the same location (e.g. in an area of 40 m x 250 m) actively enjoying immersive metaverse services for traffic simulation and traffic awareness, the area traffic capacity is calculated considering 2 cameras, 2 radars and 2 LiDARs on the roadside, 1600 users' smartphones, and 80 vehicles with 7 cameras, 4 radars and 2 LiDARs each.
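The aggregation behind the NOTE 2 capacity calculation can be sketched as follows. This is an illustration only: the per-user smartphone rate is not itemized in the note, so it is an assumed parameter here (taken from the [10~100] Mbit/s user-experienced range), and the sketch shows the method rather than reproducing the [~39.6] Tbit/s/km2 figure exactly.

```python
# Per-sensor uplink rates itemized in NOTE 2.
CAMERA_MBPS = 10        # per camera sensor (unstructured)
LIDAR_MBPS = 90         # per LiDAR sensor (unstructured)
RADAR_MBPS = 10         # per radar sensor (unstructured)
TELEMETRY_MBPS = 0.05   # real-time status/telemetry, [< 50 kbit/s] each

def area_traffic_tbps_per_km2(user_mbps=100):
    # user_mbps is an ASSUMED per-smartphone rate, not stated in the note.
    roadside = 2 * CAMERA_MBPS + 2 * RADAR_MBPS + 2 * LIDAR_MBPS
    per_vehicle = (7 * CAMERA_MBPS + 4 * RADAR_MBPS + 2 * LIDAR_MBPS
                   + TELEMETRY_MBPS)
    users = 1600 * (user_mbps + TELEMETRY_MBPS)
    total_mbps = roadside + 80 * per_vehicle + users
    area_km2 = (40 / 1000) * (250 / 1000)   # 40 m x 250 m = 0.01 km2
    return total_mbps / area_km2 / 1e6      # Mbit/s/km2 -> Tbit/s/km2
```

Even this rough aggregation lands in the tens of Tbit/s/km2, dominated by the 1600 concurrent smartphone users in the 0.01 km2 area, which is the point of the figure.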
NOTE 3:
The transfer interval reflects the update rates of different sensor types, such as radar/LiDAR (10 Hz) and camera (10~50 Hz).
NOTE 4:
The service area for traffic flow simulation and situational awareness depends on the actual deployment, for example, it can be deployed for a city or a district within a city or even countrywide. In some cases a local approach (e.g. the application servers are hosted at the network edge) is preferred in order to satisfy the requirements of low latency and high reliability.
NOTE 5:
The calculation in this table is done for one 5G network; if N 5G networks are involved in this use case in the same area, this value can be divided by N.
NOTE 6:
The user-experienced data rate refers to the data rate needed for the vehicle or human; the value is observed from industrial practice.
NOTE 7:
A network-based conference focus is assumed, which receives data from all participants, performs rendering (image synthesis), and then distributes the results to all participants. As rendering and hardware introduce some delay, the communication delay for haptic feedback is typically less than 5 ms.
NOTE 8:
To support at least 15 users present at the same location (e.g. in an area of 20 m x 20 m) actively enjoying immersive metaverse services concurrently, the area traffic capacity is calculated considering each user consuming non-haptic XR media (e.g. up to 40000 kbit/s per video stream) and, concurrently, 60 haptic sensors (each generating up to 1024 kbit/s).
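Under the stated assumptions, the two area-traffic-capacity figures for this use case can be approximated directly. A sketch: the results land close to, but not exactly on, the bracketed [1.55] and [2.25] Tbit/s/km2 values, which suggests the TR includes some additional overhead or rounding not itemized in the note.

```python
AREA_KM2 = (20 / 1000) ** 2    # 20 m x 20 m = 0.0004 km2
USERS = 15                     # concurrent users in the area (NOTE 8)

def capacity_tbps_per_km2(kbps_per_user):
    total_kbps = USERS * kbps_per_user
    return total_kbps * 1e3 / AREA_KM2 / 1e12   # kbit/s -> Tbit/s/km2

video_capacity = capacity_tbps_per_km2(40000)      # non-haptic XR media
haptic_capacity = capacity_tbps_per_km2(60 * 1024) # 60 sensors x 1024 kbit/s
```

This yields about 1.5 Tbit/s/km2 for the audio/video row and about 2.3 Tbit/s/km2 for the haptic row.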
NOTE 9:
In practice, the service area depends on the actual deployment. In some cases a local approach (e.g. the application servers are hosted at the network edge) is preferred in order to satisfy the requirements of low latency and high reliability.
NOTE 10:
The arrival interval of compressed haptic data usually follows a statistical distribution, such as the generalized Pareto or exponential distribution [26].
NOTE 11:
The end-to-end latency does not include sensor acquisition or actuator control on the vehicle side, nor processing and rendering on the user side (an estimated additional 100 ms in total). The target maximum end-to-end delay experienced by the user depends on the reaction time of the remote driver (e.g. at 50 km/h, 20 ms corresponds to 27 cm of remote-vehicle movement).
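The 27 cm figure in NOTE 11 follows from simple kinematics; a minimal sketch:

```python
def movement_cm(speed_kmh, latency_ms):
    """Distance the vehicle travels during one latency interval, in cm."""
    speed_m_per_s = speed_kmh / 3.6          # km/h -> m/s
    return speed_m_per_s * (latency_ms / 1000) * 100

# At 50 km/h, a 20 ms delay corresponds to roughly 27-28 cm of movement,
# matching the NOTE 11 example.
movement_cm(50, 20)
```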
NOTE 12:
UL data transfer interval around 20ms (video) to 100ms (sensor), DL data transfer interval (commands) around 20ms.
NOTE 13:
The service area for teleoperation depends on the actual deployment; for example, it can be deployed for a warehouse, a factory, a transportation hub (seaport, airport etc.), or even a city district or city. In some cases, a local approach (e.g., the application servers are hosted at the network edge) is preferred to satisfy low latency and high-reliability requirements.
NOTE 14:
The area traffic capacity is calculated for one 5G network, considering 4 cameras plus sensors on each vehicle. The density is estimated at 10 vehicles/km2, each vehicle with one user controlling it. [25]
NOTE 15:
These KPIs come from the "remote control robot" use case in clause 7.11 of [5].
NOTE 16:
Examples of typical data volumes include: 1) a ~8 Mbit/s video stream per camera, with four cameras per vehicle (one for each side): 4 × 8 = 32 Mbit/s; 2) sensor data (interpreted objects), assuming 1 kB per object per 100 ms and 50 objects: 4 Mbit/s [25].
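The [~360] Mbit/s/km2 uplink figure in NOTE 14 is consistent with the per-vehicle traffic itemized in NOTE 16; a sketch of the arithmetic:

```python
def per_vehicle_ul_mbps():
    video_mbps = 4 * 8                    # 4 cameras x ~8 Mbit/s each
    # 50 objects x 1 kB per 100 ms = 500 kB/s = 4 Mbit/s
    sensor_mbps = 50 * 8 * 10 / 1000      # kbit per 100 ms x 10 -> Mbit/s
    return video_mbps + sensor_mbps       # 36 Mbit/s per vehicle

def area_ul_mbps_per_km2(vehicles_per_km2=10):
    # Density of 10 vehicles/km2 from NOTE 14.
    return vehicles_per_km2 * per_vehicle_ul_mbps()
```

36 Mbit/s per vehicle times 10 vehicles/km2 gives 360 Mbit/s/km2, matching the table entry; the DL ([~4] Mbit/s/km2) and haptic ([~20] Mbit/s/km2) rows follow the same pattern from their per-vehicle rates.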
NOTE 17:
These KPIs are only valid for cases where the viewport rendering is done in the tethered device and streamed down to the AR glasses. In the case of rendering capable AR glasses, these KPIs are not valid.
NOTE 18:
These values are aligned with the tactile and multi-modal communication KPI table in clause 7.11 of TS 22.261.
NOTE 19:
These values are aligned with "high-speed train" DL KPI from clause 7.1 of TS 22.261.
NOTE 20:
To leverage existing streaming assets and the delivery ecosystem, it is assumed that the legacy streaming data are delivered to the rendering device, which embeds them in the virtual screen prior to rendering. For a live streaming event, the user-experienced end-to-end latency is expected to be competitive with traditional live TV services, typically [1-5] seconds.
NOTE 21:
For example, the glTF format [60] can be used to deliver avatar representation and animation metadata in a standardized manner. Based on this format, the bitrate required for transmitting such data depends heavily on the avatar's complexity (e.g. a basic model versus a photorealistic one).
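To get a rough feel for why avatar-animation bitrates span such a wide range, one can estimate the uncompressed cost of streaming per-joint rotation data. All parameters below are illustrative assumptions, not values from the TR or the glTF specification:

```python
def animation_bitrate_mbps(joints, bytes_per_joint=16, update_hz=60):
    # Assumed encoding: one rotation quaternion per joint as
    # 4 x 32-bit floats (16 bytes), uncompressed, at update_hz updates/s.
    return joints * bytes_per_joint * 8 * update_hz / 1e6

basic = animation_bitrate_mbps(25)       # simple rig:   ~0.19 Mbit/s
detailed = animation_bitrate_mbps(400)   # dense face/hand rig: ~3.1 Mbit/s
```

Both estimates fall within the [0.1-30] Mbit/s range given in the table; richer representations (blend shapes, textures, photorealistic geometry updates) push toward the upper end.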
NOTE 22:
These values are aligned with "immersive multi-modal VR" KPIs in clause 7.11 of TS 22.261.
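Where the table gives both a message size and a transfer interval, the two influence quantities jointly imply a bit rate (size in bytes × 8, divided by the interval). A minimal sketch, using a 1 ms interval as an assumed value within the 0.25-10 ms haptic range:

```python
def implied_bitrate_kbps(message_bytes, interval_ms):
    # bits per millisecond is numerically equal to kbit/s.
    return message_bytes * 8 / interval_ms

# A 1-DoF haptic sample of 2 bytes every 1 ms implies 16 kbit/s,
# the lower end of the haptic bit-rate range quoted from [5].
implied_bitrate_kbps(2, 1)
```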