Content for  TR 22.856  Word version:  19.2.0



5.20  Use Case on Immersive Tele-Operated Driving in Hazardous Environment

5.20.1  Description

Operating vehicles, lifting devices, or machines in an industrial environment is hazardous when performed manually and on site by a human. Depending on the environment, operators are exposed to dangerous materials, toxic fumes, extreme temperatures, landslide risks, radioactivity, etc.
Automated guided vehicles (AGVs) already exist; in addition, human operators are expected to be able to take remote control and operate such moving vehicles from a distance.
In this use case, it is proposed to leverage 5G to provide an end-to-end system in which a remote user controls a moving device (vehicle, lifting device, robot, etc.) through an immersive cockpit displayed on a virtual reality head-mounted display, with haptic gloves for control. Furthermore, the cockpit is complemented with information from the digital twin of the place where the user operates (e.g., sensors in a factory, the type of material around, other moving vehicles or persons).
The use case improves user safety and makes operations more accurate by merging in additional information from a digital twin.

5.20.2  Pre-conditions

Bob works in a seaport, where he operates a lifting device. The area in which he operates is surrounded by cranes, machines, containers, pipes, and barrels containing hazardous substances.
A new mobile metaverse service is available: instead of controlling the device locally, Bob works from a safe remote location. The surrounding information is available through a digital twin of the seaport and can come from various sources (IoT sensors, CCTV cameras, connected machines, and other vehicles).
To maximize Bob's efficiency, the metaverse service experience delivered by the system is real time, with no noticeable latency. This use case includes both location-related and location-agnostic service experience examples.
The mobile metaverse service Bob uses for teleoperation runs on a mobile metaverse server. In addition, Bob is equipped with a head-mounted display and haptic gloves to remotely control the vehicle.

5.20.3  Service Flows

  1. This morning, Bob stayed home, as his boss informed him of a potential hazard at the site that was identified by a sensor on a pipe. Unfortunately, the exact nature and location of the hazard on the pipe are not known, so Bob decides to remotely inspect the site before his boss and the local public authorities arrive to check.
  2. He puts on his head-mounted display, on which a cockpit environment is displayed from the mobile metaverse server: a virtual control panel appears in front of him, and he can see his hands and the control panel in the cockpit. Bob's application is connected to the mobile metaverse server, which enables him to use the service.
  3. Bob can tell the mobile metaverse server which surrounding information from the digital twin he wants to monitor. He decides to focus on the 3D representation of the pipe and to get real-time sensor information from it, as well as live data from the ambient temperature and gas sensors. The mobile metaverse media additionally displays predicted data: the temperature is rising, the gas concentration is increasing, and there is a high risk of explosion in less than 10 minutes if this continues. This surrounding information is integrated with other display elements in the cockpit, but he can anchor it in his field of view (FOV).
  4. While driving along the seaport by remotely controlling the lifting device via its digital twin in the metaverse server, Bob can also see the (hidden) content of other pipes.

5.20.4  Post-conditions

Thanks to the 5G mobile metaverse "Tele-operated Driving" service, Bob has been able to drive the vehicle remotely in a responsive way, avoiding dangers and locating the leak with the help of the information provided via the digital twins.

5.20.5  Existing features partly or fully covering use case functionality

The use case on traffic flow simulation in clause 5.2 already provides requirements and KPIs related to the operation of a moving UE, similar to an AGV. However, that use case does not envision remote control, e.g., using haptic devices and an HMD, which triggers new requirements.
The use case on critical healthcare services in clause 5.10 captures the usage of HMDs and haptic devices with related requirements and KPIs, which can be generalized to industrial operations. However, that use case does not consider time-critical decisions based on surrounding moving objects in an open area, nor does it rely on real-time digital twin updates to track the characteristics of the environment (e.g., information about pipe content).

5.20.6  Potential New Requirements needed to support the use case

[PR 5.20.6-1]
The 5G system shall be able to provide a means to associate data flows related to one or multiple UEs with a single digital twin maintained by the mobile metaverse service.
[PR 5.20.6-2]
The 5G system shall be able to provide a means to support data flows from one or multiple UEs to update a digital twin maintained by the mobile metaverse service.
[PR 5.20.6-3]
Subject to regulatory requirements and operator policy, the 5G system shall be able to support data flows directed towards one or multiple UEs as a result of a change in a digital twin maintained by the mobile metaverse service, so that physical objects can be affected via actuators.
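The association and update behaviour described in [PR 5.20.6-1] to [PR 5.20.6-3] can be illustrated with a minimal server-side sketch. The registry class, identifiers, and permission check below are hypothetical, introduced only to show the flow-to-twin association; they do not correspond to any defined 5G system interface:

```python
from collections import defaultdict

class DigitalTwinRegistry:
    """Hypothetical registry associating UE data flows with a digital twin
    maintained by a mobile metaverse server."""
    def __init__(self):
        self._twin_flows = defaultdict(set)   # twin_id -> set of (ue_id, flow_id)
        self._twin_state = defaultdict(dict)  # twin_id -> latest reported attributes

    def associate_flow(self, twin_id, ue_id, flow_id):
        """[PR 5.20.6-1]: bind a UE data flow to a single digital twin."""
        self._twin_flows[twin_id].add((ue_id, flow_id))

    def update_from_flow(self, twin_id, ue_id, flow_id, attributes):
        """[PR 5.20.6-2]: accept updates only from associated flows."""
        if (ue_id, flow_id) not in self._twin_flows[twin_id]:
            raise PermissionError("flow is not associated with this twin")
        self._twin_state[twin_id].update(attributes)

    def state(self, twin_id):
        """Current twin state; changes here would drive downlink actuator
        traffic towards UEs per [PR 5.20.6-3]."""
        return dict(self._twin_state[twin_id])

registry = DigitalTwinRegistry()
registry.associate_flow("pipe-42", ue_id="sensor-ue-1", flow_id=1)
registry.update_from_flow("pipe-42", "sensor-ue-1", 1, {"temperature_c": 83.5})
print(registry.state("pipe-42"))  # {'temperature_c': 83.5}
```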
[PR 5.20.6-4]
The 5G system shall be able to support the following KPIs for remotely controlling physical objects via the mobile metaverse service.
Use case: Metaverse-based Tele-Operated Driving (NOTE 5)

Real-time vehicle data (UL: video streaming and/or sensor data):
- Max allowed end-to-end latency: [100] ms [25] (NOTE 1)
- Service bit rate (user-experienced data rate): [10~50 Mbit/s] [25]
- Reliability: 99%
- Area traffic capacity: [~360 Mbit/s/km2] (NOTE 4)
- Message data volume: ~8 Mbit/s per video stream; four cameras per vehicle (one for each side): 4 × 8 = 32 Mbit/s; sensor data (interpreted objects), assuming 1 kB/object/100 ms and 50 objects: 4 Mbit/s [25]
- Transfer interval: 20~100 ms [25] (NOTE 2)
- Position accuracy: [10] cm [25]
- UE speed: [10-50] km/h (vehicle) [25]; stationary/pedestrian (user)
- Service area: up to 10 km radius [25] (NOTE 3)

Control traffic (DL: commands from the remote driver):
- Max allowed end-to-end latency: [20] ms [25]
- Service bit rate (user-experienced data rate): [0.1~0.4 Mbit/s] [25]
- Reliability: 99.999% [25]
- Area traffic capacity: [~4 Mbit/s/km2] (NOTE 4)
- Message data volume: up to 8 Kb per message [25]
- Transfer interval: 20 ms [25] (NOTE 2)
- Position accuracy: [10] cm [25]
- UE speed: [10-50] km/h (vehicle) [25]; stationary/pedestrian (user)
- Service area: up to 10 km radius [25] (NOTE 3)

Haptic feedback:
- Max allowed end-to-end latency: 1-20 ms (NOTE 6)
- Service bit rate: 16 kbit/s - 2 Mbit/s (without haptic compression encoding); 0.8 - 200 kbit/s (with haptic compression encoding) (NOTE 6)
- Reliability: (NOTE 6)
- Area traffic capacity: [~20 Mbit/s/km2] (NOTE 4)
- Message data volume: 2-8 (1 DoF) (NOTE 6)
- UE speed: stationary/pedestrian (user)
- Service area: up to 10 km radius [25] (NOTE 3)

NOTE 1: The end-to-end latency refers to the transmission delay between a UE and the mobile metaverse server or vice versa, not including sensor acquisition or actuator control on the vehicle side, nor processing and rendering on the user side (estimated at an additional 100 ms in total). The target maximum end-to-end user-experienced delay depends on the reaction time of the remote driver (e.g., at 50 km/h, 20 ms corresponds to 27 cm of remote vehicle movement).
NOTE 2: UL data transfer interval around 20 ms (video) to 100 ms (sensor); DL data transfer interval (commands) around 20 ms.
NOTE 3: The service area for teleoperation depends on the actual deployment; for example, it can cover a warehouse, a factory, a transportation hub (seaport, airport, etc.), or even a city district or city. In some cases, a local approach (e.g., application servers hosted at the network edge) is preferred to satisfy low-latency and high-reliability requirements.
NOTE 4: The area traffic capacity is calculated for one 5G network, considering 4 cameras + sensors on each vehicle. Density is estimated at 10 vehicles/km2, each vehicle with one user controlling it. [25]
NOTE 5: Based on [25]. UL is real-time vehicle data (video streaming and/or sensor data); DL is control traffic (commands from the remote driver).
NOTE 6: KPI from the "remote control robot" use case in clause 7.11 of TS 22.261.
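The derived figures in the KPI table and notes can be cross-checked with simple arithmetic. All input values below come from the table itself; nothing new is assumed:

```python
# Back-of-the-envelope check of the KPI table's derived figures.
video_per_camera_mbps = 8    # ~8 Mbit/s per video stream
cameras = 4                  # one camera per vehicle side
sensor_mbps = 4              # 1 kB/object per 100 ms, 50 objects -> 500 kB/s = 4 Mbit/s
vehicles_per_km2 = 10        # density assumed in NOTE 4

ul_per_vehicle = video_per_camera_mbps * cameras + sensor_mbps   # 32 + 4 = 36 Mbit/s
area_capacity = ul_per_vehicle * vehicles_per_km2                # ~360 Mbit/s/km2 (NOTE 4)

# NOTE 1: distance the vehicle travels during the control-loop latency.
speed_kmh = 50
latency_s = 0.020
distance_cm = speed_kmh / 3.6 * latency_s * 100                  # ~27.8 cm, i.e. ~27 cm

print(ul_per_vehicle, area_capacity, round(distance_cm, 1))  # 36 360 27.8
```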
