Mobile robots play an increasingly important role in scenarios such as warehouses, disaster rescue and smart factories
, thanks to their high mobility. Mobile robots need to work in an ever-changing environment and hence need to perform fast and reliable sensing, planning and control. If the corresponding computation is performed on board the robot, the intensive workload increases the requirements on computation capability and power consumption. However, a light-weight form factor is always required for mobile robots working in real-world environments, which prevents the robots from being equipped with a large number of CPU/GPU units and large-capacity batteries. As an example provided in
, an advanced commercially available quadruped robot carries 3 kg of batteries with about 650 Wh of energy, while a high-end GPU consumes more than 250 W of power; embedding such computational power on the robot would significantly reduce battery life.
Offloading computations from robots to the cloud has been studied in many references
. Meanwhile, when relying on data or code delivered over a network to support the robot's operation, the designers of autonomous mobile robots have to consider scenarios where the robot must retain local processing capability for low-latency responses during periods when network access quality deteriorates.
The resulting system is different from the fully remote-controlled robot system described in
, in which planning and control are carried out by cloud computing, and the robot only reports the sensing data (incl. video) and receives the control commands. Since complete cloud computing can hardly meet the latency requirement of the ms-level feedback control loop of some types of mobile robots, e.g. legged robots, split control of mobile robots is a suitable solution in this case.
introduces a robot whole-body balance control split over a 5G network. The AI inference for control can be split between the robot and the cloud server: as shown in Figure 5.4.1-1
, the part which is complex but less susceptible to delays is offloaded to remote computation in the cloud or edge control server. The low-complexity part, which contains the error feedback terms and is latency-critical, can be efficiently performed by local computation on the robot. If the robot fails to receive the optimal control of the "remote control part"
from the cloud/edge control server due to communication delays or packet loss, it can approximate the "remote control part"
using pre-computed feedback matrices received previously. For a certain duration, this approximation still enables the robot to perform approximate feedback control of its tasks and ensures that the robot can continue to operate.
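The fallback behaviour described above can be sketched as follows. This is a minimal illustrative sketch, not the controller from the referenced work; the names (SplitController, K_fb, x_ref, u_ff) and the linear state-feedback form are assumptions.

```python
# Minimal illustrative sketch of the split-control fallback described above.
# The names (SplitController, K_fb, x_ref, u_ff) and the linear feedback
# form are assumptions, not taken from the cited work.

class SplitController:
    def __init__(self, K_fb, x_ref, u_ff):
        self.K_fb = K_fb    # pre-computed feedback matrix (list of rows)
        self.x_ref = x_ref  # reference state for the current task phase
        self.u_ff = u_ff    # last "remote control part" received

    def update_from_server(self, K_fb, x_ref, u_ff):
        """Called whenever a fresh burst arrives from the cloud/edge server."""
        self.K_fb, self.x_ref, self.u_ff = K_fb, x_ref, u_ff

    def control(self, x, u_remote=None):
        """Latency-critical local loop, executed every control cycle (1 ms)."""
        # Use the fresh remote command if it arrived in time; otherwise
        # approximate it with the cached feed-forward term.
        u_ff = u_remote if u_remote is not None else self.u_ff
        # The error-feedback term is always computed locally on the robot.
        err = [r - xi for r, xi in zip(self.x_ref, x)]
        fb = [sum(k * e for k, e in zip(row, err)) for row in self.K_fb]
        return [uf + f for uf, f in zip(u_ff, fb)]
```

When the remote command is delayed or lost, `control(x)` degrades gracefully to the cached feed-forward term plus the local error feedback, matching the split-control behaviour described above.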
The results in
show that, in the case where the robot is completely controlled by a cloud server, the robot cannot finish the walking task if the round-trip latency is larger than 3 ms (from sending the sensing data to receiving the control commands, including processing at the cloud/edge). Due to delayed control commands, the robot would fall down (as shown in Figure 5.4.1-2
(a)). However, if split control is employed, a worst-case 25 ms round-trip latency can be sustained, and the robot can still perform the walking task (as shown in Figure 5.4.1-2 (b)).
The involved AI/ML endpoints (e.g. the UE (robot) and the AI/ML cloud/edge server) run applications providing the capability of AI/ML model inference for the robot control task, and support the split robot control operation.
The 5G system has the ability to provide 5G network-related information to the AI/ML server.
The robot receives control commands from both the local and the remote computation with the required accuracy and latency, so as to complete the moving tasks, e.g. the balance task and the walking task.
The robot control task can be completed within the available computation and energy resources of the robot, and the computation, communication and energy resources consumed across the AI/ML endpoints are optimized.
This use case mainly requires a high data rate together with low latency. The high data rate requirements on the 5G system are listed in Clause 7.1
of TS 22.261
. As in Table 7.1-1
of TS 22.261
. 300 Mbps DL experienced data rate and 50 Mbps UL experienced data rate are required in the dense urban scenario, and 1 Gbps DL experienced data rate and 500 Mbps UL experienced data rate are required in the indoor hotspot scenario. As in Table 7.6.1-1
of TS 22.261
. cloud/edge/split rendering-related data transmission requires up to 0.1 Gbps data rate with [5-10] ms latency countrywide.
If all computation is done at the edge, the robot needs to send 592 B of sensing data per control cycle (every millisecond) and receive 200 B per control cycle from the "remote controller"
, which leads to a UL data rate of 4.7 Mbit/s and a DL data rate of 1.6 Mbit/s. However, the maximum round-trip latency is limited to 3 ms, and the one-way latency for downloading the "remote control part"
needs to be limited to 1 ms. If the splitting strategy is followed, downloading the "remote control part"
from the cloud/edge control server requires a 40 kB data burst per control cycle (every millisecond), as more information is needed to ensure that the local controller can take over in case of unexpected latencies; this leads to a user experienced DL data rate of 320 Mbit/s. In that case, a maximum round-trip latency of 25 ms can be tolerated between control cycles, and the one-way latency for downloading the "remote control part"
can be relaxed to 12 ms.
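The data-rate figures above follow from straightforward arithmetic; a small sketch (the helper name rate_mbit_s is illustrative):

```python
# Sketch verifying the data-rate figures quoted above.
# A payload of B bytes sent every cycle of T milliseconds sustains a
# rate of B * 8 / (T * 1000) Mbit/s.

def rate_mbit_s(bytes_per_cycle, cycle_ms=1.0):
    """Sustained data rate in Mbit/s for a payload sent every cycle_ms."""
    return bytes_per_cycle * 8 / (cycle_ms * 1000)

print(rate_mbit_s(592))     # UL sensing data: 4.736 -> ~4.7 Mbit/s
print(rate_mbit_s(200))     # DL commands, full edge control: 1.6 Mbit/s
print(rate_mbit_s(40_000))  # DL burst, split control: 320.0 Mbit/s
```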
This implies a trade-off between DL data rate and latency: compared with full control at the edge, the split control mode requires a higher user experienced DL data rate but relaxes the stringent latency requirement. Unlike traditional URLLC services, which require continuous 5G network coverage that can only be provided with FR1 spectrum, for a 5G operator with FR2 spectrum the split control for robotics can be offloaded to the 5G mmWave network with non-continuous coverage.