As expressed in TS 26.565, typical use cases for split rendering include immersive gaming and immersive communication. Haptics-enabled variants of these use cases are also described in clause 5 of this document.
Figure 8.1.3.1-1 is based on Figure 6.1.4-1 of TS 26.565 and illustrates the haptics media functions as part of the SRS, providing support for haptics media coding and delivery in the same way as for audio and video media.
On the UE side, Figure 8.1.3.1-2, based on Figure 5.1.2-1 of TS 26.565, illustrates the haptics media entities in the XR Baseline Client.
The haptics-related network functions in the XR end-to-end split rendering architecture are similar to those described in clause 8.1.2. The haptics media engines in the Media AS and in the UE need to ensure that the format of the haptics media sent by the Media AS is understandable by the UE, as sketched below. Negotiating the split of haptics media capabilities is for further study. The use of non-3GPP codecs is out of scope of this document.
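As an illustration, such a compatibility check could amount to intersecting the capability descriptions of the two haptics media engines. The sketch below assumes a hypothetical capability structure; its field names and values are illustrative and not defined by TS 26.565.

```typescript
// Hypothetical description of the haptics formats a haptics media engine
// supports; the fields are illustrative, not defined by 3GPP.
interface HapticsCapability {
  codec: string;        // codec identifier, e.g. an MPEG haptics codec name
  profile: string;      // codec profile identifier
  maxChannels: number;  // number of haptic channels the engine can handle
}

// The Media AS haptics media engine keeps only the formats that the UE also
// understands, so that every bitstream it sends can be decoded on the UE.
function selectCommonFormats(
  serverCaps: HapticsCapability[],
  ueCaps: HapticsCapability[],
): HapticsCapability[] {
  return serverCaps.filter((s) =>
    ueCaps.some(
      (u) =>
        u.codec === s.codec &&
        u.profile === s.profile &&
        u.maxChannels >= s.maxChannels,
    ),
  );
}
```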
The haptics media entities on the XR Baseline Client consist of:
- the haptics media codec, which handles and decompresses a compressed haptics media bitstream; it is illustrated in the MAF function of the SRC along with the AV codecs.
- the haptics renderer, which handles the rendering of haptic effects on the targeted actuators; it is illustrated in the Presentation Engine function of the SRC (see the sketch after this list).
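A possible wiring of these two entities on the UE is sketched below: the codec in the MAF decodes each haptics access unit, and the renderer in the Presentation Engine drives the targeted actuators. All types and methods are hypothetical and serve only to illustrate the data flow.

```typescript
// Illustrative decoded haptic effect; fields are hypothetical.
interface HapticEffect {
  actuatorId: number;     // target actuator on the device
  startTimeMs: number;    // presentation time of the effect
  waveform: Float32Array; // normalized amplitude samples
}

// Hypothetical interfaces for the two UE-side haptics entities.
interface HapticsDecoder {
  decode(accessUnit: Uint8Array): HapticEffect[];
}
interface HapticsRenderer {
  render(effect: HapticEffect): void; // drives the targeted actuator
}

// Invoked by the MAF for each received haptics access unit: decode in the
// MAF, then hand the effects to the Presentation Engine for actuation.
function onHapticsAccessUnit(
  decoder: HapticsDecoder,
  renderer: HapticsRenderer,
  accessUnit: Uint8Array,
): void {
  for (const effect of decoder.decode(accessUnit)) {
    renderer.render(effect);
  }
}
```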
When a UE intends to offload part of its haptics media processing to the SRS:
- The SRC and the SRS negotiate the desired haptics media capabilities (or profile) on the M4 interface using a SWAP (Simple WebRTC Application Protocol) message or a data channel message (see the sketch after this list).
- The SRS processes and renders the haptics media content and may use pose or interaction information to spatialize the rendered haptics media content in correlation with the other rendered media streams (scene, video, objects, audio).
- The SRS transmits the resulting haptics media streams to the UE.
- The UE decodes and renders the haptics media streams.
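As an illustration of the negotiation step, the sketch below sends a capability request over a WebRTC data channel on M4 and waits for the SRS answer. The message layout ("haptics-config-request"/"haptics-config-accept") is hypothetical; TS 26.565 does not define these exact fields.

```typescript
// Illustrative haptics capability negotiation over an M4 data channel.
// The JSON message shape is a sketch, not a 3GPP-defined format.
function requestHapticsOffload(
  channel: RTCDataChannel,
  desired: { codec: string; profile: string },
): Promise<boolean> {
  return new Promise((resolve) => {
    channel.onmessage = (event: MessageEvent) => {
      const answer = JSON.parse(event.data);
      // The SRS accepts or rejects the requested haptics configuration.
      resolve(answer.type === "haptics-config-accept");
    };
    channel.send(
      JSON.stringify({ type: "haptics-config-request", ...desired }),
    );
  });
}
```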