
Content for  TR 26.854  Word version:  19.0.0


8  3GPP services

8.1  Haptics media integration in the Generalized Media Delivery architecture

8.1.1  Overview

The Haptics media functionalities in the generalized 5G Media Delivery architecture defined in TS 26.501 and TS 26.506 are shown in Figure 8.1.1-1. Existing interfaces and functions are reused to support the Haptics media type and Haptics media transport in 5G services (e.g. 5GMS, RTC) to address the use cases of clause 5.
The architecture itself is not modified; the haptics media functions are added to the figures below for illustrative purposes.
Figure 8.1.1-1: Haptics media in the Generalized Media Delivery architecture

8.1.2  Network functions and UE entities

As Haptics media can be integrated in 3GPP services as a companion media type to Video and Audio, the media-related definitions described in TS 26.506 and TS 26.501 are extended to describe the haptics media related functions highlighted in Figure 8.1.1-1 as follows:
  • Media AF: An Application Function as defined in clause 6.2.10 of TS 23.501 dedicated to Media Delivery, including haptics media delivery.
  • Media AS: An Application Server dedicated to Media Delivery, including haptics media delivery. The depicted Haptics media engine supports the access, coding and delivery functions for user-plane haptic media data.
  • Media Client: A UE internal function dedicated to Media Delivery comprising:
    • Media Session Handler: An entity on the UE that communicates with the Media AF in order to establish, control and support the delivery of a media session. The media session includes delivery of Haptics media.
    • Media Access Function: An entity on the UE that communicates with the Media AS in order to access and deliver media content, including haptics media. The Media Access Function may, for example, be further sub-divided into content delivery protocols, codecs, media types and metadata representation, including those related to Haptic media data.
  • Media-aware Application: An application entity on the UE that makes use of 3GPP-defined APIs to invoke the Media Session Handler and/or the Media Access Function in order to support Media Delivery, including haptics media delivery.
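The functional split above can be sketched in code. The following is an illustrative sketch only: the class and attribute names are assumptions for this example and are not defined by 3GPP; it merely mirrors how haptics slots in as a companion media type within the Media Client.

```python
from dataclasses import dataclass, field

@dataclass
class MediaSessionHandler:
    """UE entity communicating with the Media AF to establish and control sessions."""
    session_media_types: list = field(default_factory=list)

    def establish_session(self, media_types):
        # A media session may include haptics alongside audio and video.
        self.session_media_types = list(media_types)
        return self.session_media_types

@dataclass
class MediaAccessFunction:
    """UE entity communicating with the Media AS to access and deliver content."""
    supported_codecs: dict = field(default_factory=dict)

    def register_codec(self, media_type, codec_name):
        # Delivery protocols, codecs and metadata handling are per media type.
        self.supported_codecs[media_type] = codec_name

@dataclass
class MediaClient:
    """UE-internal function grouping the Session Handler and Access Function."""
    session_handler: MediaSessionHandler = field(default_factory=MediaSessionHandler)
    access_function: MediaAccessFunction = field(default_factory=MediaAccessFunction)

client = MediaClient()
# "haptics-codec" is a hypothetical codec identifier for illustration.
client.access_function.register_codec("haptics", "haptics-codec")
session_types = client.session_handler.establish_session(["video", "audio", "haptics"])
```

A Media-aware Application would sit on top of this structure, invoking the Session Handler and Access Function through 3GPP-defined APIs.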

8.1.3  Haptics media integration in XR Split rendering architecture

8.1.3.1  Overview

As expressed in TS 26.565, typical use cases for split rendering include immersive gaming and immersive communication. Haptics-enabled versions of these use cases are also described in clause 5 of this document.
Figure 8.1.3.1-1 is based on Figure 6.1.4-1 of TS 26.565 and illustrates the haptics media functions as part of the SRS, providing support for haptic media coding and delivery, similarly to audio and video media.
Figure 8.1.3.1-1: Haptic media functions in the User Plane Architecture for Split management architecture
On the UE side, Figure 8.1.3.1-2, based on Figure 5.1.2-1 of TS 26.565, illustrates the haptics media entities in the XR baseline client.
Figure 8.1.3.1-2: Haptic media entities in the XR Baseline Client architecture

8.1.3.2  Network functions and UE entities

The haptics media related network functions in the XR end-to-end split rendering architecture are similar to those described in clause 8.1.2. The haptics media engines in the Media AS and in the UE need to ensure that the format of the haptics media sent by the Media AS is understandable to the UE. Negotiating the split of haptics media capabilities is for further study. Use of non-3GPP codecs is out of scope of this document.
The haptics media entities on the XR Baseline Client consist of:
  • the haptics media codec, which handles and decompresses a compressed haptics media bitstream; it is illustrated in the MAF function of the SRC along with the AV codecs.
  • the haptics renderer, which handles the rendering of haptics effects using the targeted actuators; it is illustrated in the Presentation Engine function of the SRC.
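The decode-then-render flow through these two UE-side entities can be sketched as follows. This is a hedged illustration under stated assumptions: the toy bitstream format (one byte pair per effect) and the function names are invented for this example and do not correspond to any 3GPP- or MPEG-defined haptics coding format.

```python
def haptics_codec_decode(bitstream: bytes) -> list:
    """Haptics media codec (in the SRC's MAF): decompress a haptics bitstream
    into effect descriptors. Toy format assumption: each byte pair encodes
    (actuator_id, intensity scaled to 0..255)."""
    effects = []
    for i in range(0, len(bitstream) - 1, 2):
        effects.append({"actuator": bitstream[i],
                        "intensity": bitstream[i + 1] / 255.0})
    return effects

def haptics_render(effects: list, actuators: dict) -> dict:
    """Haptics renderer (in the SRC's Presentation Engine): drive the targeted
    actuators with the decoded effect intensities."""
    for effect in effects:
        if effect["actuator"] in actuators:
            actuators[effect["actuator"]] = effect["intensity"]
    return actuators

# Two actuators, both idle; the stream sets actuator 0 to full intensity.
actuator_state = haptics_render(
    haptics_codec_decode(bytes([0, 255, 1, 128])),
    {0: 0.0, 1: 0.0},
)
```

The split mirrors the architecture: decoding lives with the AV codecs in the MAF, while actuator control lives in the Presentation Engine.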

8.1.3.3  Split haptics media operations

When a UE intends to offload part of its haptics media processing to the SRS:
  • The SRC and the SRS negotiate the desired haptics media capabilities (or profile) on the M4 interface using a SWAP (Simple WebRTC Application Protocol) message or a data channel message.
  • The SRS processes and renders the haptics media content and may use pose or interaction information to spatialise the rendered haptics media content in correlation with other rendered media streams (scene, video, objects, audio).
  • The SRS transmits to the UE the resulting haptics media streams.
  • The UE decodes and renders the haptics media stream.
