The support of haptic media streaming over 5G would enable the use cases defined in clause 5, particularly those for on-demand and live streaming.
The 5GMS architecture TS 26.501 defines in Table 4.0.1-1 the list of main 5GMS features, highlighting whether they are specified for the uplink and/or the downlink. These features are equally applicable to haptic media, as illustrated in Table 8.3.1-1.
In addition to the generalized architecture described in clause 8.2 of this document, the 5GMS specification TS 26.501 details the UE functions of the 5GMS client for downlink in clause 4.2.2 and for uplink in clause 4.2.3.
For handling haptic media, no new UE subfunctions are needed. The following highlights requirements on some of the subfunctions described in TS 26.501 to support haptic media:
Media Access Client: Accesses media content, such as DASH-formatted media segments, for immediate or delayed consumption. To access haptic media content, the content needs to be provided as suitably formatted media segments (e.g. DASH segments).
Media encoder and decoder: Encodes or decodes the media, such as audio, video, or haptics media.
Media Capturing: Devices such as video cameras or microphones that transform an analogue media signal into digital media data. Capturing devices and sensors for haptic media are described in clause 6.4 of this document.
Metrics Collection and Reporting and Consumption Collection and Reporting: Information specific to haptic media, if any, is for further study.
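As an illustration of how haptic media could be exposed to the Media Access Client as suitably formatted segments, the sketch below builds a hypothetical DASH MPD AdaptationSet announcing a haptics representation. The mimeType and codecs values mirror the haptics/mp4 and 'mih1' signalling used later in this clause; the segment naming, bandwidth, and timing attributes are invented for the example.

```python
# Sketch: a hypothetical DASH AdaptationSet for a haptics track.
# The mimeType/codecs values mirror the haptics/mp4 + 'mih1' signalling
# discussed in this document; all other attribute values are illustrative.
import xml.etree.ElementTree as ET

adaptation_set = ET.Element("AdaptationSet", {
    "contentType": "haptics",          # assumption: haptics exposed like audio/video
    "mimeType": "haptics/mp4",
    "codecs": "mih1",                  # MIHS sample entry
})
ET.SubElement(adaptation_set, "SegmentTemplate", {
    "media": "haptics_$Number$.m4s",   # illustrative segment naming
    "initialization": "haptics_init.mp4",
    "duration": "2000",
    "timescale": "1000",
})
ET.SubElement(adaptation_set, "Representation", {
    "id": "haptics-1",
    "bandwidth": "16000",              # illustrative bitrate
})

mpd_fragment = ET.tostring(adaptation_set, encoding="unicode")
print(mpd_fragment)
```

The haptics AdaptationSet sits alongside the audio and video AdaptationSets of the same Period, so the existing segment-fetch logic of the Media Access Client applies unchanged.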
The analysis of haptic media formats for broadcast services remains an area for further study. The integration of haptics media into broadcast services presents challenges that require deeper investigation, such as transmission methods and user experience considerations, to ensure seamless and effective implementation.
In addition to the generalized architecture described in clause 8.2 of this document, the GA4RTAR specification in clause 4.2 of TS 26.506 details the UE functions of the RTC endpoint. For handling haptic media, no new UE subfunctions are needed. The RTC Access Function is extended to support accessing, coding, and transmitting haptic media.
The protocol stack for a basic RTC endpoint is specified in clause 13.1 of TS 26.113.
Figure 8.3.3.1-1 illustrates the integration of haptics media support in this protocol stack.
Haptics media is represented at the same level as the other video and audio media. While there is no change to the structure of the protocol stack, RTP payloads for haptics media would be handled similarly to those for audio and video.
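As a sketch of what handling haptics "similarly to audio and video" could mean at session setup, the snippet below assembles a hypothetical SDP media description for an RTP haptics stream, analogous to the usual audio and video m-lines. The port, dynamic payload type number, encoding name, and clock rate are assumptions for illustration, not values taken from TS 26.113.

```python
# Sketch: a hypothetical SDP media description for an RTP haptics stream.
# The port, payload type (112), encoding name "mihs", and clock rate are
# illustrative assumptions; only the idea of a dedicated haptics media
# description alongside audio and video comes from the text.
def haptics_media_description(port: int, payload_type: int) -> str:
    lines = [
        f"m=haptics {port} RTP/AVP {payload_type}",  # assumed top-level media type
        f"a=rtpmap:{payload_type} mihs/1000",        # assumed encoding name/clock rate
        "a=sendrecv",
    ]
    return "\r\n".join(lines)

sdp_fragment = haptics_media_description(49170, 112)
print(sdp_fragment)
```

Such a media description would be negotiated in the same offer/answer exchange as the audio and video streams, with the haptics RTP stream then flowing through the same RTP/UDP/IP layers of the stack.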
IMS-based RTC services include IBACS (TS 26.264) and MTSI (TS 26.114), into which haptics media can be integrated to satisfy the use cases of clause 5. Figure 4.3 of TS 26.114 specifies the user-plane protocol stack for a basic MTSI client; it also applies to IBACS.
Figure 8.3.3.2-1 illustrates the integration of haptics media support in this protocol stack.
Haptics media is represented at the same level as the other media types. While there is no change to the structure of the protocol stack, payload formats and RTP payloads for haptics media would be handled similarly to those for audio and video.
To support the use cases in clause 5 for haptics-media-enhanced messaging services, the following figures from TS 26.143 are modified to illustrate haptics media functionalities in the overall Messaging Media system (Figure 8.3.4-1) and in the Messaging Media player model (Figure 8.3.4-2).
The MMBP media capabilities defined in TS 26.143 currently do not support haptics media types. The inclusion of a top-level haptics media type would happen at the same level as other media formats (text, speech, audio, image, etc.) and would not require modification to the multipart MMBPs and container formats (single, mixed, alternative, parallel, related, and 3GP9). A new sample entry for haptics media needs to be added to the supported sample entries in TS 26.244. For messaging services, it is expected that a single File Format track is used for haptic media.
In addition, when the haptic media effect is included in a scene description in association with a particular node, the scene description capabilities of the messaging service need to be extended to support MPEG_haptic and MPEG_haptic_material, as well as the SD-Rendering-glTF-Interactive capability of TS 26.119.
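As a sketch of associating a haptic effect with a particular node, the fragment below shows a hypothetical glTF node carrying an MPEG_haptic extension. Only the extension names MPEG_haptic and MPEG_haptic_material come from the text; the payload fields shown here (such as the `haptic` index) are illustrative assumptions, not the normative MPEG-I Scene Description schema.

```python
# Sketch: a glTF node referencing a haptic effect via the MPEG_haptic
# extension. The extension payload is an illustrative assumption, not the
# normative schema; only the extension names come from the text.
import json

gltf_fragment = {
    "extensionsUsed": ["MPEG_haptic", "MPEG_haptic_material"],
    "nodes": [
        {
            "name": "touchable_object",
            "extensions": {
                "MPEG_haptic": {"haptic": 0}  # assumed index into a haptics array
            },
        }
    ],
}

print(json.dumps(gltf_fragment, indent=2))
```

A messaging client with the extended scene description capability would resolve such a reference to the corresponding haptics track and render the effect when the node is interacted with.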
For instance, the capability 26143_HAPTICSMEDIA_MIHS can be defined as the capability of playing back (decoding and rendering) a file that:
is decodable by a decoder capable of the MIHS decoding and rendering capabilities as defined in [7],
is encapsulated in an ISO BMFF track with the sample entry 'mih1' as defined in [25], and
is contained in a 3GP file that conforms to the 26143_CONTAINER_MP4_3GP9 capability defined in clause 5.2 of TS 26.143 and extended with haptics media capabilities.
The media type for files with this capability 26143_HAPTICSMEDIA_MIHS is signalled as haptics/mp4; profile="3gp9"; codecs="mih1", or an equivalently compatible media type.
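A minimal sketch of checking this signalling on the receiver side: the helper below parses a media type string of the form given above and verifies that the codecs parameter names the MIHS sample entry. The parsing logic is an illustrative assumption, not a parser taken from TS 26.143.

```python
# Sketch: parse a haptics media type string such as
#   haptics/mp4; profile="3gp9"; codecs="mih1"
# and check the codecs parameter. Illustrative helper, not a normative parser.
def parse_media_type(value: str):
    parts = [p.strip() for p in value.split(";")]
    mime = parts[0]
    params = {}
    for part in parts[1:]:
        key, _, raw = part.partition("=")
        params[key.strip()] = raw.strip().strip('"')  # drop surrounding quotes
    return mime, params

mime, params = parse_media_type('haptics/mp4; profile="3gp9"; codecs="mih1"')
supports_mihs = mime == "haptics/mp4" and params.get("codecs") == "mih1"
```

A player advertising 26143_HAPTICSMEDIA_MIHS would accept the file when `supports_mihs` holds and otherwise fall back to the usual unsupported-media handling.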
Multiple media files may be handled by creating multiple instances of the player or combining the files to output a single haptic signal for rendering through the onboard or connected actuator. Multi-technology feedback (e.g., thermal+vibrotactile) can be played simultaneously through different onboard or connected devices [7].
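The "combining the files to output a single haptic signal" option above can be sketched as a simple sample-wise mix: the function below sums several mono haptic signals into one and clips the result to a normalized actuator range. The [-1.0, 1.0] range and mixing by summation are assumptions for illustration; a real renderer would follow the mixing rules of the haptics codec.

```python
# Sketch: combine several decoded haptic signals into a single signal for one
# actuator by sample-wise summation with clipping. The normalized [-1.0, 1.0]
# amplitude range and summation rule are assumptions for illustration.
from itertools import zip_longest

def mix_haptic_signals(signals):
    mixed = []
    for samples in zip_longest(*signals, fillvalue=0.0):
        s = sum(samples)
        mixed.append(max(-1.0, min(1.0, s)))  # clip to actuator range
    return mixed

combined = mix_haptic_signals([[0.5, 0.8, -0.2], [0.6, 0.3]])
```

Multi-technology feedback would instead keep the signals separate, routing each one to its own onboard or connected device (e.g. a thermal element and a vibrotactile actuator) rather than mixing them.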
For instance, the capability 26143_HAPTICSMEDIA_ENC_MIHS for a content generator can be defined as the capability of:
generating a haptics media file from an actuator or from a locally stored file, such that the file can be played back by a player with the capability 26143_HAPTICSMEDIA_MIHS,
generating an ISO BMFF track that conforms to the requirements of the sample entry 'mih1' as defined in [7],
generating a 3GP file from the ISO BMFF track that conforms to the 26143_CONTAINER_MP4_3GP9 capability extended with haptics media capabilities, and
signalling the media type of the generated file as haptics/mp4; profile="3gp9"; codecs="mih1", or an equivalently compatible media type.
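The generator steps above can be sketched as a pipeline of stub stages. Every function body here is a placeholder standing in for the real MIHS encoder, ISO BMFF packager, and 3GP file writer; only the ordering of the steps and the final media type string reflect the capability definition.

```python
# Sketch of the content-generator pipeline for 26143_HAPTICSMEDIA_ENC_MIHS.
# All stages are placeholder stubs; only the step ordering and the final
# media type string come from the capability definition above.
def encode_mihs(raw_samples: bytes) -> bytes:
    return b"mihs-bitstream"            # stub for an MIHS encoder

def wrap_isobmff_track(bitstream: bytes) -> bytes:
    return b"mih1-track"                # stub: track with sample entry 'mih1'

def package_3gp(track: bytes) -> bytes:
    return b"3gp9-file"                 # stub: 3GP container file

def generate_haptics_file(raw_samples: bytes):
    track = wrap_isobmff_track(encode_mihs(raw_samples))
    media_type = 'haptics/mp4; profile="3gp9"; codecs="mih1"'
    return package_3gp(track), media_type

file_bytes, media_type = generate_haptics_file(b"\x00\x01")
```

The resulting file and media type string are what a player with the 26143_HAPTICSMEDIA_MIHS capability is expected to accept.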