
Content for  TR 26.854  Word version:  19.0.0


7  Candidate technologies

7.1  Codecs

7.1.1  MPEG Haptics Coding

7.1.1.1  Overview

JTC 1/SC 29/WG 7 (MPEG-3DGH) completed the development of the MPEG Haptics Coding standard at its April 2023 meeting; the standard is pending publication as ISO/IEC 23090-31 [7].
In addition to a codec and bitstream format, the MPEG Haptics Coding standard supports multiple haptic modalities and specifies a human-readable JSON-coded representation format (HJIF) for descriptive and quantized media formats. Haptic primitives from AHAP and IVS parametric signals can be losslessly converted to and from HJIF. The HJIF data model is a flexible hierarchical structure that can describe one or more channels, allowing mono, stereo, or multi-channel haptic media experiences targeting one or more actuators or devices. Application areas especially targeted by MPEG Haptics Coding include media streaming and broadcast, immersive applications and XR services, and real-time communications, as well as tactile communication, the latter being out of scope of this study.
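As a minimal sketch of the hierarchical channel model described above, the following builds an HJIF-style JSON structure in Python. The field names are illustrative assumptions, not the normative schema of ISO/IEC 23090-31 [7]; the point is only that a human-readable JSON description can carry several channels targeting different actuators.

```python
import json

# Illustrative HJIF-like structure (hypothetical field names): a description
# plus one or more channels, each targeting an actuator or device.
hjif_like = {
    "version": "1.0",
    "description": "Two-channel vibrotactile effect",
    "channels": [
        {"id": 0, "description": "left actuator", "gain": 1.0},
        {"id": 1, "description": "right actuator", "gain": 0.8},
    ],
}

# HJIF is JSON-based, so the representation stays human-readable.
text = json.dumps(hjif_like, indent=2)
```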
During the standardization process, MPEG also developed reference and conformance software, publicly released as ISO/IEC 23090-33 [45].
An in-depth evaluation of the performance of this reference codec was conducted with both an objective metric (PSNR) and subjective tests using the MUlti Stimulus test with Hidden Reference and Anchor (MUSHRA) methodology defined in Recommendation ITU-R BS.1534-3 [43], for different target bitrates and different input test streams.
Three representative sets of test streams, corresponding to market needs, were provided by different companies: two sets of vibrotactile signals (short effects and long effects) and one set of kinaesthetic signals (including force, acceleration, and movement signals), for a total of 43 test streams.
The objective performance results reported by MPEG in [44] are given in Figure 7.1.1.1-1. On the left, the PSNR is given for three configurations of the MPEG encoder CRM3.2 applied to PCM input signals (C2V: vectorial encoding, C2W: wavelet encoding, C2VWR: hybrid encoding). On the right, the histogram of the bitrate is depicted for parametric transcoding of the .ivs and .ahap parametric input streams.
Figure 7.1.1.1-1: Objective performances (left: PCM signals, right: parametric signals)
The transcoding of parametric input content usually results in bitrates around 1-4 kbps (the input parametric file is losslessly transcoded to the MPEG parametric format, which is then binarized).
The results on PCM input data show that signals lossy-encoded at the 2 kbps target bitrate present perceptible distortions, with an average PSNR of 24 dB. Signals encoded at an 8 kbps target bitrate show some distortions, but not annoying ones, with an average PSNR of 39.17 dB. Finally, signals encoded at a 16 kbps bitrate have no perceptible distortion, with an average PSNR of 45.62 dB.
Thus, 8 kbps is considered a reasonable average bitrate for MPEG Haptics encoding. Considering an average input bitrate of 128 kbps (8 kHz sampling with 16-bit samples), this leads to a compression ratio of 16 per channel.
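The compression-ratio figure quoted above can be checked with a short worked calculation: an 8 kHz PCM input with 16-bit samples carries 128 kbps per channel, and encoding at the recommended 8 kbps yields a ratio of 16.

```python
# Worked check of the per-channel compression ratio cited in the text.
sampling_rate_hz = 8_000
bits_per_sample = 16
input_bitrate_kbps = sampling_rate_hz * bits_per_sample / 1_000  # 128 kbps
encoded_bitrate_kbps = 8
compression_ratio = input_bitrate_kbps / encoded_bitrate_kbps    # 16 per channel
```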
Subjective evaluations, performed by three independent laboratories using two reference haptic devices, confirmed the above results. The MUSHRA score was higher than 94 (the maximum being 100) for bitrates of 8 kbps and higher.

7.1.1.2  MPEG Haptics Codec Architecture

Figure 7.1.1.2-1 represents the MPEG Haptics codec architecture. Media formats supported by the MPEG Haptics representation and coding include the AHAP, HJIF, and IVS parametric media formats, as well as the WAV time-sampled media format.
The synthesizer is not defined by the standard; it is illustrated in this figure to highlight how the MPEG Haptics codec can be integrated with a renderer.
Figure 7.1.1.2-1: MPEG Haptics codec architecture

7.1.1.3  Integration in MPEG Scene Description

MPEG Scene Description [31] is used in IBACS [23] and in [SR-MSE] via MeCAR [32]. To address the use cases in clauses 5.4 and 5.5 and these 3GPP services, haptics media may be integrated with a scene description.
MPEG Scene Description provides extensions for the support of the haptics media defined in [7]. Two extensions have been defined: MPEG_haptic and MPEG_haptic_material. The first associates a haptic media stream with a set of nodes and defines each haptic media object. The second describes texture-based haptic media data when used (as a 2D map associated with an object).
The trigger/action mechanism of MPEG Scene Description can be used to associate and play haptic media effects on objects and/or avatars with a specified location and playback type. In particular, the relation between a node in the scene and haptic media data is established through haptic actions. Interactive haptic media feedback is produced by defining behaviors with triggers (e.g. collisions, proximity) and haptic actions. For each node in a haptic action, the associated haptic media data is defined either through a reference to an element or through an MPEG_haptic_material attached to a mesh of the node. When a haptic action is triggered, the associated haptic media data is rendered according to the properties specified in the action.
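The trigger/action pairing can be sketched as follows. The property names (TRIGGER_COLLISION, ACTION_HAPTIC, hapticObject) are illustrative assumptions modelled on the glTF extension style of MPEG Scene Description, not the normative schema; the sketch only shows how a collision trigger is linked to a haptic action on a node carrying the MPEG_haptic extension.

```python
# Hypothetical glTF-style node carrying the MPEG_haptic extension
# (property names illustrative, not the normative ISO/IEC 23090-14 schema).
node = {
    "name": "touchable_sphere",
    "extensions": {
        "MPEG_haptic": {"hapticObject": 0},  # index into a haptics media list
    },
}

# A behavior pairs triggers (here: collision on node 0) with haptic actions.
behavior = {
    "triggers": [{"type": "TRIGGER_COLLISION", "nodes": [0]}],
    "actions": [{"type": "ACTION_HAPTIC", "nodes": [0]}],
}

def fires_haptic(behavior):
    """Return True if the behavior pairs a trigger with a haptic action."""
    return bool(behavior["triggers"]) and any(
        a["type"] == "ACTION_HAPTIC" for a in behavior["actions"])
```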

7.1.2  IEEE

7.1.2.1  Overview

The IEEE Standard for Haptic Codecs for the Tactile Internet was developed starting in 2016 and approved in 2024 [30]. It specifies three different codecs (a no-delay kinaesthetic codec, a delay-robust kinaesthetic codec, and a tactile codec), supporting only a time-sampled media format (PCM) and no parametric media formats.
Some performance results for the IEEE haptic codecs were documented in [2] in relation to closed-loop haptic systems (with a reliability of 99.999%), which are out of scope of this study. The performance of the IEEE haptic codecs for open-loop haptic systems is for further study.
Further, with no support for parametric media formats, these codecs can be used neither as a mezzanine format nor as an interoperable format for the parametric haptics media source content widely used in the use cases of clause 5.

7.2  Storage format

A specification for the storage and delivery signaling for haptics media as defined in [7] is currently under development by MPEG and has reached FDIS stage. ISO/IEC 23090-32 [25] defines how a haptics MIHS bitstream can be encapsulated in ISOBMFF media containers. The specification supports single and multi-track encapsulation, where different channels or bands of the haptics media can be stored in separate tracks to enable selective access.
MPEG is also working on a reference software implementation for the storage and delivery aspects defined in ISO/IEC 23090-32 as part of a new specification (ISO/IEC 23090-37 [26]) that has recently been initiated and is expected to be completed by the end of 2025.

7.3  Transport protocols

7.3.1  Haptics media delivery over DASH

ISO/IEC 23090-32 also specifies how haptics media is signaled in an MPEG-DASH manifest (MPD) for adaptive media delivery and defines descriptors to signal information pertaining to the haptics experience to allow a streaming client to select the parts of the haptics media to stream based on playback timeline, network conditions, and/or user interaction.
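A haptics Adaptation Set in an MPD might look like the following sketch, parsed here with Python's standard XML library. The MIME type haptics/hmpg is a registered subtype (see clause 7.3.2); the Representation bandwidths echo the bitrates evaluated in clause 7.1.1.1, but all attribute values are illustrative rather than taken from ISO/IEC 23090-32 [25].

```python
import xml.etree.ElementTree as ET

# Illustrative DASH MPD fragment signaling a haptics Adaptation Set with two
# bitrate alternatives (values are assumptions, not normative examples).
mpd_xml = """<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static">
  <Period>
    <AdaptationSet mimeType="haptics/hmpg">
      <Representation id="haptics-8kbps" bandwidth="8000"/>
      <Representation id="haptics-16kbps" bandwidth="16000"/>
    </AdaptationSet>
  </Period>
</MPD>"""

ns = {"mpd": "urn:mpeg:dash:schema:mpd:2011"}
root = ET.fromstring(mpd_xml)
adaptation = root.find(".//mpd:AdaptationSet", ns)
# A streaming client could select among these based on network conditions:
bandwidths = [int(r.get("bandwidth"))
              for r in adaptation.findall("mpd:Representation", ns)]
```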

7.3.2  Haptics top-level media type and subtypes

The 'haptics' media type has been documented and registered by IANA as a top-level media type, along with 'audio', 'video', 'application', and others [27].
Under this top-level media type, the following haptics subtypes are currently registered:
  • haptics/ivs
  • haptics/hjif
  • haptics/hmpg
The justification for the 'haptics' top-level media type is found in [28].
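The registered subtypes can be mapped to file extensions with Python's standard mimetypes module; the extension choices below are an assumption, mirroring the subtype names rather than any normative registration of extensions.

```python
import mimetypes

# Map the registered haptics subtypes to illustrative file extensions.
for subtype, ext in [("ivs", ".ivs"), ("hjif", ".hjif"), ("hmpg", ".hmpg")]:
    mimetypes.add_type(f"haptics/{subtype}", ext)

# Once registered, standard MIME lookup resolves haptics files.
mime, _ = mimetypes.guess_type("effect.hjif")
```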

7.3.3  Haptics media RTP payload

An RTP payload format for haptics media is under development in the IETF [29] and has reached working group last call. The draft describes how haptic data, in MIHS units as defined in [7], can be transmitted using RTP. This RTP payload format enables the transport of the "hmpg" media subtype defined in [28].
Some of the characteristics of the payload format include:
  • The four types of MIHS units are indicated in the payload header: initialization, time-dependent (temporal), time-independent (spatial), and silent units.
  • An MIHS unit can be marked as independent or dependent. An independent unit resets the previous haptic effect and corresponds to a "sync" MIHS unit as defined in [7].
  • Three payload structures are defined and can be used within the same stream: a single-unit payload structure for a single MIHS unit per packet, a fragmented-unit payload structure for MIHS units that are too large to be transmitted in a single packet, and an aggregation-packet payload structure to transport multiple MIHS units in a single RTP packet.
  • The aggregation-packet payload structure can be used to transport multiple MIHS units that correspond to the same timestamp (single-time aggregation packet, STAP) or multiple MIHS units that correspond to different timestamps (multi-time aggregation packet, MTAP).
  • For congestion control, the draft recommends prioritizing initialization units, treating silent units as less important, and using the MIHS unit layer information present in the RTP payload header to prioritize packets.
  • The draft also describes SDP considerations, defining additional optional parameters that can be used in the SDP exchange.
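The packetization choice implied by the three payload structures can be sketched as a simple decision over MIHS unit sizes. The function and the payload budget below are illustrative assumptions, not taken from the draft [29]; they only capture the logic of fitting one unit, fragmenting an oversized unit, or aggregating several small ones.

```python
MAX_PAYLOAD = 1200  # assumed RTP payload budget in bytes (illustrative)

def choose_structure(unit_sizes):
    """Pick a payload structure for a burst of MIHS unit sizes (in bytes)."""
    if len(unit_sizes) == 1:
        # One unit: send it as-is if it fits, otherwise fragment it.
        return "single-unit" if unit_sizes[0] <= MAX_PAYLOAD else "fragmented-unit"
    if sum(unit_sizes) <= MAX_PAYLOAD:
        # Several small units can share one packet (STAP or MTAP,
        # depending on whether their timestamps match).
        return "aggregation"
    return "one packet per unit"  # fall back to individual packets
```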

7.3.4  OpenXR APIs

The OpenXR API [6] standardizes the use of cross-platform XR device capabilities, including haptic devices. Current support mainly addresses XR and game controllers held in the user's hands, and most commercial controllers are supported. Functions such as trigger, click, touch, squeeze, and grip are input actions with a potential haptic output specified by a hapticsAction and a hapticPath. The hapticsAction specifies the type of haptic feedback (e.g. vibrations) and the hapticPath specifies where the effect is applied (e.g. left hand). Haptic feedback is sent to a device using the ApplyHapticFeedback and StopHapticFeedback functions.
The only haptics type supported by the OpenXR API v1.1.43 is XrHapticVibration, for haptic vibrations (vibration amplitude, duration, and frequency). Some proprietary extensions also provide support for vibrations described by a haptic amplitude envelope or a PCM signal (XrHapticAmplitudeEnvelopeVibrationFB, XrHapticPcmVibrationFB).
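The parameters of XrHapticVibration can be mirrored in a small Python sketch (the real API is C; this is not OpenXR code). Assumptions: frequency 0 stands for letting the runtime choose, in the spirit of OpenXR's XR_FREQUENCY_UNSPECIFIED, and amplitude is normalized to [0, 1] and clamped here for illustration.

```python
from dataclasses import dataclass

# Runtime-chosen frequency, following the OpenXR convention.
FREQUENCY_UNSPECIFIED = 0.0

@dataclass
class HapticVibration:
    """Python mirror of XrHapticVibration's fields (illustrative only)."""
    duration_ns: int                        # playback duration in nanoseconds
    frequency: float = FREQUENCY_UNSPECIFIED
    amplitude: float = 1.0                  # normalized 0.0 .. 1.0

    def clamped(self):
        """Return a copy with amplitude clamped into the valid range."""
        return HapticVibration(self.duration_ns, self.frequency,
                               min(max(self.amplitude, 0.0), 1.0))

# A 100 ms buzz at full strength; an out-of-range amplitude gets clamped.
buzz = HapticVibration(duration_ns=100_000_000, amplitude=1.5).clamped()
```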
In 2022, the HIF (Haptic Industry Forum) submitted a high-level proposal for advanced haptics APIs to the OpenXR consortium. The advanced haptics APIs are designed to extend the OpenXR API to other haptic modalities and to process several haptics standards, among which the MPEG HJIF and IEEE P2861.3 standards, as well as other proprietary formats (AHAP, IVS).
