For the purpose of this document, two types of haptic control-loop systems are defined: closed-loop haptics and open-loop haptics.
Closed-loop haptics systems refer to architectures where the feedback received by the user is based on (local or remote) information (sensory or simulated) returned by the system following user interactions. Closed-loop haptics systems are illustrated in Figure 4-1.
The haptic feedback is an integral part of the interaction mechanism: it provides the information the user needs to adjust their input to the system. User inputs are continuously monitored by the system in order to constantly adjust the haptic effects sent to the haptic device. Such a closed-loop haptics system is typically (but not exclusively) used for human-machine interactions, i.e. a human controls a remote machine and the machine responds to the control input with measurements, allowing the human to adapt their behaviour. Rapid response times are therefore mandatory, along with ultra-reliability, which may require new network architectures. Some examples of closed-loop haptic systems are surgical simulators and teleoperation robots.
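The following sketch summarises one iteration of such a closed-loop interaction. It is purely illustrative: all types and functions are hypothetical stubs, not interfaces defined in this document.

// Illustrative sketch only (hypothetical types, not defined by this document).
// It shows the closed-loop principle: the haptic output is continuously
// recomputed from sensed feedback returned by the (local or remote) system.
struct UserInput   { float position = 0.f; float force = 0.f; };
struct RemoteState { float contactForce = 0.f; };   // measured at the remote machine

struct HapticDevice {
    UserInput readUserInput() { return {}; }         // stub: poll the input sensors
    void renderForce(float /*newtons*/) {}           // stub: drive the actuator
};
struct RemoteMachine {
    void applyControl(const UserInput&) {}           // stub: forward the control input
    RemoteState readSensors() { return {}; }         // stub: sensory (or simulated) feedback
};

// One iteration of the loop; it must run at a high rate with very low latency,
// which motivates the response-time and reliability requirements above.
void closedLoopStep(HapticDevice& device, RemoteMachine& machine) {
    UserInput input = device.readUserInput();   // user input is continuously monitored
    machine.applyControl(input);                // the machine reacts to the control input
    RemoteState state = machine.readSensors();  // feedback measured after the interaction
    device.renderForce(state.contactForce);     // haptic effect adjusted from the feedback
}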
Closed-loop haptics systems are out of scope for this feasibility study.
Open-loop haptics systems refer to architectures where the haptic feedback received by the user is provided by the system (local or remote) without using sensory or simulated information, i.e. they operate based on pre-defined settings and do not adjust to the user's interactions. They include unidirectional or bi-directional haptics media transmission with requirements/constraints on QoE similar to those applying to audio and video streaming or real-time communication in the current 3GPP architecture, taking into account human touch perceptual thresholds. Some examples of open-loop haptic systems are vibration feedback in gaming controllers and tactile feedback in wearable devices like smartwatches. The different haptic modalities (including both tactile and kinaesthetic) relevant in open-loop haptics systems are part of the study. Open-loop haptics systems are illustrated in Figure 4-2 below.
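By contrast with the closed-loop case, in an open-loop system the haptic signal is pre-defined (authored or streamed) and is rendered as received, without being adjusted from the user's interactions. The following sketch is purely illustrative, with hypothetical types.

#include <vector>

// Illustrative sketch only (hypothetical types, not defined by this document).
// The haptic signal is pre-defined and played out as-is; no sensory feedback
// is read back to adapt it.
struct HapticEffect {
    float durationMs = 0.f;
    std::vector<float> amplitudeSamples;   // pre-defined time-based signal
};

struct VibrotactileActuator {
    void play(const HapticEffect&) {}      // stub: render the signal as received
};

// Triggered by timing information (e.g. a media timeline) or an event,
// in the same way an audio or video sample would be played out.
void openLoopRender(VibrotactileActuator& actuator, const HapticEffect& effect) {
    actuator.play(effect);                 // no feedback loop towards the sender
}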
Haptics relates to the sense of touch. As a medium, haptics represents information describing physical feedback rendered for a specific user device and body location. This information is defined as a new media type and is described by a time-based signal (a haptic effect) or a spatial signal (a physical property of an object). The rendering is triggered by timing information or by an interaction. Different haptic modalities are considered, targeting different human mechanoreceptors (tactile, kinaesthetic, proprioception) and thermoreceptors [7].
Haptics may be used as a media type at the same level as audio and video.
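As a hedged illustration of the information such a haptic media type may carry, the hypothetical structure below combines the elements described above (modality, body location, timing and time-based signal); it does not reproduce the coded format of [7].

#include <string>
#include <vector>

// Hypothetical illustration of the information carried by a haptic media type;
// not a reproduction of the format defined in [7].
enum class HapticModality { Vibrotactile, Kinaesthetic, Temperature };

struct HapticEffectTrack {
    HapticModality     modality;        // targeted mechanoreceptors / thermoreceptors
    std::string        bodyLocation;    // e.g. "left palm", "torso"
    std::string        targetDevice;    // device the effect was authored for
    double             startTimeMs;     // timing information triggering the rendering
    std::vector<float> signal;          // time-based signal (the haptic effect);
                                        // a spatial signal (object property) is the other case
};

// A haptic "media stream" is then a set of such tracks, carried alongside
// the audio and video tracks of the same presentation.
using HapticStream = std::vector<HapticEffectTrack>;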
Figure 4.3-1 depicts an end-to-end streaming pipeline associating audio, video and haptics in the most complete scenario of interactive 3D scenes. A simpler version of the pipeline consists of 2D AVH (Audio-Video-Haptics) media with only 2D assets and no interaction.
This figure also illustrates the different formats and some existing APIs supporting haptics media.
Figure 4.3-2 depicts an end-to-end real-time communication pipeline associating audio, video and haptics media. In a real-time communication use case, the haptic media is managed similarly to audio and video. The input is captured from input sensors: a camera for video, a microphone for audio and dedicated haptic sensors (motion, pressure…) for haptics media. Alternatively, the signal can be loaded from storage: typically, a library of encoded effects can be stored, retrieved on the fly and sent to the receiver. The signals are then encoded and distributed using existing coded formats and protocols.
On the receiver side, the various media are decoded and rendered with the appropriate devices: a loudspeaker, a screen, or haptic devices (e.g. vibrotactile actuators).
Figure 4.3-2 also illustrates the different formats and some existing APIs supporting haptics media.
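A hypothetical sketch of this real-time pipeline is given below; every type is an illustrative stub, not an existing API or coded format.

#include <cstdint>
#include <vector>

// Hypothetical sketch of the real-time communication pipeline described above.
struct HapticSample  { std::vector<float> values; };     // from motion/pressure sensors
struct EncodedPacket { std::vector<uint8_t> payload; };

struct HapticSensor   { HapticSample capture() { return {}; } };
struct EffectLibrary  { EncodedPacket load(int /*effectId*/) { return {}; } };  // pre-encoded effects
struct HapticEncoder  { EncodedPacket encode(const HapticSample&) { return {}; } };
struct HapticDecoder  { HapticSample  decode(const EncodedPacket&) { return {}; } };
struct Transport      { void send(const EncodedPacket&) {}
                        EncodedPacket receive() { return {}; } };
struct HapticRenderer { void render(const HapticSample&) {} };   // e.g. vibrotactile device

// Sender: capture (or load from the effect library), encode, distribute.
void sendOnce(HapticSensor& sensor, HapticEncoder& enc, Transport& net) {
    net.send(enc.encode(sensor.capture()));
}

// Receiver: decode and render, in the same way audio and video are handled.
void receiveOnce(Transport& net, HapticDecoder& dec, HapticRenderer& out) {
    out.render(dec.decode(net.receive()));
}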
For the creation of haptic effects and their association with A/V content, several tools have been developed and can be used. Figure 4.3.1-1 illustrates an example of such a tool, the HFX studio [5]. It shows how to create a timeline for haptics, with several channels and different haptic effects, and how to design effects for particular body parts of the user.
The main principle of authoring tools is to create effects for the user and the targeted experience, and to let the application manage this experience according to the available rendering devices. For instance, in this figure a haptic effect is generated on the user's torso assuming several actuators. If the receiving application does not support a haptic suit as a rendering device but only runs on a smartphone with a single actuator, the application selects only the first vibrotactile signal from the distributed file or stream. Conversely, more capable receivers may use the full file or stream with more complex setups.
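The selection logic described above can be sketched as follows; the structures and the capability model are hypothetical and only serve to illustrate the principle.

#include <string>
#include <vector>

// Hypothetical sketch: the authored content may contain many channels
// (e.g. for a haptic suit), and the receiving application keeps only what
// its rendering device can actually play.
struct HapticChannel {
    std::string modality;        // e.g. "vibrotactile", "kinaesthetic"
    std::string bodyLocation;    // e.g. "torso.front.1"
    std::vector<float> signal;
};

struct DeviceCapabilities {
    int  vibrotactileActuators = 1;     // a smartphone typically has a single one
    bool supportsBodySuit      = false;
};

// A smartphone-class receiver picks only the first vibrotactile channel;
// a more capable receiver keeps the full set of channels.
std::vector<HapticChannel> selectChannels(const std::vector<HapticChannel>& authored,
                                          const DeviceCapabilities& caps) {
    if (caps.supportsBodySuit) {
        return authored;                              // use the full file or stream
    }
    for (const auto& ch : authored) {
        if (ch.modality == "vibrotactile") {
            return {ch};                              // first vibrotactile signal only
        }
    }
    return {};                                        // nothing renderable on this device
}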
The HFX studio supports the HJIF interchange format (or mezzanine format) defined in clause 6 of [7] for the creation, editing and distribution of haptic effects.
Other commercial tools exist, and they are generally dedicated to their associated proprietary format or platform:
- bHaptics Designer [8]
- Meta Haptics Studio [9]
- Haptic Composer - Design, Test, & Play Haptics [10]
Several SDKs have been provided to integrate haptics into standalone applications or to develop dedicated applications with engines such as Unity and Unreal. They are often provided with their proprietary format, such as the Meta SDK for Unity [11] or Unreal [12], the bHaptics SDK [13] or the Apple Core Haptics SDK [14].
The Interhaptics platform [15] provides both authoring tools and a Software Development Kit (SDK) for software providers and OEM manufacturers.
APIs with the devices are often based on OpenXR [6]. The current haptic support in OpenXR is limited, but work is ongoing to extend its capabilities.
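As an illustration of the current haptic support in core OpenXR, the specification exposes a simple amplitude/frequency/duration vibration output through xrApplyHapticFeedback. The sketch below assumes that a session and a haptic output action bound to a controller have already been created by the application.

#include <openxr/openxr.h>

// Minimal sketch of the basic vibration output available in core OpenXR [6].
// It assumes 'session' and 'vibrateAction' (an XR_ACTION_TYPE_VIBRATION_OUTPUT
// action bound to a controller) have already been created by the application.
XrResult triggerVibration(XrSession session, XrAction vibrateAction) {
    XrHapticVibration vibration{XR_TYPE_HAPTIC_VIBRATION};
    vibration.amplitude = 0.5f;                      // 0.0 .. 1.0
    vibration.duration  = 100000000;                 // 100 ms, in nanoseconds
    vibration.frequency = XR_FREQUENCY_UNSPECIFIED;  // let the runtime choose

    XrHapticActionInfo actionInfo{XR_TYPE_HAPTIC_ACTION_INFO};
    actionInfo.action        = vibrateAction;
    actionInfo.subactionPath = XR_NULL_PATH;         // apply on all bound devices

    // Only this simple amplitude/frequency/duration model is available in the
    // core specification, which is why the haptic support is considered limited.
    return xrApplyHapticFeedback(session, &actionInfo,
                                 reinterpret_cast<const XrHapticBaseHeader*>(&vibration));
}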
A large number of haptic devices exist. The simplest devices are smartphones and wearables integrating vibrotactile actuators, usually a single one. Some chipsets support mono or stereo haptics, for instance the Snapdragon G3x Gen 2 [16]. When several wearables are supported by an application, some form of spatialization then becomes possible.
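One possible spatialization scheme is sketched below: the amplitude sent to each wearable actuator is weighted by its distance to the authored effect location. This is only a hypothetical illustration of the principle, not a scheme defined in this document.

#include <cmath>
#include <vector>

// Hypothetical spatialization across several vibrotactile wearables:
// each actuator receives an amplitude scaled by its distance to the
// authored effect location on the body.
struct Actuator { float x = 0.f, y = 0.f, z = 0.f; };   // position on the body

std::vector<float> spatialize(float effectAmplitude,
                              float ex, float ey, float ez,          // effect location
                              const std::vector<Actuator>& actuators) {
    std::vector<float> perActuatorAmplitude;
    for (const auto& a : actuators) {
        float d = std::sqrt((a.x - ex) * (a.x - ex) +
                            (a.y - ey) * (a.y - ey) +
                            (a.z - ez) * (a.z - ez));
        perActuatorAmplitude.push_back(effectAmplitude / (1.0f + d));  // simple falloff
    }
    return perActuatorAmplitude;
}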