
Content for  TR 26.928  Word version:  17.0.0


A.10  Use Case 9: Police Critical Mission with AR

Use Case Name: Police Critical Mission with AR
  • A squad of police officers (Hugo, Paco and Luis) is sent to a dangerous location to perform a task, for instance a rescue mission
  • Each team member is equipped with a helmet with:
  • AR displays (or AR Glasses),
  • stereo headphones with embedded microphones for capturing the surrounding sound, and a microphone for conversational purposes (see the audio sub-use case below)
  • a VR360 camera, e.g. a double fish-eye or a more advanced camera array, mounted on the surface of the helmet (for safety reasons)
  • 5G connectivity and very accurate 5G location
  • Each team member can talk to the others via PTT or duplex communication
  • Each team member captures and delivers VR video with extremely low latency to the central police facilities.
  • A lower-quality stream may be sent to meet the low-latency requirement
  • A high-quality stream is sent uplink for recording purposes
  • Surround sound may be captured as well.
  • The squad team can be backed up by one or more drones relaying 360 VR video, hyper-sensorial data, and enabling XR haptics.
  • Squad team members can augment their surroundings with drone data.
  • Squad team members can extend their physical presence by taking over control of one or more drones.
  • Police central operations can extend their physical presence by taking over control of one or more drones.
  • At the police central facilities, operators can see each VR360 camera feed and communicate with all members of the team
  • Each squad team member may have a counterpart (person) who monitors their VR360 camera using an HMD and can assist in dangerous situations outside of the member's field of view. This may also be an automated process that signals graphics information about an incoming danger.
  • The central facilities may share additional information with every team member, such as maps, routes, locations of possible danger and further information via text or simple graphics
  • Each team member shares their accurate position with the rest of the team, and it can be displayed/indicated in the AR display (e.g. showing that someone is behind a wall)
  • Each VR camera capture is analysed in real time to identify moving objects, which are shared with the other team members (as in the point above)
    Audio
  • Each team member communicates via microphone, and automatic speech-to-text can be generated and rendered in the AR display in case of noisy conditions
  • Stereo communication is needed to enhance intelligibility, since each team member is wearing a stereo headset
  • Microphones are placed near the speakers to capture the surrounding noise, which is fed back (with no latency) to each earpiece.
  • The received audio of each team member is spatially placed in 3D (e.g. in front, or in the direction where the other team members are located) so the user is not distracted from the surrounding sound environment (this audio is mixed with the microphone feedback); a sketch of this placement computation is given after this list.
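For illustration, the following is a minimal sketch (in Python) of how the shared 5G positions could drive both the AR direction indicator and the 3D spatial placement of a teammate's voice. It assumes a common east/north coordinate frame from the positioning service and a known helmet heading; the function name is an illustrative assumption, not a defined API.

import math

def relative_azimuth_deg(own_xy, own_heading_deg, mate_xy):
    # own_xy / mate_xy: (east, north) positions in metres from 5G positioning.
    # own_heading_deg: compass heading of the helmet (0 = north, clockwise positive).
    # Returns the teammate's bearing relative to the wearer's facing direction,
    # in degrees in [-180, 180): 0 = straight ahead, +90 = to the right.
    de = mate_xy[0] - own_xy[0]
    dn = mate_xy[1] - own_xy[1]
    bearing = math.degrees(math.atan2(de, dn))   # compass bearing to the teammate
    return (bearing - own_heading_deg + 180.0) % 360.0 - 180.0

# Teammate 10 m east of the wearer, wearer facing north:
# the AR marker and the teammate's voice are rendered 90 degrees to the right.
print(relative_azimuth_deg((0.0, 0.0), 0.0, (10.0, 0.0)))   # 90.0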
    Categorization
    Type: AR, VR
    Degrees of Freedom: 3DoF to 6DoF
    Delivery: Local, Streaming, Interactive, Conversational, Group Communication
    Device: 5G AR Glasses/Helmet, VR camera/microphone, Audio stereo headset, 5G accurate positioning
    Preconditions
  • AR 5G Glasses/Helmet
  • VR camera and microphone capture
  • 5G connectivity and positioning
  • Real time communication
  • One or more drones relaying 360 VR video, hyper-sensorial data, and enabling XR haptics
    Requirements and QoS/QoE Considerations
  • Accurate user location (indoor/outdoor)
  • Low latency
  • High bandwidth
    Feasibility
  • There are a few devices available today that target some of the requirements described in this use case, e.g. "HUD 3.0", a military HMD that projects critical data into the soldier's field of view (https://www.popularmechanics.com/military/a19635016/us-troops-to-test-augmented-reality-by-2019/). The newly announced Microsoft HoloLens 2, with more advanced technology for AR applications and better rendering quality, makes it easy to create a proof of concept for this use case (ignoring the form factor and security requirements of a police helmet). The HoloLens 2 features are described here: https://pureinfotech.com/microsoft-hololens-2-tech-specs/. The device can use WiFi connectivity to connect to a 5G device. A VR camera can easily be mounted for a proof of concept.
    Potential Standardization Status and Needs
  • 5G connectivity with dedicated slices for high resilience on critical communications
  • 5G positioning
  • MTSI/MCPTT SWB/FB voice communication
  • MTSI/FLUS uplink 3D audio
  • MTSI/FLUS uplink VR
  • Downlink AR video with overlaid graphics with local/cloud computation and rendering
  • Downlink AR audio with mixed-in 3D audio objects with local/cloud computation and rendering

    A.11  Use Case 10: Online shopping from a catalogue - downloading

    Use Case Description: Online shopping from a catalogue - downloading
    In order to purchase a new sofa for his living room, John connects to an online shop offering the ability to virtually insert items into his home. For each product on sale, the online shop provides 2D images, 3D object models and detailed information on size, colour and materials.
    John chooses his favourite sofa from the item list via the shop application on his smartphone or tablet.
    Option 1: John is only equipped with a smartphone.
    The sofa is added to his living room view on his smartphone, thanks to the device's onboard camera and depth sensor. John can then try different locations in the living room and select the colour that best fits his home (a minimal sketch of this depth-based placement is given after Option 2).
    Option 2: John is also equipped with a pair of AR glasses
    When connected to the online store via his smartphone, John also connects his AR glasses to his smartphone. The sofa is then rendered on his AR glasses, and John continues to use his smartphone to control the location of the sofa within the living room.
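For illustration, the following is a minimal sketch (in Python) of the depth-based placement used in Option 1: the tapped pixel and its sensed depth are back-projected through the smartphone camera intrinsics into a world-space anchor where the sofa model is placed. AR frameworks perform such hit tests internally; the function and parameter names here are illustrative assumptions.

import numpy as np

def place_anchor(u, v, depth_m, fx, fy, cx, cy, cam_to_world):
    # u, v: tapped pixel; depth_m: depth sensed at that pixel.
    # fx, fy, cx, cy: pinhole intrinsics of the smartphone camera.
    # cam_to_world: 4x4 pose of the camera in the AR tracking (world) frame.
    x = (u - cx) * depth_m / fx          # pixel + depth -> camera coordinates
    y = (v - cy) * depth_m / fy
    p_cam = np.array([x, y, depth_m, 1.0])
    return (cam_to_world @ p_cam)[:3]    # camera -> world coordinates

# Example: tap at the image centre, floor sensed 2.5 m away,
# camera at the world origin looking straight ahead.
anchor = place_anchor(960, 540, 2.5, fx=1400, fy=1400, cx=960, cy=540,
                      cam_to_world=np.eye(4))
print(anchor)   # [0.  0.  2.5]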
    Categorization
    Type: AR
    Degrees of Freedom: 6DoF
    Delivery: Download
    Device: AR Glasses, Rendering system, Tablet (or smartphone), Capture device
    PreCondition
    Tablet (or smartphone) with the following features
  • 4G/5G connectivity
  • 3D capture capabilities with depth capturing
  • rendering of an overlaid 3D model in the captured scene/video
    Capture device (video and depth camera).
    AR glasses with connectivity to the tablet/smartphone.
    Application with 3D model representations of the items on sale.
    QoS and QoE considerations
    QoS:
  • Accurate and low latency rendering
  • Reliable and fast download of the 3D model to be rendered.
    QoE:
  • Fast and accurate rendering of the 3D objects of items (such as proper lighting and reflectance in AR scenes)
  • Accurate placement of the 3D object in AR scene.
  • Less heterogeneity through AR glasses
    Feasibility
    AR services of furniture planning are already available. For example,
  • IKEA™: https://www.youtube.com/watch?v=vDNzTasuYEw
  • Amazon™: https://www.amazon.com/adlp/arview
    In such applications, the chosen item can be placed in the AR scene. Therefore, it would be possible to present the item through AR glasses if 3D model information is available. The rendering device is capable of rendering a 3D object in the captured scene or in the field of view of the user's AR glasses.
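For illustration, the following is a minimal sketch (in Python) of the download-and-compose flow: a catalogue item is fetched as a point cloud and translated to the anchor chosen by the user. It assumes the shop exposes items as small ASCII PLY files at a known URL; the URL, format choice and function names are illustrative assumptions.

import urllib.request
import numpy as np

def fetch_ascii_ply_vertices(url):
    # Minimal parser for illustration: assumes an ASCII PLY file whose first
    # element is the vertex list and whose first three properties are x, y, z.
    lines = urllib.request.urlopen(url).read().decode("ascii").splitlines()
    n_vertices, body_start = 0, 0
    for i, line in enumerate(lines):
        if line.startswith("element vertex"):
            n_vertices = int(line.split()[-1])
        if line.strip() == "end_header":
            body_start = i + 1
            break
    rows = [list(map(float, lines[body_start + k].split()[:3]))
            for k in range(n_vertices)]
    return np.array(rows)

def compose_into_scene(points, anchor_xyz):
    # Place the downloaded object at the anchor selected in the AR scene.
    placement = np.eye(4)
    placement[:3, 3] = anchor_xyz
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (homogeneous @ placement.T)[:, :3]

# Hypothetical catalogue URL; a real shop would expose its own asset API.
# sofa = fetch_ascii_ply_vertices("https://shop.example.com/assets/sofa.ply")
# scene_points = compose_into_scene(sofa, anchor_xyz=[0.4, 0.0, 2.5])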
    Potential Standardization Status and Needs
    The following aspects may require standardization work:
  • Standardized format for 3D object such as point clouds
  • Delivery protocols for 3D object
  • Decoding, rendering, composition API for 3D object in AR scene

    A.12  Use Case 11: Real-time communication with the shop assistant

    Use Case Description: Real-time communication with the shop assistant
    In addition to the above use case for online shopping from a catalogue, a remote assistant is available for the products on sale. John can seek advice from the online shop assistant on which sofa colour better matches the living room.
    John chooses his favourite sofa from the item list via the shop application on his smartphone or tablet and can add a 3D representation of the sofa into his living room scene captured by the camera. John can try different locations in the living room and select the colour that best fits his home.
    John captures the AR scene with the 3D representation of the sofa in his living room, and the captured scene of the living room is transmitted in real time to the online assistant, who can make suggestions to John.
    Use case extension:
    The shop assistant is able to place virtual furniture, e.g. a lamp, into John's captured scene in real time and transmit it back, suggesting that John also buy a lamp that nicely fits with the rest of the living room.
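For illustration, the following is a minimal sketch (in Python) of how such a shared placement could be exchanged: the assistant sends a small scene-update message over the real-time session and both ends apply it to their local copy of the scene, keeping user and assistant synchronized. The message fields and names are illustrative assumptions, not a defined 3GPP format.

import json
from dataclasses import dataclass, asdict

@dataclass
class ScenePlacementUpdate:
    # One shared-scene update: "place catalogue item X at pose Y".
    item_id: str         # catalogue reference, e.g. the suggested lamp
    position: tuple      # (x, y, z) in John's AR tracking frame
    rotation_deg: float  # rotation about the vertical axis
    sender: str          # "assistant" or "customer"

def encode(update):
    return json.dumps(asdict(update)).encode("utf-8")

def apply_update(scene, payload):
    # John's client decodes the message and updates its local scene graph.
    update = json.loads(payload.decode("utf-8"))
    scene[update["item_id"]] = (update["position"], update["rotation_deg"])
    return scene

# The assistant suggests a lamp next to the sofa; both ends stay in sync
# because they apply the same update to their copies of the scene.
msg = encode(ScenePlacementUpdate("lamp-042", (0.9, 0.0, 2.3), 30.0, "assistant"))
print(apply_update({}, msg))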
    Categorization
    Type: AR
    Degrees of Freedom: 6DoF
    Delivery: Interactive, Conversational
    Device: AR Glasses, Rendering system, Tablet (or smartphone), Audio headset
    PreCondition
    AR glasses equipped or connected with a capture device (depth camera), a positioning system and a rendering system. The capture device supports saving the captured scenes in point cloud format.
    Tablet (or smartphone) with 4G/5G connection
    Headset (headphones with embedded microphones) is used for conversation.
    The online shopping mall provides all of its items in point cloud format.
    QoS and QoE considerations
    QoS:
  • Given sufficient bandwidth, the user and the assistant should be able to transmit and receive the scene and the voice streams simultaneously (if necessary, a simultaneous instant-messaging service should be possible). For HD video quality, at least 1 Mbit/s is needed.
  • Conversational QoS requirements
    QoE:
  • No disconnection or interruption in the middle of the conversation between the user and the assistant, even while the captured scenes are being shared.
  • high-quality AR scene with accurate placement and rendering of 3D object in real environment
  • Synchronized AR scene between user and assistant
    Feasibility
  • Real-time AR communication or assistance is already available, for example:
  • https://www.youtube.com/watch?v=GFhpAe10qnk9 (Live remote support with 3D annotation to the Microsoft™ HoloLens™)
  • In this application, field technicians can use the Microsoft™ HoloLens™ to connect to a remote expert with unprecedented clarity of communication, as well as receive assistance and perform tasks with unmatched speed and accuracy
  • https://chalk.vuforia.com/ (Vuforia™ chalk)
    It provides a remote guidance and collaboration app designed for technicians and experts to communicate more effectively and solve problems.
    Potential Standardization Status and Needs
    The following aspects may require standardization work:
  • Coded representations of AR scene and delivery in MTSI context
  • MTSI/FLUS uplink AR video
  • Downlink AR video with local/cloud computation and rendering
  • MTSI regular audio between John and assistant
