TR 26.928 (Word version 17.0.0)

A.22  Use Case 21: Immersive 6DoF Streaming with Social Interaction

Use Case Description: Immersive 6DoF Streaming with Social Interaction
In an extension to use case 3 in clause 6.4, in which Alice consumes the game in live mode, Alice is now integrated into social interaction:
  • She virtually watches the game with friends who are geographically distributed and whose avatars sit in the stadium next to her. She has voice conversations with those friends while watching the game.
  • While she moves through the stadium to another location, she makes friends with other people watching the same game in the virtual environment.
  • She gets contextually relevant Twitter feeds overlaid on the scene.
    Categorization
    Type: VR and Social VR
    Degrees of Freedom: 3DoF+, 6DoF
    Delivery: Streaming, Split, Conversational, Interactive
    Device: HMD with a controller
    Preconditions
  • An application is installed that permits consumption of the scene
  • The application uses existing HW capabilities on the device, including A/V decoders and rendering functionalities, as well as sensors. Inside-out tracking is available.
  • Media is captured properly and is accessible on cloud storage through HTTP access
  • One or multiple communication channels across users can be set up
    Requirements and QoS/QoE Considerations
  • Same as the use case in clause 6.3. In addition, the following applies:
  • Required QoS:
  • Sufficiently low latency for the communication channel
  • Required QoE:
  • Sufficiently low communication latency
  • Synchronization of user communication with the action
  • Synchronized and context-aware Twitter feeds (a synchronization sketch follows this list)
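    A minimal sketch (in Python) of keeping context-aware feed items synchronized with the media timeline, assuming the player exposes its current presentation timestamp; the FeedItem and OverlayScheduler names are hypothetical and not defined by this report:

      from dataclasses import dataclass

      @dataclass
      class FeedItem:
          text: str
          event_time: float  # seconds on the content timeline the item refers to

      class OverlayScheduler:
          def __init__(self, max_skew: float = 0.5):
              self.pending = []         # feed items not yet shown
              self.max_skew = max_skew  # tolerated offset vs. playout (seconds)

          def submit(self, item: FeedItem) -> None:
              self.pending.append(item)
              self.pending.sort(key=lambda i: i.event_time)

          def due_items(self, playout_time: float):
              """Return items whose event time has been reached by playout."""
              due = [i for i in self.pending
                     if i.event_time <= playout_time + self.max_skew]
              self.pending = [i for i in self.pending if i not in due]
              return due

    A real client would additionally compensate for the end-to-end delivery delay of a live stream, so that a feed item does not reveal an event before it is rendered.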
    Feasibility
    See use case 3 in clause A.4.
    The addition of social aspects can be addressed by apps.
    Some discussion on this matter:
  • https://www.roadtovr.com/nextvr-latest-tech-is-bringing-new-levels-of-fidelity-to-vr-video/ (see the second page); however, no details have been publicly announced.
    Social VR is used in different contexts. See for example:
  • https://www.juegostudio.com/infographic/various-social-vr-platforms
  • https://www.g2crowd.com/categories/vr-social-platforms
    Some example applications are provided:
  • Facebook Spaces™
  • https://www.facebook.com/spaces
  • VRChat
  • https://www.vrchat.net/
  • https://en.wikipedia.org/wiki/VRChat
  • https://youtu.be/5cpElonP33k
  • Oculus Venues™
  • https://www.engadget.com/2018/05/30/oculus-venues-hands-on
  • https://www.esquireme.com/oculus-headset-will-let-you-watch-live-sport-in-virtual-reality
    Optimizations can be done by integrating social A/V with main content (rendering, blending, overlay).
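    As an illustration, a minimal compositing sketch (Python/NumPy) that alpha-blends a social A/V tile into the main rendered frame; the function name and parameters are assumptions for this sketch, and the overlay is assumed to fit inside the frame:

      import numpy as np

      def blend_overlay(frame: np.ndarray, overlay_rgba: np.ndarray,
                        x: int, y: int) -> np.ndarray:
          """Alpha-blend an RGBA overlay (e.g. a friend's video tile) at (x, y)."""
          h, w = overlay_rgba.shape[:2]
          region = frame[y:y+h, x:x+w].astype(np.float32)
          rgb = overlay_rgba[..., :3].astype(np.float32)
          alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
          frame[y:y+h, x:x+w] = (alpha * rgb
                                 + (1.0 - alpha) * region).astype(frame.dtype)
          return frame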
    Additional pointers to deployments and use cases:
  • https://www.nextvr.com/nextvr-gets-social-with-oculus-venues-now-fans-can-enjoy-live-vr-experiences-together/
  • https://www.oculus.com/blog/go-behind-the-scenes-of-the-oc5-oculus-venues-livestream-with-supersphere/?locale=en_US
  • Verizon presentation at XR Workshop
  • Virtual Live Events w/Friends
  • Virtually attend live events with friends in 4K/8K 360° 3D video (aka 'VR')
  • Technical Requirements
  • 4K, 8K+ (6DoF) real-time (volumetric) streaming, immersive 360° video (stereoscopic, 90+ FPS) → MEC for video stitching is optional for 4K
  • Directional audio, user point of view → For real time chat, selectable viewpoints
  • Integrated Videos and Communications → RCS-based communication, supports delivery to all deployed smartphones as well as VR devices
    Potential Challenges:
  • Quality of avatars
  • Synchronization of scene
  • Quality of interactions
    Potential Standardization Status and Needs
    The following aspects may require standardization work:
  • Same as the use case in clause 6.3
  • Social VR component and conversation
  • Synchronized playout across users in the same room

A.23  Use Case 22: 5G Online Gaming party

    Use Case Description: 5G Online Gaming party
    In an extension to use case 5 in Annex A.6 on the Online Immersive Gaming experience, the users join a gaming party, either physically or virtually, in order to get a maximized and controlled user experience. There are two setups for the party:
  • The friends connect to a common server through 5G that provides managed resources and access guarantees to meet their high-demand requirements for gaming.
  • The friends meet physically and connect to an infrastructure using a wireless 5G connection. The setup explores all options, including connecting to a centralized infrastructure, but also possibly connecting HMDs using device-to-device communication.
    The experience is improved and, in particular, far more consistent than the best-effort connections the users were used to before.
    A similar use case presented during the 2nd XR Workshop is referred to as "City-wide multiplayer, immersive AR gaming action/adventure experience":
  • User enters an outdoor geo-fenced space including parks & other activation sites for an AR gaming play experience.
  • Once inside the geolocation, the user opens the app on a 5G phone & goes through local matchmaking to join with other nearby players for a co-operative experience (a matchmaking sketch follows this list).
  • Players use AR wayfinding to head to the first dead drop.
  • User scans the environment using an AR Lens to uncover the first clue and works alongside other players to solve an AR puzzle and unlock the next level.
  • The winners of the battle unlock AR Wayfinding for the next location and the next battle.
  • At the final location, the remaining users confront the final opponent and play an AR combat mini-game to defeat him and unlock exclusive content.
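    A minimal sketch (Python) of the geofencing and local-matchmaking step, assuming the app knows the players' GPS positions; the function names, the 200 m radius, and the party size are hypothetical:

      import math

      EARTH_RADIUS_M = 6_371_000

      def haversine_m(lat1, lon1, lat2, lon2):
          """Great-circle distance in metres between two WGS84 points."""
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dp = math.radians(lat2 - lat1)
          dl = math.radians(lon2 - lon1)
          a = (math.sin(dp / 2) ** 2
               + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
          return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

      def inside_geofence(pos, fence_center, fence_radius_m):
          """True if a (lat, lon) position lies inside the activation site."""
          return haversine_m(*pos, *fence_center) <= fence_radius_m

      def match_nearby(players, me, radius_m=200, party_size=4):
          """Group the closest players within radius_m into one co-op party."""
          nearby = sorted(
              (p for p in players if haversine_m(*p["pos"], *me) <= radius_m),
              key=lambda p: haversine_m(*p["pos"], *me))
          return nearby[:party_size]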
    Categorization
    Type: VR, AR
    Degrees of Freedom: 6DoF
    Delivery: Streaming, Interactive, Split, device-to-device
    Device: HMD with a Gaming controller, AR glasses
    Preconditions
  • A gaming client is installed that permits consumption of the game
  • The application uses existing HW capabilities on the device, including game engines and rendering functionalities, as well as sensors. Inside-out tracking is available.
  • Connectivity to the network is provided.
  • Connectivity can be managed properly
  • Devices may connect using device-to-device communication
  • Wayfinding and SLAM are provided to locate and map to the venue in case of AR
  • AR and AI functionalities are provided, for example for Image & Object Recognition, XR Lighting, Occlusion Avoidance, and Shared Persistence
    Requirements and QoS/QoE Considerations
    The requirements are similar to those discussed in use case 6.25.
    Feasibility
    Feasibility follows the previous discussions. However, a 5G core architecture that provides such functionalities would be needed. In addition, authentication for such "5G parties" is needed.
    Potential Standardization Status and Needs
    The following aspects may require standardization work:
  • Network conditions that fulfill the QoS and QoE Requirements
  • Content Delivery Protocols
  • Decoding, rendering and sensor APIs
  • Architectures for computing support in the network
  • TR 22.842 provides a gap analysis in clause 5.3.6 that is in line with these needs
  • Authentication to such groups
  • Possible support for device-to-device communication

A.24  Use Case 23: 5G Shared Spatial Data

    Use Case Description: Shared Spatial Data
    Consider as an example people moving through Heathrow airport. The environment is supported by spatial map sharing, spatial anchors, and downloading/streaming of location-based digital content. The airport is a huge dynamic environment in which thousands of people congregate. Spatial maps and content will change frequently. Whereas base maps have been produced by professional scanners, they are continuously updated and improved by crowd-sourced data. Semi-dynamic landmarks such as a growing tree, a new park bench, or holiday decorations are incorporated into the base map via crowd-sourced data. Based on this, individuals have their own maps, and portions of those maps may be shared with friends nearby. One could imagine that spatial content will consume as much bandwidth as permitted, be it a high-resolution volumetric marketing gimmick, such as a Concorde virtually landing at Heathrow, or a simple overlay outside a lounge showing the current wait time for getting access.
    As people walk through 1 km+ spaces like the airport, they will progressively download updates and discard map information that is no longer relevant. Similar to the data flows in Google Maps, smartphones continually send location and 3D positioning data (GPS, WiFi, scans, etc.) to the cloud in order to improve and augment 3D information. AR maps and content will in all likelihood be similarly layered, dynamic, and progressively downloaded. Spatial AR maps will be a mixture of underlying living spatial maps and digital content items.
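    A minimal sketch (Python) of such progressive map handling: tiles near the user are prefetched and tiles out of range are discarded. The tile size, the keep radius, and the fetch_tile() callback are illustrative assumptions:

      TILE_SIZE_M = 50  # hypothetical tile edge length in metres

      def tile_of(x_m: float, y_m: float):
          """Map a local position (metres) to its tile index."""
          return (int(x_m // TILE_SIZE_M), int(y_m // TILE_SIZE_M))

      class SpatialMapCache:
          def __init__(self, fetch_tile, keep_radius_tiles=3):
              self.fetch_tile = fetch_tile  # downloads one map tile (assumed)
              self.keep = keep_radius_tiles
              self.tiles = {}               # (ix, iy) -> tile payload

          def update(self, x_m: float, y_m: float):
              """Prefetch tiles around the user, drop tiles out of range."""
              cx, cy = tile_of(x_m, y_m)
              wanted = {(cx + dx, cy + dy)
                        for dx in range(-self.keep, self.keep + 1)
                        for dy in range(-self.keep, self.keep + 1)}
              for key in wanted - self.tiles.keys():
                  self.tiles[key] = self.fetch_tile(key)  # progressive download
              for key in set(self.tiles) - wanted:
                  del self.tiles[key]                     # no longer relevant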
    The use case addresses several scenarios:
  • Co-located people wearing an XR HMD collaboratively interact with a detailed 3D virtual model, each from their own perspective, in a shared coordinate system (using a shared map); a pose-composition sketch follows this list.
  • One person wearing an XR HMD places virtual objects at locations in 3D space for later discovery by others wearing an XR HMD. This requires a shared map and shared digital assets.
  • XR clients continuously send sensing data to a cloud service. The service constructs a detailed and timely map from client contributions and provides the map back to clients.
  • An XR HMD receives a detailed reconstruction of a space, potentially captured by one or more devices with superior sensing and processing capabilities.
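    The first two scenarios rest on resolving a shared spatial anchor in each client's local coordinate frame. A minimal sketch (Python/NumPy) of the pose composition involved, assuming 4x4 homogeneous transforms; the visual relocalization that resolves the anchor is out of scope here:

      import numpy as np

      def place_in_local_frame(T_local_anchor: np.ndarray,
                               T_anchor_object: np.ndarray) -> np.ndarray:
          """Compose anchor pose (local frame) with object pose (anchor frame)."""
          return T_local_anchor @ T_anchor_object

      # Client A authors an object 1 m in front of the shared anchor:
      T_anchor_object = np.eye(4)
      T_anchor_object[2, 3] = 1.0

      # Client B has resolved the same anchor at some pose in its own map:
      T_localB_anchor = np.eye(4)
      T_localB_anchor[0, 3] = 3.0  # e.g. anchor sits 3 m to B's right

      # Both clients now render the object at a consistent physical location:
      T_localB_object = place_in_local_frame(T_localB_anchor, T_anchor_object)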
    Categorization
    Type: AR
    Degrees of Freedom: 6DoF
    Delivery: Streaming, Interactive, Split, device-to-device, different types
    Device: HMD, AR Glasses
    Preconditions
  • An application is installed on an HMD or on a phone with connected AR glasses
  • The application uses existing HW capabilities on the device, rendering functionalities as well as sensors. Inside-out tracking is available. A global positioning system for anchoring is also available.
  • Connectivity to the network is provided.
  • Wayfinding and SLAM are provided to locate and map in case of AR
  • AR and AI functionalities are provided, for example for Image & Object Recognition, XR Lighting, Occlusion Avoidance, and Shared Persistence
    Requirements and QoS/QoE Considerations
    5G's low-latency high-bandwidth capabilities, as compared to 4G's capabilities, make 5G better suited for sending dense spatial data and associated 3D digital assets over a mobile network to XR clients.
    This data could be transferred as discrete data downloads or streamed and may be lossy or lossless.
    Continuous connectivity is important for sharing local information to improve the maps.
    The underlying AR maps should be accurate and should be up to date.
    The content objects should be realistic.
    The data representation for the AR maps and the content objects is scalable.
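    A minimal sketch (Python) of what a layered, scalable AR-map representation could look like: a professionally scanned base layer plus crowd-sourced update layers, retrievable up to the level of detail the link can afford. The field names are illustrative assumptions, not a defined format:

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class MapLayer:
          layer_id: str
          lod: int          # level of detail, higher = finer
          lossless: bool    # geometry typically lossless, textures may be lossy
          payload: bytes = b""

      @dataclass
      class SpatialMap:
          base: MapLayer                                          # professional base scan
          updates: List[MapLayer] = field(default_factory=list)   # crowd-sourced

          def layers_for_bandwidth(self, max_lod: int):
              """Scalable access: return only layers up to the affordable LoD."""
              return [l for l in [self.base, *self.updates] if l.lod <= max_lod]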
    Feasibility
    The scenarios above are supported by existing services and demonstrations:
  • Co-located interaction in a shared coordinate system (using a shared map): Microsoft Spatial Anchors: https://azure.microsoft.com/en-us/services/spatial-anchors/
  • Placing virtual objects for later discovery by others, based on a shared map and shared digital assets: Google Shared AR Experiences with Cloud Anchors: https://developers.google.com/ar/develop/java/cloud-anchors/overview-android
  • Cloud construction of a detailed and timely map from continuously contributed client sensing data: Google Visual Positioning Service: https://www.roadtovr.com/googles-visual-positioning-service-announced-tango-ar-platform/
  • Detailed reconstruction of a space captured by devices with superior sensing and processing capabilities: Drivenet Maps, open-data real-time road maps for autonomous driving from 3D LIDAR point clouds: https://sdi4apps.eu/2016/03/drivenet-maps-open-data-real-time-road-maps-for-autonomous-driving-from-3d-lidar-point-clouds/; a navigation example is also given in the MPEG-I use case document for point cloud compression (w16331, section 2.6).
    Potential Standardization Status and Needs
    The following aspects may require standardization work:
  • Data representations for AR maps
  • Collected sensor data to be streamed upstream
  • Scalable streaming and storage formats for AR maps
  • Content delivery protocols to access AR maps and content items
  • Network conditions that fulfill the QoS and QoE requirements
