3GPP TR 26.998, version 18.1.0

6.6.4  Generic call flow

Figure 6.6.4-1 illustrates the call flow for an immersive AR conversational experience for a receiving EDGAR UE. Only one sender is shown in the diagram, and its detailed call flow is not shown.
Figure 6.6.4-1: Shared AR conversational experience call flow for a receiving EDGAR UE
Procedures:
Step 1.
Session Establishment:
  1. The AR/MR Application requests to start a session through the EDGE.
  2. The EDGE negotiates with the Scene Graph Compositor (SGC) and the sender UE to establish the session.
  3. The EDGE acknowledges the session establishment to the UE (see the sketch below).
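The Step 1 exchange can be pictured as a simple offer/answer negotiation between the EDGE, the SGC and the sender UE. The Python sketch below is illustrative only: the class and method names (EdgeServer, SceneGraphCompositor, negotiate, establish_session) and the contents of the offer are assumptions for this example, not interfaces defined in TR 26.998.

from dataclasses import dataclass

@dataclass
class SessionOffer:
    participant_id: str
    media_formats: list     # codecs/formats the EDGAR UE can send and receive
    ar_data_formats: list   # e.g. pose, spatial anchors

class SceneGraphCompositor:
    def negotiate(self, offer: SessionOffer) -> dict:
        # The SGC accepts the offer and reserves a shared scene session.
        return {"scene_session": f"scene-{offer.participant_id}", "accepted": True}

class SenderUE:
    def negotiate(self, offer: SessionOffer) -> dict:
        # The remote sender agrees on the formats it will stream up.
        return {"media_formats": offer.media_formats[:1], "accepted": True}

class EdgeServer:
    def __init__(self, sgc: SceneGraphCompositor, sender: SenderUE):
        self.sgc, self.sender = sgc, sender

    def establish_session(self, offer: SessionOffer) -> dict:
        sgc_answer = self.sgc.negotiate(offer)        # Step 1.2: negotiate with the SGC ...
        sender_answer = self.sender.negotiate(offer)  # ... and with the sender UE
        # Step 1.3: acknowledge the session establishment back to the UE.
        return {"ack": sgc_answer["accepted"] and sender_answer["accepted"],
                "scene_session": sgc_answer["scene_session"],
                "media_formats": sender_answer["media_formats"]}

# Step 1.1: the AR/MR Application requests a session through the EDGE.
edge = EdgeServer(SceneGraphCompositor(), SenderUE())
print(edge.establish_session(SessionOffer("edgar-ue-1", ["h264"], ["pose"])))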
Step 2.
Media pipeline configuration:
  1. MAF configures its pipelines.
  2. EDGE configures its pipelines (see the sketch below).
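Step 2 amounts to each entity instantiating the media pipelines it will use for the rest of the session. The sketch below is a rough illustration under assumed pipeline descriptions; the dictionaries and the configure_pipelines() helper are placeholders, not configuration formats from the TR.

MAF_PIPELINES = {
    "uplink":   {"input": "AR Runtime capture", "process": "encode", "output": "EDGE"},
    "downlink": {"input": "pre-rendered streams from EDGE", "process": "decode",
                 "output": "Scene Manager"},
}

EDGE_PIPELINES = {
    "uplink":   {"input": "UE media and AR data", "output": "SGC / Media Delivery"},
    "downlink": {"input": "composited scene and media", "process": "partial rendering",
                 "output": "simplified scene and pre-rendered media"},
}

def configure_pipelines(entity: str, pipelines: dict) -> None:
    # In a real system each entry would translate into codec, transport and buffer setup.
    for direction, description in pipelines.items():
        print(f"{entity}: {direction} pipeline configured: {description}")

configure_pipelines("MAF", MAF_PIPELINES)    # Step 2.1
configure_pipelines("EDGE", EDGE_PIPELINES)  # Step 2.2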
Step 3.
The AR/MR Application requests the start of the session.
Steps 4, 5, 6, and 7 are loops that run in parallel:
Step 4.
AR uplink loop:
  1. The AR Runtime sends the AR data to the AR/MR Application.
  2. The AR/MR Application processes the data and sends it to the MAF.
  3. The MAF streams the AR data up to the EDGE (see the sketch below).
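The Step 4 loop is a continuous sample-process-send cycle. The following sketch uses hypothetical names (ARData, ar_runtime_sample, application_process, maf_stream_up) to stand in for the AR Runtime, AR/MR Application, MAF and EDGE path; the actual data formats and transport are out of scope here.

import time
from dataclasses import dataclass

@dataclass
class ARData:
    timestamp: float
    pose: tuple    # position and orientation of the EDGAR UE
    anchors: list  # detected spatial anchors

def ar_runtime_sample() -> ARData:
    # Step 4.1: the AR Runtime produces the latest AR data.
    return ARData(time.time(), pose=(0.0, 1.6, 0.0, 0.0, 0.0, 0.0, 1.0), anchors=[])

def application_process(data: ARData) -> ARData:
    # Step 4.2: the AR/MR Application processes the data before handing it to the MAF.
    return data

def maf_stream_up(data: ARData) -> None:
    # Step 4.3: the MAF streams the AR data up to the EDGE (transport omitted).
    print("uplink AR data:", data)

for _ in range(3):  # stand-in for the continuous uplink loop
    maf_stream_up(application_process(ar_runtime_sample()))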
Step 5.
Shared experience loop:
  1. In parallel to 9, the sender UE streams its media up to Media Delivery (MD).
  2. The sender UE streams its AR data up to the SGC.
  3. Using the AR data from the various participants, the SGC creates the composited scene.
  4. The composited scene is delivered to the EDGE.
  5. The media streams are delivered to the EDGE (see the sketch below).
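Step 5 can be reduced to the SGC merging the AR data received from all participants into one composited scene. The sketch below only models the composition step (5.3); the node layout, compose_scene() and the pose values are assumptions for illustration.

def compose_scene(participant_ar_data: dict) -> dict:
    # Step 5.3: build a single scene graph with one node per participant,
    # placed according to the AR data each participant streamed up.
    return {
        "nodes": [
            {"participant": pid, "pose": data["pose"], "media": f"stream-{pid}"}
            for pid, data in participant_ar_data.items()
        ]
    }

# Steps 5.1/5.2: each sender UE has streamed its media up to Media Delivery
# and its AR data up to the SGC; only the AR data is needed to composite.
ar_data = {
    "sender-ue": {"pose": (1.0, 0.0, -2.0)},
    "edgar-ue-1": {"pose": (0.0, 0.0, 0.0)},
}
scene = compose_scene(ar_data)
# Steps 5.4/5.5: the composited scene and the media streams are then delivered
# to the EDGE (delivery itself is not modelled here).
print(scene)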
Step 6.
Media uplink loop:
  1. The AR Runtime captures the media components and processes them.
  2. The AR Runtime sends the media data to the MAF.
  3. The MAF encodes the media.
  4. The MAF streams the media up to the EDGE (see the sketch below).
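The Step 6 loop mirrors the AR uplink loop, but for captured media. capture_frame(), encode() and stream_up() below are placeholders; in particular the "encoding" is only a byte concatenation, not a real codec.

def capture_frame() -> dict:
    # Step 6.1: the AR Runtime captures and pre-processes a media frame.
    return {"video": b"\x00" * 16, "audio": b"\x00" * 4}

def encode(frame: dict) -> bytes:
    # Step 6.3: the MAF encodes the media (a real MAF would use video and
    # audio encoders; here the payload is simply concatenated).
    return frame["video"] + frame["audio"]

def stream_up(payload: bytes) -> None:
    # Step 6.4: the MAF streams the encoded media up to the EDGE.
    print(f"uplink media packet, {len(payload)} bytes")

for _ in range(3):                # stand-in for the continuous uplink loop
    frame = capture_frame()       # Step 6.1
    stream_up(encode(frame))      # Steps 6.2-6.4 (hand-off from AR Runtime to MAF implied)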
Step 7.
Media downlink loop:
  1. The EDGE parses the scene description and the media components, partially renders the scene, and creates a simplified scene description together with the corresponding pre-rendered media components.
  2. The simplified scene is delivered to the Media Client and Scene Manager.
  3. Media stream loop:
    1. The pre-rendered media components are streamed to the MAF.
    2. The MAF decodes the media streams.
    3. The Scene Manager parses the simplified scene description and composes the scene.
    4. The AR Runtime, after correcting the pose, renders the immersive scene, including the registration of the AR content into the real world (see the sketch below).
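Step 7 is the split-rendering half of the call flow: the EDGE pre-renders against the last reported pose, and the UE applies a late pose correction before registering the content into the real world. The sketch below compresses this into two placeholder functions; all names, and the content of the "pre-rendered" media, are assumptions for illustration.

def edge_partial_render(full_scene: dict, last_pose: tuple) -> tuple:
    # Step 7.1: the EDGE parses the full scene, pre-renders the heavy parts for
    # the last reported pose, and produces a simplified scene description.
    simplified_scene = {"nodes": [node["participant"] for node in full_scene["nodes"]]}
    prerendered_media = {"eye_buffers": f"rendered-for-{last_pose}"}
    return simplified_scene, prerendered_media

def ue_compose_and_render(simplified_scene: dict, media: dict, current_pose: tuple) -> dict:
    # Steps 7.3.2-7.3.4: the MAF decodes the streams, the Scene Manager parses the
    # simplified scene and composes it, and the AR Runtime applies a late pose
    # correction before registering the AR content into the real world.
    corrected = {"media": media, "reprojected_to": current_pose}
    return {"scene": simplified_scene, "frame": corrected}

full_scene = {"nodes": [{"participant": "sender-ue"}]}
scene, media = edge_partial_render(full_scene, last_pose=(0.0, 0.0, 0.0))    # Step 7.1
frame = ue_compose_and_render(scene, media, current_pose=(0.01, 0.0, 0.0))   # Steps 7.2-7.3
print(frame)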