S4-260261 - AI Summary

[FS_Q4RTC_MED] Application scenario: Conference using QUIC-based media protocols for RTC


Summary of S4-260261: Conference Application Scenario for QUIC-based RTC

Document Information

  • Source: InterDigital
  • Title: Application scenario: Conference using QUIC-based media protocols for RTC
  • Specification: 3GPP TR 26.836 v0.0.1
  • Purpose: Discussion and Agreement

Introduction and Objective

This contribution addresses the Study on QUIC-based media delivery for real-time communication and services (FS_Q4RTC_MED). The focus is on identifying and documenting relevant application scenarios for evaluating QUIC-based media delivery protocols, specifically for conference applications in real-time communication services.

Main Technical Contributions

Reference Updates

Addition of a new reference:
- [X.1] 3GPP TR 22.870: "Study on 6G Use Cases and Service Requirements"

Conference Application Scenario (Section 5.2.1.X)

General Description

The contribution defines a conference application scenario enabling multiple UEs (smartphones, tablets, smart glasses) to participate in real-time interactive sessions from web-based or native clients. Key characteristics include:
- Support for audio, video, and haptic media, as well as data sharing (chat, presence, screen-sharing metadata)
- Reliable delivery of control signaling and non-media data
- Prioritization of low latency and session continuity for media delivery
- Support for different browsers and dedicated applications

Architecture 1: Single Output Scenario with Centralized Mixing (Section 5.2.1.X.2)

Architecture characteristics:
- Media streams (audio/video/haptic) from all participants sent to central conferencing server
- Server includes composition function/media mixer
- Mixer combines different input streams into single composite output stream per session
- All UEs receive identical combined/mixed streams
- Control and signaling messages exchanged between UEs and conferencing server

Functional aspects:
- Capability and state exchange between all parties
- Dynamic adaptation by media mixer to changes (resolution, video source)
- Server manages admission of new participants
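The composition function described above can be sketched as follows. This is a minimal illustrative model, not an implementation from the contribution; the names `MediaFrame` and `CentralMixer` and the toy concatenation "mix" are assumptions made purely for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class MediaFrame:
    source_ue: str   # hypothetical participant identifier, e.g. "UE1"
    kind: str        # "audio" | "video" | "haptic"
    payload: bytes

@dataclass
class CentralMixer:
    """Illustrative composition function on the conferencing server:
    combines per-participant input streams into a single composite
    output per session, which every UE then receives identically."""
    inputs: dict = field(default_factory=dict)  # ue_id -> list[MediaFrame]

    def ingest(self, frame: MediaFrame) -> None:
        # Media streams from all participants arrive at the server.
        self.inputs.setdefault(frame.source_ue, []).append(frame)

    def compose(self, kind: str) -> bytes:
        # Toy "mix": concatenate the latest frame of each participant.
        # A real mixer would decode, combine, and re-encode media.
        parts = []
        for ue_id in sorted(self.inputs):
            frames = [f for f in self.inputs[ue_id] if f.kind == kind]
            if frames:
                parts.append(frames[-1].payload)
        return b"|".join(parts)

mixer = CentralMixer()
mixer.ingest(MediaFrame("UE1", "audio", b"a1"))
mixer.ingest(MediaFrame("UE2", "audio", b"a2"))
composite = mixer.compose("audio")  # all UEs receive this same output
```

Dynamic adaptation (e.g. to a resolution change or a new video source) would correspond to the mixer re-running `compose` with updated inputs.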

Architecture 2: Multi-Stream Scenario (Section 5.2.1.X.3)

Architecture characteristics:
- Participants subscribe to audio/video streams published by remote participants
- Central conferencing server manages subscription and publish mechanisms
- Dynamic subscription model: UEs can subscribe to one or more streams, changeable over time
- Composition performed on UE side

Example scenarios:
- UE1 subscribes to all audio and video streams from other UEs
- UE2 subscribes selectively (e.g., video from UE1, audio from UE1/UE3/UE4)
- Dynamic changes supported (e.g., UE2 later subscribes to UE4 video)
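The subscription bookkeeping the server performs in this scenario can be sketched as below. This is an assumed, simplified model of the publish/subscribe mechanism; the class and method names are illustrative and do not come from the contribution.

```python
from collections import defaultdict

class SubscriptionManager:
    """Server-side bookkeeping for the multi-stream scenario:
    tracks which (publisher, media kind) streams each UE wants."""

    def __init__(self):
        self.subs = defaultdict(set)  # subscriber -> {(publisher, kind)}

    def subscribe(self, subscriber: str, publisher: str, kind: str) -> None:
        self.subs[subscriber].add((publisher, kind))

    def unsubscribe(self, subscriber: str, publisher: str, kind: str) -> None:
        self.subs[subscriber].discard((publisher, kind))

    def recipients(self, publisher: str, kind: str) -> set:
        """UEs to which the server should forward a published stream."""
        return {s for s, wants in self.subs.items()
                if (publisher, kind) in wants}

mgr = SubscriptionManager()
# UE2 subscribes selectively, as in the example scenario above:
mgr.subscribe("UE2", "UE1", "video")
for pub in ("UE1", "UE3", "UE4"):
    mgr.subscribe("UE2", pub, "audio")
# Later, a dynamic change: UE2 also subscribes to UE4's video.
mgr.subscribe("UE2", "UE4", "video")
```

Composition of the received streams is then performed on the UE side, not by the server.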

Mapping to TR 22.870 Use Cases (Section 5.2.1.X.4)

The contribution maps the two conference architectures to specific use cases from TR 22.870:

Multi-Stream Scenario Mapping:

  • Holographic telepresence in healthcare (clause 9.8): Separate streams for audio, hologram (avatar), and haptics with tight inter-stream synchronization

Single Output Scenario Mapping:

  • Multi-site immersive communication (clause 9.6): Multi-camera capture with central rendering of global scene, distributed to on-site audiences and remote viewers

Additional Relevant Use Cases:

  1. Immersive gaming (clause 9.2): Conversational media, interactive XR, haptics, synchronized shared state
  2. Seamless immersive reality in education (clause 9.5): Real-time immersive telepresence with conversational latency and tight media synchronization
  3. Collaborative service in multi-site involved immersive communication (clause 9.6): Multi-site immersive communication with stringent latency, synchronization, and uplink requirements
  4. Multiple application media synchronisation (clause 9.7): Ultra-tight real-time, multi-modal communication with millisecond-level cross-flow and cross-device synchronization
  5. Mixed reality gaming (clause 9.9): Interactive real-time mixed reality with low latency and bidirectional data exchange
  6. Personalised interactive immersive guided tour (clause 9.12): Interactive, multi-user real-time immersive communication with personalized content and tight multi-modal synchronization
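Several of these use cases hinge on tight inter-stream synchronization (e.g. audio, hologram, and haptics in clause 9.8, or millisecond-level cross-flow sync in clause 9.7). A minimal sketch of timestamp-based alignment is shown below; the function, the tolerance value, and the sample timestamps are assumptions for illustration only.

```python
def synchronize(streams: dict, tolerance_ms: float = 5.0):
    """Align separate media streams by presentation timestamp.

    streams: {stream_name: [frame_timestamps_ms]}.
    Picks a common target time (the latest instant every stream has
    reached), selects the closest frame per stream, and reports
    whether all selections fall within the sync tolerance.
    """
    target = min(max(ts) for ts in streams.values())
    chosen = {name: min(ts, key=lambda t: abs(t - target))
              for name, ts in streams.items()}
    in_sync = all(abs(t - target) <= tolerance_ms
                  for t in chosen.values())
    return chosen, in_sync

# Hypothetical frame timestamps (ms) for the clause 9.8 stream set:
streams = {"audio": [0, 10, 20, 30],
           "hologram": [0, 16, 32],
           "haptic": [0, 29]}
chosen, in_sync = synchronize(streams)
```

A real system would additionally account for clock offsets between senders and for jitter-buffer depth, which this sketch omits.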

QUIC Protocol Suitability Justification

The contribution provides rationale for QUIC-based transport for immersive use cases based on alignment with TR 22.870 requirements:

Key requirements addressed:
- Low latency
- Bidirectional communication
- High reliability
- Multi-modal traffic support
- Strong security

QUIC protocol advantages:
- Encrypted-by-default communication
- Stream multiplexing without head-of-line blocking
- Robustness to packet loss and network variability
- Built-in standardized congestion control and loss recovery mechanisms

These characteristics make QUIC suitable for interactive and immersive communication application scenarios.
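The multiplexing advantage listed above can be illustrated with a small delivery simulation: with per-stream ordering (as in QUIC), a lost packet on one stream does not delay delivery on another, whereas a single ordered byte stream (as in TCP) would stall everything behind the gap. The function below is an illustrative model, not QUIC itself.

```python
def deliverable(received: dict, next_expected: dict) -> dict:
    """In-order delivery computed independently per stream.

    received: {stream_name: set of sequence numbers received}.
    Frames on one stream are released even while another stream
    waits for a retransmission, i.e. no cross-stream
    head-of-line blocking.
    """
    out = {}
    for stream, seqs in received.items():
        n = next_expected.get(stream, 0)
        delivered = []
        while n in seqs:       # release the contiguous prefix only
            delivered.append(n)
            n += 1
        out[stream] = delivered
    return out

# Stream "video" lost packet 1; stream "audio" is unaffected.
received = {"audio": {0, 1, 2}, "video": {0, 2}}
ready = deliverable(received, {})
# audio is fully delivered; video releases only frame 0 until
# frame 1 is recovered by QUIC's loss-recovery mechanisms.
```

On a single ordered transport, the same loss would hold back the audio frames as well, which is precisely the head-of-line blocking that per-stream multiplexing avoids.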

Document Information

Source: InterDigital Pennsylvania
Type: pCR
For: Agreement
Title: [FS_Q4RTC_MED] Application scenario: Conference using QUIC-based media protocols for RTC
Agenda item: 10.7
Agenda item description: FS_Q4RTC_MED (Study on QUIC-based Media Delivery for Real-time Communication)
Doc type: pCR
For action: Agreement
Release: Rel-20
Specification: 26.836
Version: 0.0.1
Related WIs: FS_Q4RTC_MED
Spec: 26.836
Contact: Srinivas Gudumasu
Uploaded: 2026-02-03T22:28:09.117000
Contact ID: 87955
Revised to: S4-260401
TDoc Status: revised
Reservation date: 2026-02-03 21:50:51
Agenda item sort order: 54