S4-260258 - AI Summary

[FS_Q4RTC_MED] Application scenario: Real-time Peer to Application to Peer communication


Summary of S4-260258: Application Scenario for Real-time Peer to Application to Peer Communication

Document Overview

This contribution to TR 26.836 (Study on QUIC-based media delivery for real-time communication and services) proposes the addition of a Peer to Application to Peer (P2A2P) application scenario for evaluating QUIC-based media delivery protocols in real-time communication services. The document references existing 3GPP specifications (TS 23.228 and TS 26.114) as the basis for this scenario.

Main Technical Contributions

1. Reference Updates

The pCR adds three new references to support the P2A2P scenario:

  • TS 26.506: 5G Real-time Media Communication Architecture (Stage 2)
  • TS 26.113: Real-Time Media Communication; Protocols and APIs
  • TR 22.870: Study on 6G Use Cases and Service Requirements

2. P2A2P Application Scenario Definition (New Clause 5.2.1.1)

2.1 Core Architecture Description

The scenario describes an RTC session where:

  • User A establishes a session to an Application Server (AS) rather than directly to User B
  • A SWAP server (defined in TS 26.113) terminates the SDP offer from User A and initiates a second SDP offer towards User B
  • The AS terminates media from User A and forwards it to User B, acting as an intermediary
  • The session leverages the GA4RTAR service architecture (TS 26.506), in which the AF triggers the appropriate AS based on service logic
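The signalling split described above can be sketched as follows. This is a minimal illustration only: the class and field names (SwapServer, SdpOffer, the addresses) are hypothetical and are not taken from TS 26.113; the point is that the server terminates User A's offer and originates a new, independent offer towards User B rather than forwarding A's offer.

```python
from dataclasses import dataclass

@dataclass
class SdpOffer:
    origin: str        # entity that generated this offer
    media: list        # e.g. ["audio", "video"]
    target: str        # peer the offer is addressed to

class SwapServer:
    """Terminates User A's SDP offer and originates a second offer
    towards User B, placing the AS in the media path (P2A2P)."""

    def __init__(self, as_address: str):
        self.as_address = as_address

    def handle_offer(self, offer: SdpOffer):
        # Leg 1: answer User A's offer with the AS as the media endpoint.
        answer_to_a = self.as_address
        # Leg 2: a *new* offer towards User B, originated by the AS,
        # not a relayed copy of User A's offer.
        offer_to_b = SdpOffer(origin=self.as_address,
                              media=list(offer.media),
                              target="user-b")
        return answer_to_a, offer_to_b

swap = SwapServer(as_address="as.example.net")
answer, second_offer = swap.handle_offer(
    SdpOffer(origin="user-a", media=["audio", "video"],
             target="as.example.net"))
```

Because each leg is an independent offer/answer exchange, the AS can negotiate different codecs or transports on each side, which is what enables the media-aware processing described next.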

2.2 Media Handling Characteristics

The AS is characterized as a media-aware entity that actively processes real-time media streams:

  • Decodes, analyzes, modifies, or regenerates media before forwarding
  • Supports value-added services: call screening, IVR, real-time translation, media mixing, conferencing
  • Must maintain conversational latency bounds per TS 26.113
  • Ensures synchronization between media components (e.g., voice and real-time text)
  • Provides graceful adaptation to packet loss or bandwidth variation
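A minimal sketch of the forwarding behaviour these bullets imply: each frame from User A is processed before being forwarded to User B, subject to a conversational latency budget. The function names and the 150 ms figure are illustrative assumptions, not values taken from TS 26.113.

```python
# Assumed conversational bound for illustration; the actual bound
# follows the service requirements referenced in TS 26.113.
LATENCY_BUDGET_MS = 150

def process_frame(frame: bytes) -> bytes:
    # Placeholder for the decode/analyse/modify/regenerate step
    # (e.g. translation, mixing, call screening); here a pass-through.
    return frame

def forward(frame: bytes, elapsed_ms: float):
    """Drop (rather than delay) frames that exceed the latency budget,
    a simple form of graceful adaptation to loss or congestion."""
    if elapsed_ms > LATENCY_BUDGET_MS:
        return None  # late frame: dropping preserves conversational feel
    return process_frame(frame)
```

Dropping late frames instead of queueing them is one common way to keep an active media processor inside conversational latency bounds; real deployments would combine this with bitrate adaptation.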

3. Real-Time Communication for Conversational XR Services Use Case (New Clause 5.2.1.1.2)

3.1 General XR Communication Framework

The new clause describes RTC augmented by shared XR scenes, featuring:

  • User representation via 2D/3D avatars or holograms
  • Multi-modal immersive experience through multiple XR devices (glasses/headset, immersive audio, haptics)
  • A main XR Scene Manager, located in the Media Function (MF) of the AS, that maintains a synchronized XR scene for all participants
  • Interactive virtual objects within the XR scene

3.2 Deployment Configurations

Three participant configurations are defined:

  1. Full VR: All participants remote, represented by avatars in common virtual 3D environment
  2. Full AR: All participants local, common XR scene inserted into physical conference room using AR technology
  3. Hybrid: Mix of local and remote participants; XR scene inserted for local AR viewing, remote participants represented by avatars

3.3 Specific Use Cases from TR 22.870

Three detailed use cases are referenced:

Seamless Immersive Reality in Education (Clause 9.5)
- Supports local, hybrid, or fully immersive classroom configurations
- Virtual objects for learning enhancement

Seamless Holographic Telepresence in Healthcare (Clause 9.8)
- Highly immersive real-time interactions between patients and medical practitioners
- Recreates physical co-presence using holograms, avatars, and multi-sensory media
- Requires synchronized multi-modal data streams (video, audio, haptics, motion, volumetric data)
- Supports six degrees of freedom (6DoF)
- Emphasizes security and privacy for sensitive biometric information (facial features, voiceprints, gestures, health signals)

Personalized Interactive Immersive Guided Tour (Clause 9.12)
- Combines location-aware MR/XR with avatars and multi-modal interaction
- Remote touristic guide represented by personalized avatars
- Heterogeneous 5G/6G-connected devices (AR glasses, XR headsets, smartphones, haptic wearables, immersive audio)
- Personalized experience: individual choice of immersion level, devices, language, avatar appearance, content type
- AI-based analysis of gaze, facial expressions, and behavior for contextual content adaptation
- Rich XR features: 2D/3D video, volumetric video, 6DoF movement, immersive audio, tactile feedback
- Supports both indoor and outdoor locations

Technical Significance

This pCR establishes P2A2P as a relevant application scenario for evaluating QUIC-based media delivery in RTC services, particularly emphasizing:

  • Media processing at intermediary AS nodes rather than pure end-to-end delivery
  • Complex multi-modal XR communication requirements
  • Need for synchronized, low-latency media handling with active processing
  • Support for diverse deployment configurations and personalization
  • Integration with GA4RTAR architecture and SWAP server functionality
Document Information

Source: InterDigital Pennsylvania
Title: [FS_Q4RTC_MED] Application scenario: Real-time Peer to Application to Peer communication
Doc type: pCR
For: Agreement
Agenda item: 10.7 (FS_Q4RTC_MED: Study on QUIC-based Media Delivery for Real-time Communication)
Release: Rel-20
Specification: 26.836
Version: 0.0.1
Related WIs: FS_Q4RTC_MED
Contact: Srinivas Gudumasu (ID 87955)
Uploaded: 2026-02-03T22:23:52.913000
Reservation date: 03/02/2026 21:47:09
Revised to: S4-260399
TDoc Status: revised
Agenda item sort order: 54