Extracted Proposals
Proposal: Multi-point Fine-grained Trace Generation
The MFTG methodology aims to decouple physical layer simulation assumptions from application-layer codec design. By providing a high-resolution library of error traces rather than a single static operating point, it enables a fair and flexible evaluation of various codec strategies (e.g., different bitrate/BLER trade-offs) while bypassing the current standardization deadlock.
Step 1: Resource Baseline Normalization (TBS Definition)
- Define a set of Reference Transport Block Sizes (TBS) based on a unified packet overhead.
- These TBS values must be kept consistent across all candidate codec bitrates to ensure a fair comparison of resource efficiency.
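The TBS derivation above can be sketched as follows. This is a minimal illustration, not the proposal's normative procedure: the 20 ms frame interval, the 60-byte overhead budget, and the candidate bitrates are assumed values chosen for the example.

```python
# Hypothetical sketch: derive a reference TBS (in bits) for each candidate
# codec bitrate. The frame interval, overhead, and bitrates are assumptions.

FRAME_INTERVAL_MS = 20   # one codec frame per transport block (assumed)
OVERHEAD_BYTES = 60      # unified packet overhead, e.g. RTP/UDP/IP + L2 (assumed)

def reference_tbs_bits(bitrate_kbps: float) -> int:
    """Codec payload bits per frame plus the fixed overhead, as a TBS in bits."""
    payload_bits = bitrate_kbps * 1000 * FRAME_INTERVAL_MS / 1000
    return int(payload_bits) + OVERHEAD_BYTES * 8

# Applying the SAME overhead to every candidate bitrate is what keeps the
# resource-efficiency comparison fair across codec operating points.
tbs_table = {rate: reference_tbs_bits(rate) for rate in (8, 13.2, 16.4, 24.4)}
```

Because the overhead term is identical for all entries, differences between TBS values reflect only the codec bitrates being compared.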
Step 2: Link Budget Mapping and Granularity Setup
- Identify the target range of Link Budgets (SNR/CNR) based on realistic NTN deployment scenarios (e.g., LEO/GEO, UE power classes).
- Establish a fine-grained sampling grid along the SNR-BLER curve to ensure high resolution for subsequent selection: for example, BLER from 1% to 10% in steps of 1% or 2% (from the BLER perspective), or equivalently SNR from -5 dB to 10 dB in steps of 1 dB (from the SNR perspective).
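As a sketch of the grid setup, the two example granularities above (1% BLER steps, 1 dB SNR steps) can be enumerated directly; the default ranges below simply mirror the example figures and are not mandated values.

```python
# Sampling grids for the SNR-BLER curve, matching the example granularities.

def bler_grid(start=0.01, stop=0.10, step=0.01):
    """BLER sampling points, e.g. 1% to 10% in 1% steps."""
    n = round((stop - start) / step)
    return [round(start + i * step, 4) for i in range(n + 1)]

def snr_grid_db(start=-5.0, stop=10.0, step=1.0):
    """SNR sampling points in dB, e.g. -5 dB to 10 dB in 1 dB steps."""
    n = round((stop - start) / step)
    return [start + i * step for i in range(n + 1)]
```

Either grid can index the trace library; a finer step widens the library but improves how closely a proponent's target operating point can be matched later.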
Step 3: Large-scale Link-Level Simulation (LLS)
- Execute Monte Carlo simulations for each defined TBS at every sampling point on the fine-grained grid, recording the per-transport-block error trace and the measured BLER at each operating point.
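The trace-generation loop at one operating point can be sketched as below. The logistic `toy_bler` curve is a stand-in assumption for a real link-level simulation chain (channel, modulation, decoding); only the Monte Carlo structure, where each transport block is drawn as error/success and the trace plus measured BLER are stored, reflects the step described here.

```python
import math
import random

def toy_bler(snr_db: float) -> float:
    """Toy logistic SNR->BLER mapping, standing in for a real LLS chain."""
    return 1.0 / (1.0 + math.exp(1.5 * (snr_db - 1.0)))

def run_lls_point(snr_db: float, n_blocks: int = 100_000, seed: int = 0) -> dict:
    """Monte Carlo draw of per-transport-block outcomes at one SNR point.

    Returns the error trace (1 = block error, 0 = success) and the
    measured BLER, ready to be stored in the trace library.
    """
    rng = random.Random(seed)          # fixed seed for reproducible traces
    p = toy_bler(snr_db)
    trace = [1 if rng.random() < p else 0 for _ in range(n_blocks)]
    return {"snr_db": snr_db, "bler": sum(trace) / n_blocks, "trace": trace}
```

Running this for every (TBS, grid point) pair populates the high-resolution library that Step 4 draws from.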
Step 4: Flexible Trace Selection for Verification
- For Performance Comparison: Proponents selecting a specific source bitrate can pick the trace from the library whose SNR/BLER operating point most closely matches their design's intended link budget.
- For Robustness Testing: Proponents can select "stress-test" traces (e.g., those with higher BLER or specific jitter profiles) from the same library to verify PLC and JBM algorithms.
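Both selection modes reduce to a nearest-neighbour lookup over the library's operating points. A minimal sketch, assuming each library entry carries `snr_db` and `bler` keys as in the earlier steps (the entry layout is an assumption, not specified by the proposal):

```python
def select_trace(library, target_snr_db=None, target_bler=None):
    """Return the library entry closest to the target operating point.

    Performance comparison: pass target_snr_db matching the design's link budget.
    Robustness testing: pass target_bler (e.g. a deliberately high stress value).
    """
    if target_snr_db is not None:
        key = lambda e: abs(e["snr_db"] - target_snr_db)
    elif target_bler is not None:
        key = lambda e: abs(e["bler"] - target_bler)
    else:
        raise ValueError("specify target_snr_db or target_bler")
    return min(library, key=key)
```

Because every proponent selects from the same shared library, the comparison stays fair even when designs target different bitrate/BLER trade-offs.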