# Summary of S4-260124: Complexity Evaluation of DaCAS Example Solutions

## 1. Introduction and Background

This contribution from Bytedance addresses a complexity evaluation framework for DaCAS (Data Collection for Audio Scene) example solutions. It recognizes that different example solutions will exhibit varying computational complexity due to differences in algorithms, channel counts, the amount of metadata, and output formats. While the current DaCAS pdoc-3 treats complexity analysis as optional, this contribution proposes making it a documented requirement.

## 2. Main Technical Contributions

### 2.1 Rationale for Complexity Analysis

The document establishes two key justifications for requiring complexity reporting:

- **Feasibility Assessment**: Provides fundamental information on compute and/or latency costs, enabling evaluation of whether an example solution fits specific hardware/CPU constraints or can operate in real time
- **Documentation Completeness**: Enhances the sufficiency of example solution deliverables
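The real-time feasibility point can be made concrete with a real-time factor (RTF) measurement, where RTF is processing time divided by audio duration and RTF < 1 indicates the solution keeps up with real time on the measured hardware. The sketch below is illustrative only; the contribution does not prescribe RTF or any specific metric, and the workload here is a stand-in.

```python
import time

def real_time_factor(process, audio_seconds: float) -> float:
    """Illustrative RTF measurement: wall-clock processing time
    relative to the duration of the audio being processed.
    RTF < 1.0 means the (hypothetical) solution runs in real time."""
    start = time.perf_counter()
    process()  # the example solution's processing step (stand-in here)
    elapsed = time.perf_counter() - start
    return elapsed / audio_seconds

# Stand-in workload: pretend to process 2.0 s of audio in ~0.1 s.
rtf = real_time_factor(lambda: time.sleep(0.1), audio_seconds=2.0)
print(f"RTF = {rtf:.2f} ({'real-time capable' if rtf < 1.0 else 'not real-time'})")
```

Because wall-clock timing depends on the host, any reported RTF only makes sense alongside the hardware and software details discussed in Section 2.4.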

### 2.2 Evaluation Procedure

Building on agreements from post-#134 telcos regarding self-evaluation of example solution performance, the proposal establishes that:

- Complexity analysis should be conducted alongside the self-evaluation process
- The proponent of each example solution is responsible for performing the complexity evaluation
- Proponents must provide sufficient documentation covering both the test procedure and obtained results

### 2.3 Metrics and Requirements Framework

The document proposes a flexible, non-comparative approach:

- **No Minimum Requirements**: No complexity threshold is established at this stage
- **No Exclusion Criteria**: Example solutions will not be rejected based on complexity
- **No Cross-Comparison**: Complexity results across different example solutions will not be compared
- **Proponent Freedom**: Proponents may select complexity metrics and test conditions at their discretion

### 2.4 Documentation Requirements

The proposal identifies key factors affecting complexity measurements that should be documented:

- **Complexity metrics**: The specific metrics used to quantify complexity
- **Processing architecture** (e.g., CPU, aDSP, NPU) and specific hardware model
- **Software environment**: Programming language, optimization level, and compiler
- **Algorithm configurations**: Filter length, encoder dimension, numerical precision of data

Proponents must document the selected complexity metrics along with sufficient detail on test conditions in the example solution deliverable.
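As a minimal sketch of what such documentation could look like in machine-readable form, the snippet below records the factors listed above in a single report. All field names and values are hypothetical; the contribution leaves the format and metrics to the proponent.

```python
import json
import platform

# Illustrative complexity report covering the factors the contribution asks
# proponents to document; every field name and value here is hypothetical.
report = {
    "metrics": {
        "wall_clock_ms_per_frame": 0.42,   # example metric, proponent-chosen
        "peak_memory_mb": 18.5,
    },
    "processing_architecture": "CPU",       # e.g. CPU, aDSP, NPU
    "hardware_model": platform.processor() or platform.machine(),
    "software_environment": {
        "language": f"Python {platform.python_version()}",
        "optimization_level": "default interpreter",
        "compiler": platform.python_compiler(),
    },
    "algorithm_configuration": {
        "filter_length": 512,
        "encoder_dimension": 256,
        "numerical_precision": "float32",
    },
}
print(json.dumps(report, indent=2))
```

A structured report like this makes the test conditions reproducible even though, per Section 2.3, the results are never compared across solutions.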

## 3. Proposal for Agreement

The document proposes that SA4 agree that:

- Complexity analysis of example solutions should be conducted using metrics chosen by the proponent
- Results must be documented with sufficient detail in the deliverable submission