# Report from Audio SWG Teleconference on DaCAS (9 December 2025)

## Meeting Overview

The Audio SWG held a teleconference on DaCAS (Diverse audio CApturing system for Smartphone devices) with 19 participants for 1 hour. Three input documents were discussed, all focused on defining the evaluation approach, specification methodology, and deliverables for DaCAS example solutions.

## Main Technical Contributions

### S4aA250140: Evaluation Approach for DaCAS Example Solutions (Nokia, Fraunhofer IIS)

**Key Proposals:**
- **Self-evaluation approach** for example solutions rather than cross-evaluation
- High-level documentation guidelines for proponents
- Informal assessment against requirements in TS 26.261
- Updated work plan with two timeline options depending on evaluation procedure chosen

**Timeline Proposals:**
- Table 1: Faster timeline with self-evaluation only
- Table 2: Extended timeline including cross-evaluation
- Key milestones: submission of example solutions in Montreal, specification work following

**Discussion Points:**
- Mixed support: some agreement on the self-evaluation concept, but reservations about skipping cross-evaluation entirely
- Questions on deliverable package definition (executable vs source code)
- Concerns about timeline dependencies on deliverable agreements
- Clarification needed on what "informal evaluation" means in practice
- Need to distinguish between demonstrating feasibility vs determining suitability of solutions

**Outcome:** Document noted; offline discussions encouraged, with a potential revision to follow

### S4aA250141: Specification of Suitable DaCAS Example Solutions (Nokia)

**Key Proposals:**
- **Two-tier approach for example solutions:**
  - All evaluations and documentation included in informative Annex of TS 26.533
  - Example solutions that additionally meet the requirements of TS 26.261 to be specified in TS 26.533 (status TBD)

**Documentation Requirements:**
- Algorithmic description with sufficient detail for implementation
- Optional elements:
  - Adaptation considerations for commercial devices
  - Support for IVAS PI (Parametric Information) data
  - Integration with commercial processing pipelines
  - Device limitations and generalization capabilities

**Discussion Points:**
- Questions on required detail level: pseudocode vs source code
- Clarification that the last three bullets are optional/invitational
- Relationship to ATIAS Phase 3 work
- Distinction between two categories:
  1. Feasible example solutions (demonstration via self-evaluation)
  2. Endorsed solutions meeting TS 26.261 requirements (requiring testing)
- Suggestion to share draft updates via SA4 reflector for broader input

**Outcome:** Document noted; revision expected with broader consultation

### S4aA250142: Example Solution Deliverables (Bytedance, Xiaomi)

**Key Proposals for Deliverable Package:**

**For Neural Network-based Solutions:**
- Code package including:
  - Training code
  - Inference code
  - Model checkpoints
- Rationale: enable retraining for devices beyond the target device

**For DSP-based Solutions:**
- Algorithmic description allowing reimplementation

**General Requirements:**
- Legal framework submission (depending on solution specifics)
- Sufficient details to convert raw microphone recordings to IVAS formats
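To illustrate the kind of "sufficient detail" such an algorithmic description would need, the sketch below converts raw tetrahedral microphone capsule signals to first-order Ambisonics (B-format), one scene-based representation IVAS can ingest. The capsule layout, function names, and the plain sum/difference matrix (omitting capsule equalization and spacing compensation that a real conversion requires) are illustrative assumptions, not content from the input documents.

```python
# Illustrative sketch only: a minimal raw-capture-to-B-format conversion.
# The tetrahedral geometry and unfiltered sum/difference matrix are
# simplifying assumptions, not material from the DaCAS contributions.

def a_to_b_format(flu, frd, bld, bru):
    """Convert four A-format capsule samples (Front-Left-Up, Front-Right-Down,
    Back-Left-Down, Back-Right-Up) to B-format components (W, X, Y, Z)."""
    w = 0.5 * (flu + frd + bld + bru)  # omnidirectional component
    x = 0.5 * (flu + frd - bld - bru)  # front-back axis
    y = 0.5 * (flu - frd + bld - bru)  # left-right axis
    z = 0.5 * (flu - frd - bld + bru)  # up-down axis
    return w, x, y, z

def convert_recording(capsule_samples):
    """capsule_samples: iterable of (flu, frd, bld, bru) tuples,
    one per sample instant. Returns a list of (w, x, y, z) tuples."""
    return [a_to_b_format(*s) for s in capsule_samples]
```

For example, a signal arriving equally at all four capsules maps entirely onto the omnidirectional W component, while sum/difference patterns isolate the directional axes.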

**Discussion Points:**
- **Strong concerns** about mandating training code and checkpoints:
  - Risk of giving a false impression of quality transferability
  - Training typically tailored to specific devices
  - Cannot guarantee similar quality on different devices
  - May be unnecessary if solution generalizes without retraining
- Distinction between providing a model (equivalent to a DSP description) vs the full training pipeline
- Legal framework requirements considered too general; clarification needed
- Fundamental difference in reproducibility between DSP descriptions (anyone can reimplement) and NN solutions

**Outcome:** Document noted; significant concerns raised about mandatory training code requirements

## Cross-Cutting Issues and Open Questions

### Evaluation Methodology
- **Self-evaluation vs cross-evaluation trade-off:** timeline efficiency vs validation robustness
- Need to define "informal assessment" against TS 26.261 requirements
- Distinction between feasibility demonstration and suitability determination

### Deliverables Definition
- Critical dependency: timeline cannot be finalized without agreed deliverable package
- Balance between sufficient detail for reproducibility and practical constraints
- Different requirements for NN-based vs DSP-based solutions

### Specification Approach
- Two-tier system emerging: informative documentation for all vs normative specification for qualified solutions
- No selection procedure planned, but potential endorsement mechanism discussed
- Relationship to requirements verification unclear

### Timeline Dependencies
- Agreement on deliverables needed at SA4#135 (India, February 2026) to maintain schedule
- Submission of example solutions potentially at SA4#136 (Montreal)
- Specification work to follow based on submitted packages and evaluation reports

## Next Steps

- Offline discussions encouraged on evaluation approach (S4aA250140)
- Revision expected for specification approach with broader consultation via reflector (S4aA250141)
- Further clarification needed on deliverable requirements, particularly for NN-based solutions (S4aA250142)
- Next DaCAS telco scheduled for January 13, 2026, 15:00-16:00 CET
- Discussion of a repository (forge) for material exchange to be continued offline