axiom-avfoundation-ref


AVFoundation Audio Reference


Quick Reference


swift
// AUDIO SESSION SETUP
import AVFoundation

try AVAudioSession.sharedInstance().setCategory(
    .playback,                              // or .playAndRecord, .ambient
    mode: .default,                         // or .voiceChat, .measurement
    options: [.mixWithOthers, .allowBluetooth]
)
try AVAudioSession.sharedInstance().setActive(true)

// AUDIO ENGINE PIPELINE
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)
try engine.start()
player.scheduleFile(audioFile, at: nil)
player.play()

// INPUT PICKER (iOS 26+)
import AVKit
let picker = AVInputPickerInteraction()
picker.delegate = self
myButton.addInteraction(picker)
// In button action: picker.present()

// AIRPODS HIGH QUALITY (iOS 26+)
try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    options: [.bluetoothHighQualityRecording, .allowBluetoothA2DP]
)


AVAudioSession


Categories


| Category | Use Case | Silent Switch | Background |
| --- | --- | --- | --- |
| .ambient | Game sounds, not primary | Silences | No |
| .soloAmbient | Default, interrupts others | Silences | No |
| .playback | Music player, podcast | Ignores | Yes |
| .record | Voice recorder | Ignores | Yes |
| .playAndRecord | VoIP, voice chat | Ignores | Yes |
| .multiRoute | DJ apps, multiple outputs | Ignores | Yes |
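Read as a decision rule, the table boils down to a small mapping; a minimal sketch (AudioRole and categoryName are illustrative names for this sketch, not AVFoundation API — only the returned case names are real):

```swift
import Foundation

// Illustrative mapping from the category table above.
enum AudioRole {
    case gameSounds      // secondary audio, should respect the silent switch
    case musicPlayback   // primary audio, background capable
    case voiceRecorder
    case voip
    case multiOutput
}

func categoryName(for role: AudioRole) -> String {
    switch role {
    case .gameSounds:    return ".ambient"
    case .musicPlayback: return ".playback"
    case .voiceRecorder: return ".record"
    case .voip:          return ".playAndRecord"
    case .multiOutput:   return ".multiRoute"
    }
}
```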

Modes


| Mode | Use Case |
| --- | --- |
| .default | General audio |
| .voiceChat | VoIP, reduces echo |
| .videoChat | FaceTime-style |
| .gameChat | Voice chat in games |
| .videoRecording | Camera recording |
| .measurement | Flat response, no processing |
| .moviePlayback | Video playback |
| .spokenAudio | Podcasts, audiobooks |

Options


swift
// Mixing
.mixWithOthers          // Play with other apps
.duckOthers             // Lower other audio while playing
.interruptSpokenAudioAndMixWithOthers  // Pause podcasts, mix music

// Bluetooth
.allowBluetooth         // HFP (calls)
.allowBluetoothA2DP     // High quality stereo
.bluetoothHighQualityRecording  // iOS 26+ AirPods recording

// Routing
.defaultToSpeaker       // Route to speaker (not receiver)
.allowAirPlay           // Enable AirPlay

Interruption Handling


swift
NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: nil,
    queue: .main
) { notification in
    guard let userInfo = notification.userInfo,
          let typeValue = userInfo[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: typeValue) else {
        return
    }

    switch type {
    case .began:
        // Pause playback
        player.pause()

    case .ended:
        guard let optionsValue = userInfo[AVAudioSessionInterruptionOptionKey] as? UInt else { return }
        let options = AVAudioSession.InterruptionOptions(rawValue: optionsValue)
        if options.contains(.shouldResume) {
            player.play()
        }

    @unknown default:
        break
    }
}
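The began/ended handling above reduces to a small decision rule; here it is modeled in plain Swift so the logic stands alone (Interruption and shouldResumePlayback are illustrative names for this sketch, not AVFoundation API):

```swift
import Foundation

// Models the two interruption events delivered by the notification.
enum Interruption {
    case began
    case ended(shouldResume: Bool)
}

/// Whether the player should be playing after the event, given whether it
/// was playing before the interruption began.
func shouldResumePlayback(after event: Interruption, wasPlaying: Bool) -> Bool {
    switch event {
    case .began:
        return false                       // always pause on .began
    case .ended(let shouldResume):
        return wasPlaying && shouldResume  // resume only if the system says so
    }
}
```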

Route Change Handling


swift
NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil,
    queue: .main
) { notification in
    guard let userInfo = notification.userInfo,
          let reasonValue = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
          let reason = AVAudioSession.RouteChangeReason(rawValue: reasonValue) else {
        return
    }

    switch reason {
    case .oldDeviceUnavailable:
        // Headphones unplugged — pause playback
        player.pause()

    case .newDeviceAvailable:
        // New device connected
        break

    case .categoryChange:
        // Category changed by system or another app
        break

    default:
        break
    }
}


AVAudioEngine


Basic Pipeline


swift
let engine = AVAudioEngine()

// Create nodes
let player = AVAudioPlayerNode()
let reverb = AVAudioUnitReverb()
reverb.loadFactoryPreset(.largeHall)
reverb.wetDryMix = 50

// Attach to engine
engine.attach(player)
engine.attach(reverb)

// Connect: player → reverb → mixer → output
engine.connect(player, to: reverb, format: nil)
engine.connect(reverb, to: engine.mainMixerNode, format: nil)

// Start
engine.prepare()
try engine.start()

// Play file
let url = Bundle.main.url(forResource: "audio", withExtension: "m4a")!
let file = try AVAudioFile(forReading: url)
player.scheduleFile(file, at: nil)
player.play()

Node Types


| Node | Purpose |
| --- | --- |
| AVAudioPlayerNode | Plays audio files/buffers |
| AVAudioInputNode | Mic input (engine.inputNode) |
| AVAudioOutputNode | Speaker output (engine.outputNode) |
| AVAudioMixerNode | Mix multiple inputs |
| AVAudioUnitEQ | Equalizer |
| AVAudioUnitReverb | Reverb effect |
| AVAudioUnitDelay | Delay effect |
| AVAudioUnitDistortion | Distortion effect |
| AVAudioUnitTimePitch | Time stretch / pitch shift |

Installing Taps (Audio Analysis)


swift
let inputNode = engine.inputNode
let format = inputNode.outputFormat(forBus: 0)

inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, time in
    // Process audio buffer
    guard let channelData = buffer.floatChannelData?[0] else { return }
    let frameLength = Int(buffer.frameLength)

    // Calculate RMS level
    var sum: Float = 0
    for i in 0..<frameLength {
        sum += channelData[i] * channelData[i]
    }
    let rms = sqrt(sum / Float(frameLength))
    let dB = 20 * log10(rms)

    DispatchQueue.main.async {
        self.levelMeter = dB
    }
}

// Don't forget to remove when done
inputNode.removeTap(onBus: 0)
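The metering math inside the tap can be exercised on plain arrays; a self-contained sketch (rmsLevel and decibels are helper names for this sketch, not framework API):

```swift
import Foundation

// Same math as the tap callback, on a plain [Float] buffer.
func rmsLevel(_ samples: [Float]) -> Float {
    guard !samples.isEmpty else { return 0 }
    let sumOfSquares = samples.reduce(Float(0)) { $0 + $1 * $1 }
    return (sumOfSquares / Float(samples.count)).squareRoot()
}

// Clamp to a floor so silence maps to a finite level instead of -inf.
func decibels(fromRMS rms: Float, floor: Double = -80) -> Double {
    guard rms > 0 else { return floor }
    return max(20 * log10(Double(rms)), floor)
}

// 1 kHz sine at 48 kHz, amplitude 1.0, exactly 20 periods:
// RMS is 1/sqrt(2), i.e. about -3.01 dBFS.
let sine = (0..<960).map { Float(sin(2 * Double.pi * 1000 * Double($0) / 48_000)) }
let level = decibels(fromRMS: rmsLevel(sine))
```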

Format Conversion


swift
// AVAudioEngine mic input is 32-bit float at the hardware sample rate
// (commonly 44.1 or 48 kHz depending on device and route)
// Use AVAudioConverter for other formats

let inputFormat = engine.inputNode.outputFormat(forBus: 0)
let outputFormat = AVAudioFormat(
    commonFormat: .pcmFormatInt16,
    sampleRate: 48000,
    channels: 1,
    interleaved: false
)!

let converter = AVAudioConverter(from: inputFormat, to: outputFormat)!

// In tap callback:
let outputBuffer = AVAudioPCMBuffer(
    pcmFormat: outputFormat,
    frameCapacity: AVAudioFrameCount(outputFormat.sampleRate * 0.1)
)!

var error: NSError?
converter.convert(to: outputBuffer, error: &error) { inNumPackets, outStatus in
    outStatus.pointee = .haveData
    return inputBuffer
}
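Per sample, the Float32 → Int16 step amounts to scaling and clamping; a hand-rolled sketch for illustration only (AVAudioConverter itself also handles resampling, dithering, and buffer management):

```swift
import Foundation

// Scale-and-clamp step of a Float32 → Int16 conversion.
func int16Sample(fromFloat x: Float) -> Int16 {
    let scaled = x * 32767                        // map [-1, 1] into 16-bit range
    let clamped = min(max(scaled, -32768), 32767) // guard against overs
    return Int16(clamped.rounded())
}
```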


Bit-Perfect Audio / DAC Output


iOS Behavior


iOS provides bit-perfect output by default to USB DACs — no resampling occurs. The DAC receives the source sample rate directly.
swift
// iOS automatically matches source sample rate to DAC
// No special configuration needed for bit-perfect output

let player = AVAudioPlayerNode()
// File at 96kHz → DAC receives 96kHz

Avoiding Resampling


swift
// Check hardware sample rate
let hardwareSampleRate = AVAudioSession.sharedInstance().sampleRate

// Match your audio format to hardware when possible
let format = AVAudioFormat(
    standardFormatWithSampleRate: hardwareSampleRate,
    channels: 2
)

USB DAC Routing


swift
// List available outputs
let currentRoute = AVAudioSession.sharedInstance().currentRoute
for output in currentRoute.outputs {
    print("Output: \(output.portName), Type: \(output.portType)")
    // USB DAC shows as .usbAudio
}

// Note: setPreferredInput applies to inputs only. iOS exposes no public API
// to force a specific output; output follows the most recently connected device.
try AVAudioSession.sharedInstance().setPreferredInput(usbPort)

Sample Rate Considerations


| Source | iOS Behavior | Notes |
| --- | --- | --- |
| 44.1 kHz | Passthrough | CD quality |
| 48 kHz | Passthrough | Video standard |
| 96 kHz | Passthrough | Hi-res |
| 192 kHz | Passthrough | Hi-res |
| DSD | Not supported | Use DoP or convert |
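The table reads as a simple predicate; a sketch with an illustrative SourceFormat type (not an AVFoundation type):

```swift
import Foundation

// PCM rates pass through to the DAC untouched; DSD needs DoP or conversion.
enum SourceFormat: Equatable {
    case pcm(sampleRateHz: Double)
    case dsd
}

func isPassedThrough(_ source: SourceFormat) -> Bool {
    switch source {
    case .pcm: return true   // 44.1 to 192 kHz delivered to the DAC as-is
    case .dsd: return false  // needs DoP or conversion first
    }
}
```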


iOS 26+ Input Selection


AVInputPickerInteraction


Native input device selection with live metering:
swift
import AVKit

class RecordingViewController: UIViewController {
    let inputPicker = AVInputPickerInteraction()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Configure audio session first
        try? AVAudioSession.sharedInstance().setCategory(.playAndRecord)
        try? AVAudioSession.sharedInstance().setActive(true)

        // Setup picker
        inputPicker.delegate = self
        selectMicButton.addInteraction(inputPicker)
    }

    @IBAction func selectMicTapped(_ sender: UIButton) {
        inputPicker.present()
    }
}

extension RecordingViewController: AVInputPickerInteractionDelegate {
    // Implement delegate methods as needed
}
Features:
  • Live sound level metering
  • Microphone mode selection
  • System remembers selection per app


iOS 26+ AirPods High Quality Recording


Lavalier-mic equivalent quality for content creators:
swift
// AVAudioSession approach
try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    options: [
        .bluetoothHighQualityRecording,  // New in iOS 26
        .allowBluetoothA2DP              // Fallback
    ]
)

// AVCaptureSession approach
let captureSession = AVCaptureSession()
captureSession.configuresApplicationAudioSessionForBluetoothHighQualityRecording = true
Notes:
  • Uses dedicated Bluetooth link optimized for AirPods
  • Falls back to HFP if device doesn't support HQ mode
  • Supports AirPods stem controls for start/stop recording


Spatial Audio Capture (iOS 26+)


First Order Ambisonics (FOA)


Record 3D spatial audio using device microphone array:
swift
// With AVCaptureMovieFileOutput (simple)
let audioInput = AVCaptureDeviceInput(device: audioDevice)
audioInput.multichannelAudioMode = .firstOrderAmbisonics

// With AVAssetWriter (full control)
// Requires two AudioDataOutputs: FOA (4ch) + Stereo (2ch)

AVAssetWriter Spatial Audio Setup


swift
// Configure two AudioDataOutputs
let foaOutput = AVCaptureAudioDataOutput()
foaOutput.spatialAudioChannelLayoutTag = kAudioChannelLayoutTag_HOA_ACN_SN3D  // 4 channels

let stereoOutput = AVCaptureAudioDataOutput()
stereoOutput.spatialAudioChannelLayoutTag = kAudioChannelLayoutTag_Stereo    // 2 channels

// Create metadata generator
let metadataGenerator = AVCaptureSpatialAudioMetadataSampleGenerator()

// Feed FOA buffers to generator
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    metadataGenerator.append(sampleBuffer)
    // Also write to FOA AssetWriterInput
}

// When recording stops, get metadata sample
let metadataSample = metadataGenerator.createMetadataSample()
// Write to metadata track

Output File Structure


Spatial audio files contain:
  1. Stereo AAC track — Compatibility fallback
  2. APAC track — Spatial audio (FOA)
  3. Metadata track — Audio Mix tuning parameters
File formats: .mov, .mp4, .qta (QuickTime Audio, iOS 26+)


ASAF / APAC (Apple Spatial Audio)


Overview


| Component | Purpose |
| --- | --- |
| ASAF | Apple Spatial Audio Format — production format |
| APAC | Apple Positional Audio Codec — delivery codec |

APAC Capabilities


  • Bitrates: 64 kbps to 768 kbps
  • Supports: Channels, Objects, Higher Order Ambisonics, Dialogue, Binaural
  • Head-tracked rendering adaptive to listener position/orientation
  • Required for Apple Immersive Video

Playback


swift
// Standard AVPlayer handles APAC automatically
let player = AVPlayer(url: spatialAudioURL)
player.play()

// Head tracking enabled automatically on AirPods

Platform Support


All Apple platforms except watchOS support APAC playback.


Audio Mix (Cinematic Framework)


Separate and remix speech vs ambient sounds in spatial recordings:

AVPlayer Integration


swift
import Cinematic

// Load spatial audio asset
let asset = AVURLAsset(url: spatialAudioURL)
let audioInfo = try await CNAssetSpatialAudioInfo(asset: asset)

// Configure mix parameters
let intensity: Float = 0.5  // 0.0 to 1.0
let style = CNSpatialAudioRenderingStyle.cinematic

// Create and apply audio mix
let audioMix = audioInfo.audioMix(
    effectIntensity: intensity,
    renderingStyle: style
)
playerItem.audioMix = audioMix

Rendering Styles


| Style | Effect |
| --- | --- |
| .cinematic | Balanced speech/ambient |
| .studio | Enhanced speech clarity |
| .inFrame | Focus on visible speakers |
| + 6 extraction modes | Speech-only, ambient-only stems |

AUAudioMix (Direct AudioUnit)


For apps not using AVPlayer:
swift
// Input: 4 channels FOA
// Output: Separated speech + ambient

// Get tuning metadata from file
let audioInfo = try await CNAssetSpatialAudioInfo(asset: asset)
let remixMetadata = audioInfo.spatialAudioMixMetadata as CFData

// Apply to AudioUnit via AudioUnitSetProperty


Common Patterns


Background Audio Playback


swift
import AVFoundation
import MediaPlayer

// 1. Set category
try AVAudioSession.sharedInstance().setCategory(.playback)

// 2. Enable background mode in Info.plist
// <key>UIBackgroundModes</key>
// <array><string>audio</string></array>

// 3. Set Now Playing info (recommended)
let nowPlayingInfo: [String: Any] = [
    MPMediaItemPropertyTitle: "Song Title",
    MPMediaItemPropertyArtist: "Artist",
    MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentTime,
    MPMediaItemPropertyPlaybackDuration: duration
]
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo

Ducking Other Audio


swift
try AVAudioSession.sharedInstance().setCategory(
    .playback,
    options: .duckOthers
)

// When done, restore others
try AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)

Bluetooth Device Handling


swift
// Allow all Bluetooth
try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    options: [.allowBluetooth, .allowBluetoothA2DP]
)

// Check current Bluetooth route
let route = AVAudioSession.sharedInstance().currentRoute
let hasBluetoothOutput = route.outputs.contains {
    $0.portType == .bluetoothA2DP || $0.portType == .bluetoothHFP
}


Anti-Patterns


Wrong Category


swift
// WRONG — music player using ambient (silenced by switch)
try AVAudioSession.sharedInstance().setCategory(.ambient)

// CORRECT — music needs .playback
try AVAudioSession.sharedInstance().setCategory(.playback)

Missing Interruption Handling


swift
// WRONG — no interruption observer
// Audio stops on phone call and never resumes

// CORRECT — always handle interruptions
NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    // ... handle began/ended
)

Tap Memory Leaks


swift
// WRONG — tap installed, never removed
engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { ... }

// CORRECT — remove tap when done
deinit {
    engine.inputNode.removeTap(onBus: 0)
}

Format Mismatch Crashes


swift
// WRONG — connecting nodes with incompatible formats
engine.connect(playerNode, to: mixerNode, format: wrongFormat)  // Crash!

// CORRECT — use nil for automatic format negotiation, or match exactly
engine.connect(playerNode, to: mixerNode, format: nil)

Forgetting to Activate Session


swift
// WRONG — configure but don't activate
try AVAudioSession.sharedInstance().setCategory(.playback)
// Audio doesn't work!

// CORRECT — always activate
try AVAudioSession.sharedInstance().setCategory(.playback)
try AVAudioSession.sharedInstance().setActive(true)


Resources


WWDC: 2025-251, 2025-403, 2019-510
Docs: /avfoundation, /avkit, /cinematic

Targets: iOS 12+ (core), iOS 26+ (spatial features)
Frameworks: AVFoundation, AVKit, Cinematic (iOS 26+)
History: See git log for changes