
General Questions

The LiveKit Swift SDK supports:
  • iOS 13.0+
  • macOS 10.15+
  • macOS Catalyst
  • visionOS 1.0+
  • tvOS 13.0+
The SDK is available as a Swift Package and supports CocoaPods (deprecated).
The SDK is currently compiled with Swift 6.0 and fully supports strict concurrency. Apps compiled in Swift 6 language mode do not need @preconcurrency or @unchecked Sendable to access LiveKit classes. The minimum required Swift version is defined in Package.swift (currently Swift 5.9+ for backwards compatibility).
CocoaPods support is deprecated. The main CocoaPods trunk repo, as well as the LiveKit podspecs repo, will become read-only and stop receiving updates starting in 2027. It is strongly recommended to migrate to Swift Package Manager to ensure access to the latest features and security updates. For existing CocoaPods users, see the CocoaPods guide.
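As a sketch of the migration, the SDK can be declared as a Swift Package Manager dependency in Package.swift. The package URL, version, and product name below are illustrative; verify them against the LiveKit repository for the current release:

```swift
// swift-tools-version:5.9
// Package.swift (illustrative; check the LiveKit repo for the current URL and version)
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v13), .macOS(.v10_15)],
    dependencies: [
        // LiveKit Swift SDK via Swift Package Manager
        .package(url: "https://github.com/livekit/client-sdk-swift.git", from: "2.0.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [.product(name: "LiveKit", package: "client-sdk-swift")]
        ),
    ]
)
```

In an Xcode app project (rather than a package), the same dependency can be added via File > Add Package Dependencies using the repository URL.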
Version 2 contains breaking changes from Version 1:
  • Full Swift 6 concurrency support
  • Improved thread safety
  • Better memory management (requires weak references for SDK-managed objects)
  • Enhanced audio session management
  • New audio engine availability control for CallKit integration
Read the migration guide for detailed information.

Configuration

The SDK writes to OSLog by default (io.livekit.*) with a minimum log level of info. Logs can be filtered by level, category, and more in the Xcode console. Set the log level:
LiveKitSDK.setLogLevel(.debug)
Set custom logger:
LiveKitSDK.setLogger(myCustomLogger)
Disable logging:
LiveKitSDK.disableLogging()
All logging methods must be called before any other logging is done, e.g., in App.init() or AppDelegate/SceneDelegate.
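For example, in a SwiftUI app the log level could be set in the App initializer. This is a minimal sketch; MyApp and ContentView are illustrative names:

```swift
import SwiftUI
import LiveKit

@main
struct MyApp: App {
    init() {
        // Configure logging before any other SDK logging occurs
        LiveKitSDK.setLogLevel(.debug)
    }

    var body: some Scene {
        WindowGroup {
            ContentView() // your app's root view
        }
    }
}
```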
Capture warning/error logs: You can subclass OSLogger and override the log(...) method to capture specific log levels:
class MyLogger: OSLogger {
    override func log(level: LogLevel, message: String, file: String, function: String, line: Int) {
        if level == .warning || level == .error {
            // Send to analytics service
        }
        super.log(level: level, message: message, file: file, function: function, line: line)
    }
}

LiveKitSDK.setLogger(MyLogger())
To publish camera video at 60 FPS:
  1. Create a LocalVideoTrack with 60 FPS capture options:
let track = LocalVideoTrack.createCameraTrack(
    options: CameraCaptureOptions(fps: 60)
)
  2. Publish with matching encoding settings:
try await room.localParticipant.publish(
    videoTrack: track,
    publishOptions: VideoPublishOptions(
        encoding: VideoEncoding(maxFps: 60)
    )
)
High frame rates require more bandwidth and processing power. Ensure your use case justifies 60 FPS, and consider network conditions.
By default, LiveKit automatically manages the underlying AVAudioSession while connected. It sets the session to the .playback category and switches to .playAndRecord when a local track is published. To configure AVAudioSession yourself:
// Disable automatic audio session configuration
AudioManager.shared.audioSession.isAutomaticConfigurationEnabled = false
Then configure and activate the session:
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, mode: .voiceChat)
try session.setActive(true)
AVAudioSession must be configured and activated with category .playAndRecord and mode .voiceChat or .videoChat before enabling/publishing the microphone.
When integrating with CallKit, proper timing and coordination between AVAudioSession and the SDK’s audio engine is crucial.
  1. Disable automatic audio session configuration:
// As early as possible, before connecting to a Room
AudioManager.shared.audioSession.isAutomaticConfigurationEnabled = false
try AudioManager.shared.setEngineAvailability(.none)
  2. Coordinate audio engine availability in your CXProviderDelegate:
func provider(_: CXProvider, didActivate session: AVAudioSession) {
    do {
        try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.mixWithOthers])
        try AudioManager.shared.setEngineAvailability(.default)
    } catch {
        // Handle error
    }
}

func provider(_: CXProvider, didDeactivate _: AVAudioSession) {
    do {
        try AudioManager.shared.setEngineAvailability(.none)
    } catch {
        // Handle error
    }
}
See the CallKit example for full implementation details.

Audio & Video

Yes! Subscribed audio tracks are automatically played by the SDK. You don’t need to manually handle audio playback for remote participants.
You can pre-warm the audio engine to reduce microphone publishing latency:
AudioManager.shared.setRecordingAlwaysPreparedMode(true)
This keeps the audio engine ready, reducing the time it takes to start publishing when the microphone is enabled.
No. Publishing a camera track is not supported in the iOS Simulator; you must test camera functionality on a real device. This is a limitation of the iOS Simulator, not the LiveKit SDK.
For better performance when displaying multiple VideoViews in a scroll view, disable rendering for views that scroll off-screen:
// When cell goes off-screen
cell.videoView.isEnabled = false

// When cell comes back on-screen
cell.videoView.isEnabled = true
UICollectionViewDelegate’s willDisplay / didEndDisplaying has been reported to be unreliable for this purpose. In some iOS versions, didEndDisplaying could get invoked even when the cell is visible.
For a robust implementation using NSHashTable and a timer, see the UIKit Minimal Example.
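One possible shape of that approach is sketched below, assuming an illustrative VideoCell type whose cells are registered into the hash table from cellForItemAt; the full bookkeeping lives in the example project:

```swift
import UIKit

// Hypothetical cell type exposing the SDK's VideoView
final class VideoGridViewController: UICollectionViewController {
    // Weakly track live cells; NSHashTable avoids retaining them
    private let liveCells = NSHashTable<VideoCell>.weakObjects()
    private var visibilityTimer: Timer?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Periodically reconcile rendering state with actual visibility,
        // instead of relying on willDisplay / didEndDisplaying
        visibilityTimer = Timer.scheduledTimer(withTimeInterval: 0.3, repeats: true) { [weak self] _ in
            self?.updateVisibility()
        }
    }

    private func updateVisibility() {
        guard let collectionView else { return }
        let visibleRect = CGRect(origin: collectionView.contentOffset,
                                 size: collectionView.bounds.size)
        for cell in liveCells.allObjects {
            // Enable rendering only while the cell's frame intersects the viewport
            cell.videoView.isEnabled = cell.frame.intersects(visibleRect)
        }
        // Cells are added to liveCells in collectionView(_:cellForItemAt:)
    }
}
```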
iOS screen sharing requires using a Broadcast Upload Extension with ReplayKit.See the iOS Screen Sharing instructions for complete setup details.

Thread Safety & Memory

Most core classes can be accessed from any thread, except:
  • VideoView - must be accessed from the main thread only (it’s a UI component)
All operations on VideoView (reading/writing properties, etc.) must be performed from the main thread. Other core classes (Room, Participant, Track, etc.) can be safely accessed from any thread.
Delegates are called on the SDK's internal thread, not the main thread. Make sure any access to your app's UI elements is performed from the main thread:
func room(_: Room, participant _: RemoteParticipant, didSubscribeTrack publication: RemoteTrackPublication) {
    guard let track = publication.track as? VideoTrack else { return }
    
    // Dispatch to main thread for UI updates
    DispatchQueue.main.async {
        self.remoteVideoView.track = track
    }
}
Alternatively, use @MainActor for methods that update UI.
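A sketch of that alternative, using a Task bound to the main actor inside the same delegate callback (names match the example above):

```swift
func room(_: Room, participant _: RemoteParticipant, didSubscribeTrack publication: RemoteTrackPublication) {
    guard let track = publication.track as? VideoTrack else { return }

    // The closure runs on the main actor, so UI access is safe
    Task { @MainActor in
        self.remoteVideoView.track = track
    }
}
```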
It is recommended to use weak var when storing references to objects created and managed by the SDK:
weak var participant: RemoteParticipant?
weak var publication: TrackPublication?
weak var track: VideoTrack?
These objects are invalidated when the Room disconnects and will be released by the SDK. Holding strong references will prevent releasing Room and other internal objects, leading to memory leaks.
No, VideoView.track property does not hold a strong reference, so it’s not required to set it to nil.However, setting it to nil when the view is no longer in use is still good practice for clarity.

Troubleshooting

When submitting to the App Store, you may see:
The archive did not include a dSYM for the LiveKitWebRTC.framework with the UUIDs [...]
This is expected. The LiveKitWebRTC.xcframework binary framework does not contain dSYMs. This will NOT prevent the app from being submitted to the App Store or passing review. If you need dSYMs (for custom builds), you can use the build script in DEBUG mode to generate them locally.
If your app targets macOS Catalina and crashes with “ReplayKit not found”:
  1. Explicitly add ReplayKit.framework to Build Phases > Link Binary with Libraries
  2. Set it to Optional (not Required)
This is only required for macOS 10.15 Catalina. Apps targeting macOS 11.0+ do not need this workaround.
Enable debug logging to see detailed connection information:
// Call in App.init() or AppDelegate
LiveKitSDK.setLogLevel(.debug)
Logs are written to OSLog with category io.livekit.* and can be filtered in the Xcode console. Also verify:
  • Server URL is correct (use wss:// for production)
  • JWT token is valid and not expired
  • Network connectivity is available
  • Firewall settings allow WebRTC traffic
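The checklist above can be exercised with a minimal connection attempt. This is a sketch only: the URL and token are placeholders, and the connect call follows the SDK v2 async API:

```swift
import LiveKit

func testConnection() async {
    // Enable verbose logs before anything else
    LiveKitSDK.setLogLevel(.debug)

    let room = Room()
    do {
        // Replace with your server URL and a freshly minted JWT
        try await room.connect(url: "wss://example.livekit.cloud", token: "<jwt>")
        print("Connected")
    } catch {
        // Connection errors surface here; correlate with the OSLog output
        print("Connection failed: \(error)")
    }
}
```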

Resources

LiveKit provides several example repositories; all examples include source code and detailed README files.
Support channels:
  • Community Support
  • Bug Reports & Feature Requests
  • Documentation
Yes! Contributions are welcome.
  • Join our Slack to discuss contributions
  • Submit pull requests on GitHub
  • Review the contribution guidelines in the repository
We appreciate PRs for bug fixes, features, documentation improvements, and examples.
