The LiveKit Swift SDK provides full support for iOS applications with comprehensive video, audio, and data capabilities.

Platform Support

Minimum version: iOS 13.0+
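The SDK can be added with Swift Package Manager. A minimal Package.swift sketch follows; the package URL and version shown here are assumptions, so check the repository for the current release before pinning:

```swift
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v13)], // matches the SDK's minimum iOS version
    dependencies: [
        // Assumed package URL; pin to the release you have verified
        .package(url: "https://github.com/livekit/client-sdk-swift.git", from: "2.0.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [.product(name: "LiveKit", package: "client-sdk-swift")]
        ),
    ]
)
```

In Xcode, the same dependency can be added via File → Add Package Dependencies instead of editing Package.swift directly.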

VideoView

The SDK provides a UIKit-based VideoView class for rendering video tracks:
import LiveKit
import UIKit

class RoomViewController: UIViewController {
    lazy var remoteVideoView: VideoView = {
        let videoView = VideoView()
        view.addSubview(videoView)
        return videoView
    }()

    lazy var localVideoView: VideoView = {
        let videoView = VideoView()
        view.addSubview(videoView)
        return videoView
    }()
}

Video Rendering Modes

iOS supports two rendering modes:
  • Metal Rendering (default): Hardware-accelerated rendering using Metal
  • AVSampleBuffer Rendering: Alternative rendering using AVSampleBufferDisplayLayer
videoView.renderMode = .metal // or .sampleBuffer

Layout and Display Options

// Control how video fits within the view
videoView.layoutMode = .fill // .fill or .fit

// Mirror video (useful for front camera)
videoView.mirrorMode = .auto // .auto, .mirror, or .off

// Rotation override
videoView.rotationOverride = .r90

Pinch-to-Zoom (iOS Only)

iOS supports pinch-to-zoom gestures for camera tracks:
// Enable basic zoom
videoView.isPinchToZoomEnabled = true

// Auto-reset zoom when gesture ends
videoView.isAutoZoomResetEnabled = true

// Advanced options
videoView.pinchToZoomOptions = [.zoomIn, .resetOnRelease]
See VideoView+PinchToZoom.swift:1 for implementation details.

View Transitions

iOS supports animated transitions when switching cameras:
videoView.transitionMode = .crossDissolve // .none, .crossDissolve, or .flip
videoView.transitionDuration = 0.3

Camera Capture

Basic Camera Usage

// Enable camera with default options
try await room.localParticipant.setCamera(enabled: true)

// Custom camera options
let options = CameraCaptureOptions(
    position: .front,
    fps: 30,
    dimensions: .h1080_169
)
try await room.localParticipant.setCamera(enabled: true, captureOptions: options)

Switching Camera Position

// Check if switching is supported
let canSwitch = try await CameraCapturer.canSwitchPosition()

// Get current camera track
if let cameraTrack = room.localParticipant.cameraTrack {
    try await cameraTrack.capturer.switchCameraPosition()
}

Multitasking Camera Access (iOS 16+)

On iOS 16+, you can enable camera access when your app is in the background:
if let cameraCapturer = cameraTrack?.capturer as? CameraCapturer {
    if cameraCapturer.isMultitaskingAccessSupported {
        cameraCapturer.isMultitaskingAccessEnabled = true
    }
}
See CameraCapturer.swift:52 for details.

Screen Sharing

iOS supports two screen sharing modes via ReplayKit:

1. In-App Capture (Default)

Capture screen content within your app with no additional setup:
try await room.localParticipant.setScreenShare(enabled: true)
Limitations:
  • Only captures content within your app
  • Does not support app audio capture
  • User grants permission once per app execution

2. Broadcast Extension Capture

Capture system-wide screen content, even when users switch apps. Requires setup of a Broadcast Upload Extension.

Setup Steps

  1. Add Broadcast Upload Extension target in Xcode
  2. Set bundle identifier to <main-app-bundle-id>.broadcast
  3. Replace SampleHandler.swift:
import LiveKit

#if os(iOS)
@available(macCatalyst 13.1, *)
class SampleHandler: LKSampleHandler {
    override var enableLogging: Bool { true }
}
#endif
  4. Add App Groups capability to both targets: group.<main-app-bundle-id>
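Once the extension is in place, screen share is started from the main app as before. A hedged sketch of selecting the broadcast path via capture options (the `useBroadcastExtension` flag and the `captureOptions:` parameter are assumptions; verify them against your SDK version):

```swift
import LiveKit

// Hedged sketch: request broadcast-based capture when enabling screen share.
// `useBroadcastExtension` is assumed to exist in your SDK version.
func startBroadcastScreenShare(room: Room) async throws {
    let options = ScreenShareCaptureOptions(useBroadcastExtension: true)
    try await room.localParticipant.setScreenShare(enabled: true, captureOptions: options)
}
```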

Capture App Audio

When using broadcast capture, you can capture app audio:
let roomOptions = RoomOptions(
    defaultScreenShareCaptureOptions: ScreenShareCaptureOptions(
        appAudio: true
    )
)

try await room.connect(url: url, token: token, roomOptions: roomOptions)
try await room.localParticipant.setScreenShare(enabled: true)
App audio is mixed with the microphone track when enabled.
For complete setup instructions, see the iOS Screen Sharing Guide.

Audio Session Management

LiveKit automatically manages the AVAudioSession on iOS:

Automatic Configuration

By default, the SDK configures the audio session appropriately:
  • .playback category when only receiving audio
  • .playAndRecord when publishing microphone

Manual Configuration

Disable automatic management for custom control:
AudioManager.shared.audioSession.isAutomaticConfigurationEnabled = false

// Configure manually
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, mode: .videoChat, options: [.mixWithOthers])
try session.setActive(true)

Audio Session Configurations

iOS provides several preset configurations:
// For playback only
AudioSessionConfiguration.playback

// For video calls (speaker)
AudioSessionConfiguration.playAndRecordSpeaker

// For voice calls (receiver)
AudioSessionConfiguration.playAndRecordReceiver
See AudioSessionConfiguration.swift:17 for available configurations.

Permissions

iOS requires explicit user permissions for camera and microphone access.

Info.plist Entries

Add these keys to your Info.plist:
<key>NSCameraUsageDescription</key>
<string>This app needs camera access for video calls</string>

<key>NSMicrophoneUsageDescription</key>
<string>This app needs microphone access for audio calls</string>

Requesting Permissions

import AVFoundation

// Request camera permission
let cameraStatus = AVCaptureDevice.authorizationStatus(for: .video)
if cameraStatus == .notDetermined {
    await AVCaptureDevice.requestAccess(for: .video)
}

// Request microphone permission
let micStatus = AVCaptureDevice.authorizationStatus(for: .audio)
if micStatus == .notDetermined {
    await AVCaptureDevice.requestAccess(for: .audio)
}

iOS Simulator Limitations

The following features are not supported on iOS Simulator:
  • Publishing camera tracks (camera capture)
  • Hardware-accelerated video encoding
Receiving and rendering video tracks works normally on the simulator.
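Since camera capture is unavailable on the Simulator, a compile-time guard can keep publishing code from running there. A minimal sketch:

```swift
import LiveKit

// Skip camera publishing when built for the iOS Simulator,
// where camera capture is not supported.
func enableCameraIfSupported(room: Room) async {
    #if targetEnvironment(simulator)
    print("Camera capture is not supported on the iOS Simulator")
    #else
    do {
        try await room.localParticipant.setCamera(enabled: true)
    } catch {
        print("Failed to enable camera: \(error)")
    }
    #endif
}
```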

SwiftUI Integration

The SDK provides SwiftUI components for iOS:
import SwiftUI
import LiveKit

struct VideoCallView: View {
    @StateObject private var room = Room()

    // Provided by your app (e.g. from your token server)
    let url: String
    let token: String

    var body: some View {
        VStack {
            // remoteParticipants is keyed by identity, so take the first value
            if let track = room.remoteParticipants.values.first?.videoTracks.first?.track as? VideoTrack {
                SwiftUIVideoView(track)
                    .frame(maxWidth: .infinity, maxHeight: .infinity)
            }
        }
        .onAppear {
            Task {
                try await room.connect(url: url, token: token)
            }
        }
    }
}

Thread Safety

VideoView is a UI component and must be accessed from the main thread.
// Correct: Update VideoView on main thread
DispatchQueue.main.async {
    videoView.track = track
}

// Or use MainActor
Task { @MainActor in
    videoView.track = track
}
Other SDK classes (Room, Participant, Track) can be accessed from any thread.

ScrollView Performance

For collection views with many video cells, disable rendering for off-screen cells:
// Disable rendering when cell scrolls off screen
videoView.isEnabled = false

// Re-enable when cell becomes visible
videoView.isEnabled = true
Avoid using UICollectionViewDelegate’s willDisplay/didEndDisplaying as they can be unreliable. Instead, use a timer to periodically check visible cells.
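The timer-based approach described above can be sketched as follows. `VideoCell` with a `videoView` property is a hypothetical cell type; the 0.5 s interval is an arbitrary choice:

```swift
import UIKit

// Hypothetical cell type holding a LiveKit VideoView
// (shown here with a plain UIView stand-in for self-containment).
final class VideoCell: UICollectionViewCell {
    let videoView = UIView() // in practice: VideoView
}

final class VideoGridViewController: UICollectionViewController {
    private var visibilityTimer: Timer?

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Periodically reconcile rendering state with actual visibility
        visibilityTimer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            self?.updateCellVisibility()
        }
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        visibilityTimer?.invalidate()
        visibilityTimer = nil
    }

    private func updateCellVisibility() {
        let visibleRect = CGRect(origin: collectionView.contentOffset,
                                 size: collectionView.bounds.size)
        for case let cell as VideoCell in collectionView.visibleCells {
            // Enable rendering only when the cell intersects the visible region;
            // with a real VideoView this would set videoView.isEnabled instead.
            cell.videoView.isHidden = !cell.frame.intersects(visibleRect)
        }
    }
}
```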

App Store Submission

The LiveKitWebRTC.xcframework does not include dSYM files. You may see this warning during submission:
The archive did not include a dSYM for the LiveKitWebRTC.framework
This warning will not prevent your app from being submitted or passing review.

Next Steps
Next Steps

CallKit Integration

Integrate with iOS CallKit for native call UI

Screen Sharing Guide

Complete screen sharing setup guide
