Overview
Ant Media Server is built on a modular architecture that handles live streaming from ingestion through delivery. Understanding the core components helps you optimize performance, troubleshoot issues, and design scalable deployments.
High-Level Architecture
Core Components
1. Application Adapter
The entry point for all streaming operations (AntMediaApplicationAdapter.java:117). It is responsible for:
- Accept incoming connections (RTMP, WebRTC, SRT)
- Authenticate publishers and viewers
- Manage stream lifecycle
- Coordinate between components
- Handle webhook callbacks
- Multi-threaded request handling
- Stream security filters
- Cluster communication
- Analytics integration
2. Stream Adaptors
Adaptors convert protocol-specific streams into a common format.
RTMPAdaptor
Handles RTMP stream ingestion (RTMPAdaptor.java):
- RTMP handshake negotiation
- FLV tag parsing
- Audio/video stream extraction
- Metadata processing
SRTAdaptor
Handles SRT (Secure Reliable Transport) streams (SRTAdaptor.java):
- Automatic Repeat Request (ARQ)
- Forward Error Correction (FEC)
- Encryption support
- Latency control
WebRTC Adaptor
Manages WebRTC peer connections (IWebRTCAdaptor.java):
- Signaling server integration
- STUN/TURN server communication
- RTP packet processing
- RTCP statistics collection
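Ant Media's WebRTC signaling runs over a WebSocket and exchanges JSON commands. The messages below are an illustrative sketch based on the public WebSocket API; check the signaling documentation for your server version before relying on exact field names:

```json
{ "command": "publish", "streamId": "stream1" }
{ "command": "takeCandidate", "streamId": "stream1", "label": 0, "id": "0", "candidate": "candidate:..." }
```

The adaptor answers with SDP offers/answers and ICE candidates, after which media flows over SRTP.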
3. MuxAdaptor
The central media processing pipeline (MuxAdaptor.java):
- Initialize muxers for each output format
- Coordinate transcoding operations
- Distribute packets to all active muxers
- Monitor stream health
- Manage adaptive bitrate profiles
4. Muxers
Muxers package media for different delivery formats.
HLSMuxer
Creates HLS playlists and segments (HLSMuxer.java:40):
- Generate M3U8 playlists
- Create TS or fMP4 segments
- Manage segment rotation
- Upload to S3/CDN
- Support for low-latency HLS
See HLSMuxer.java:174-206 for the implementation.
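The playlist the HLSMuxer maintains is a rolling M3U8 media playlist. A minimal example of what a live playlist looks like (segment naming and durations here are illustrative, not Ant Media's exact output):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:4.0,
stream1_000120.ts
#EXTINF:4.0,
stream1_000121.ts
#EXTINF:4.0,
stream1_000122.ts
```

As new segments are written, `#EXT-X-MEDIA-SEQUENCE` advances and old segments rotate out of the playlist (and, if configured, are uploaded to S3/CDN).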
Mp4Muxer
Records streams to MP4 files (Mp4Muxer.java):
- Fast-start MP4 (moov before mdat)
- Automatic file rotation
- Date/time stamped filenames
- Cloud storage upload
RecordMuxer
Coordinates all recording formats (RecordMuxer.java).
WebMMuxer
Creates WebM format recordings (WebMMuxer.java).
5. Encoder/Transcoder
Handles video and audio transcoding for adaptive bitrate.
Encoder Selection (AppSettings.java:1377):
Hardware encoders:
- h264_nvenc - NVIDIA NVENC
- h264_qsv - Intel Quick Sync
- h264_vaapi - Linux VA-API
- h264_videotoolbox - Apple VideoToolbox
Software encoders:
- libx264 - H.264 software encoder
- libopenh264 - Cisco OpenH264
- libvpx - VP8/VP9 encoder
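The usual strategy is to prefer a hardware encoder when the host exposes one and fall back to software. A minimal sketch of that fallback ordering (illustrative only, not Ant Media's actual selection code; the `available` map stands in for whatever capability probe your deployment uses):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class EncoderSelector {
    // Priority mirrors the list above: hardware encoders first,
    // then the software fallback. Names are FFmpeg encoder identifiers.
    static final List<String> PRIORITY = List.of(
            "h264_nvenc", "h264_qsv", "h264_vaapi", "h264_videotoolbox",
            "libx264", "libopenh264");

    /** Returns the first encoder in priority order reported as available. */
    static String select(Map<String, Boolean> available) {
        for (String name : PRIORITY) {
            if (available.getOrDefault(name, false)) {
                return name;
            }
        }
        return "libx264"; // software encoding is the last-resort default
    }

    public static void main(String[] args) {
        Map<String, Boolean> caps = new LinkedHashMap<>();
        caps.put("h264_nvenc", false); // no NVIDIA GPU on this host
        caps.put("h264_vaapi", true);  // Linux VA-API present
        caps.put("libx264", true);
        System.out.println(select(caps)); // h264_vaapi outranks libx264
    }
}
```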
6. DataStore
Persists stream metadata and configuration.
Implementations:
- MongoDB - Production deployments
- MapDB - Development/testing
- Redis - Cluster mode metadata
Stored data:
- Broadcast information (stream ID, status, quality)
- Encoder settings
- Stream statistics
- Viewer counts
- VoD metadata
- Subscriber/token information
7. Storage Client
Handles cloud storage integration.
Supported providers:
- Amazon S3
- Azure Blob Storage
- Google Cloud Storage
- S3-compatible (MinIO, DigitalOcean Spaces)
Operations:
- Upload HLS segments
- Upload MP4 recordings
- Upload preview images
- Delete on completion
- Manage storage classes
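Storage is configured per application. A sketch of the relevant settings in the app's properties file, using key names from Ant Media's documented S3 settings (verify against the red5-web.properties of your version; values here are placeholders):

```properties
settings.s3RecordingEnabled=true
settings.s3AccessKey=YOUR_ACCESS_KEY
settings.s3SecretKey=YOUR_SECRET_KEY
settings.s3BucketName=my-streams
settings.s3RegionName=us-east-1
# For S3-compatible providers (MinIO, DigitalOcean Spaces), point at a custom endpoint:
settings.s3Endpoint=https://minio.example.com
```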
Streaming Pipeline
Ingest to Delivery Flow
Packet Flow Detail
1. Receive (AppSettings.java:1001-1003)
- Protocol-specific adaptor accepts connection
- Authenticates publisher
- Extracts codec parameters
2. Process (Muxer.java:918)
- Initialize streams in all muxers
- Configure bitstream filters
- Setup time base mapping
3. Encode (if ABR enabled)
- Decode source stream
- Encode to multiple bitrates
- Apply encoder-specific parameters
4. Mux (Muxer.java:1204)
- Convert timestamps
- Apply bitstream filters
- Write to container format
5. Deliver
- HLS: Upload segments to storage
- WebRTC: Forward via SRTP
- Recording: Write to local/remote file
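The "Convert timestamps" part of the Mux step deserves a concrete example. Each container uses its own time base (RTP video ticks at 90 kHz, MP4 tracks often use milliseconds), so every packet timestamp must be rescaled on the way into a muxer. A minimal sketch of the arithmetic (the same job FFmpeg's av_rescale_q performs; this is not Ant Media's actual code):

```java
public class TimestampRescaler {
    /**
     * Re-expresses a timestamp given in the source time base
     * (srcNum/srcDen seconds per tick) in the destination time base,
     * rounding to the nearest destination tick.
     */
    static long rescale(long ts, int srcNum, int srcDen, int dstNum, int dstDen) {
        long num = ts * srcNum * (long) dstDen; // value in dst ticks, scaled by den
        long den = (long) srcDen * dstNum;
        return (num + den / 2) / den;           // round half up
    }

    public static void main(String[] args) {
        long pts90k = 450_000; // 5 seconds of video at the 90 kHz RTP clock
        // Rescale into an MP4 track using a 1/1000 (millisecond) time base:
        System.out.println(rescale(pts90k, 1, 90_000, 1, 1_000)); // 5000
    }
}
```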
Cluster Architecture
For high availability and scalability:
Cluster Components
Origin Servers:
- Accept publisher connections
- Perform transcoding
- Generate HLS/DASH
- Upload to shared storage
Edge Servers:
- Serve viewers
- Pull streams from origins
- Cache segments locally
- Reduce origin load
Shared DataStore:
- Synchronized stream metadata
- Cluster node status
- Viewer distribution
Performance Considerations
Resource Usage
Per Stream (SFU Mode - No Transcoding):
- CPU: 5-10%
- RAM: 50-100 MB
- Bandwidth: Source bitrate × viewer count
Per Stream (ABR Transcoding Enabled):
- CPU: 50-150% (software) or 10-20% (hardware)
- RAM: 200-500 MB
- Bandwidth: Sum of all rendition bitrates × viewer count
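A quick way to sanity-check capacity planning is to turn the bandwidth rule above into arithmetic. The sketch below uses the formula as stated (sum of rendition bitrates × viewers); note this is an upper bound, since in practice each viewer pulls only one rendition at a time:

```java
public class EgressEstimator {
    /**
     * Worst-case egress in Mbps for an ABR stream: every viewer is
     * charged the sum of all rendition bitrates (upper bound; a real
     * viewer receives a single rendition).
     */
    static double egressMbps(double[] renditionsKbps, int viewers) {
        double sumKbps = 0;
        for (double r : renditionsKbps) {
            sumKbps += r;
        }
        return sumKbps * viewers / 1000.0; // kbps -> Mbps
    }

    public static void main(String[] args) {
        // Hypothetical ladder: 4000, 1500 and 500 kbps, 100 viewers
        System.out.println(egressMbps(new double[]{4000, 1500, 500}, 100)); // 600.0
    }
}
```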
Optimization Strategies
1. Use Hardware Encoding
Monitoring & Debugging
Key Metrics
Server Level:
- CPU and RAM usage
- Network throughput
- Active stream count
- Encoder queue depth
Stream Level:
- Bitrate (video/audio)
- Frame rate
- Packet loss
- Viewer count
- Encoding speed
REST API Monitoring
Logging
Component-specific logging:
Security Architecture
Stream Security
Token-based Authentication (AppSettings.java:1083-1093)
Transport Security
- WebRTC: DTLS/SRTP encryption (built-in)
- HTTPS: SSL/TLS for HLS/DASH delivery
- RTMPS: TLS encryption for RTMP
- SRT: Built-in AES encryption
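Token-based stream auth generally boils down to signing the stream id and an expiry with a server-side secret, then requiring the token on publish/play. The sketch below is a generic HMAC scheme, not Ant Media's exact implementation (Ant Media supports one-time tokens and JWT, governed by the AppSettings referenced above):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class StreamToken {
    /**
     * Derives a token by HMAC-SHA256-signing "streamId:expiry" with a
     * shared secret; the server recomputes and compares on connect.
     */
    static String sign(String streamId, long expiresEpochSec, String secret) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(
                    secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
            byte[] sig = mac.doFinal((streamId + ":" + expiresEpochSec)
                    .getBytes(StandardCharsets.UTF_8));
            // base64url without padding keeps the token query-string safe
            return Base64.getUrlEncoder().withoutPadding().encodeToString(sig);
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        String token = sign("stream1", 1_700_000_000L, "change-me");
        System.out.println(token.length()); // 43: a 32-byte HMAC, base64url-encoded
    }
}
```

Tokens produced this way are stateless: any node in a cluster can verify them with only the shared secret, which fits the origin/edge topology described above.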
Next Steps
Streaming Protocols
Explore supported protocols in detail
Adaptive Bitrate
Configure ABR transcoding
Ultra-Low Latency
Implement WebRTC streaming
API Reference
View REST API documentation
