Overview
The VideoDecoder class provides hardware-accelerated H.264/AVC video decoding using Android’s MediaCodec API. It runs in a separate thread and renders decoded frames directly to a Surface.
Package: org.client.scrcpy.decoder
Video Codec: video/avc (H.264/AVC)
Initialization
start()
Starts the decoder worker thread.
Example:
VideoDecoder videoDecoder = new VideoDecoder();
videoDecoder.start();
stop()
Stops the decoder and releases resources.
Example:
@Override
protected void onDestroy() {
    if (videoDecoder != null) {
        videoDecoder.stop();
    }
    super.onDestroy();
}
Configuration
Configures the MediaCodec decoder with stream parameters.
public void configure(Surface surface, int width, int height,
                      ByteBuffer csd0, ByteBuffer csd1)
surface - Target surface for rendering decoded frames
width - Video width in pixels
height - Video height in pixels
csd0 - Codec-specific data 0 (SPS - Sequence Parameter Set)
csd1 - Codec-specific data 1 (PPS - Picture Parameter Set)
Example:
VideoPacket.StreamSettings settings = VideoPacket.getStreamSettings(configData);
videoDecoder.configure(surface, 1920, 1080, settings.sps, settings.pps);
Internal Implementation:
// From source (VideoDecoder.java:68-78)
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setByteBuffer("csd-0", csd0);
format.setByteBuffer("csd-1", csd1);
mCodec = MediaCodec.createDecoderByType("video/avc");
mCodec.configure(format, surface, null, 0);
mCodec.start();
Calling configure() while already configured will stop the current decoder and create a new one. This allows dynamic reconfiguration for resolution changes.
Decoding
decodeSample()
Queues an encoded video sample for decoding.
public void decodeSample(byte[] data, int offset, int size,
                         long presentationTimeUs, int flags)
data - Byte array containing the encoded video frame
offset - Starting position in the data array
size - Number of bytes to decode
presentationTimeUs - Presentation timestamp in microseconds
flags - MediaCodec flags (e.g., MediaCodec.BUFFER_FLAG_KEY_FRAME)
Example:
// Decode a key frame
videoDecoder.decodeSample(
        packet,                                    // data
        VideoPacket.getHeadLen(),                  // offset
        packet.length - VideoPacket.getHeadLen(),  // size
        0,                                         // presentationTimeUs
        VideoPacket.Flag.KEY_FRAME.getFlag()       // flags
);
Frame Processing:
// From source (VideoDecoder.java:86-98)
int index = mCodec.dequeueInputBuffer(-1);
if (index >= 0) {
    ByteBuffer buffer;
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.LOLLIPOP) {
        buffer = mCodec.getInputBuffers()[index];
        buffer.clear();
    } else {
        buffer = mCodec.getInputBuffer(index);
    }
    if (buffer != null) {
        buffer.put(data, offset, size);
        mCodec.queueInputBuffer(index, 0, size, presentationTimeUs, flags);
    }
}
VideoPacket Structure
Video data is transmitted using the VideoPacket structure.
Header (10 bytes):
Byte 0: Type (0 = VIDEO)
Byte 1: Flag (frame type)
Bytes 2-9: Presentation timestamp (long)
Total packet structure:
[Type][Flag][Timestamp][Data...]
1B 1B 8B Variable
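As a rough sketch of how a receiver might read this header, the layout above can be parsed with a ByteBuffer. The class and field names here are illustrative (not from the source), and big-endian byte order for the 8-byte timestamp is an assumption:

```java
import java.nio.ByteBuffer;

// Hypothetical helper: parses the 10-byte VideoPacket header described
// above. ByteBuffer defaults to big-endian, which is assumed here.
public class PacketHeader {
    public final int type;   // Byte 0: 0 = VIDEO
    public final int flag;   // Byte 1: frame type
    public final long ptsUs; // Bytes 2-9: presentation timestamp

    public PacketHeader(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet);
        this.type = buf.get() & 0xFF;  // unsigned byte 0
        this.flag = buf.get() & 0xFF;  // unsigned byte 1
        this.ptsUs = buf.getLong();    // bytes 2-9
    }
}
```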
Flag Types
Value: 0 - Regular P-frame or B-frame
Value: 1 - I-frame (keyframe)
Value: 2 - Configuration data (SPS/PPS)
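The flag values above can be mirrored in an enum along the lines of the VideoPacket.Flag used in the examples. This is a sketch; the actual enum in the source may declare additional values (e.g. END):

```java
// Sketch of an enum mapping the flag values listed above to names.
public enum FrameFlag {
    FRAME(0),     // regular P-frame or B-frame
    KEY_FRAME(1), // I-frame (keyframe)
    CONFIG(2);    // configuration data (SPS/PPS)

    private final int value;

    FrameFlag(int value) {
        this.value = value;
    }

    // Wire value for this flag, as written into byte 1 of the header
    public int getFlag() {
        return value;
    }

    // Reverse lookup from a received wire value
    public static FrameFlag fromValue(int value) {
        for (FrameFlag f : values()) {
            if (f.value == value) {
                return f;
            }
        }
        throw new IllegalArgumentException("Unknown flag value: " + value);
    }
}
```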
StreamSettings
Extracted from CONFIG packets containing SPS/PPS data.
public static class StreamSettings {
    public ByteBuffer pps; // Picture Parameter Set
    public ByteBuffer sps; // Sequence Parameter Set
}
Example:
byte[] configData = new byte[dataLength];
System.arraycopy(packet, VideoPacket.getHeadLen(), configData, 0, dataLength);
VideoPacket.StreamSettings settings = VideoPacket.getStreamSettings(configData);
// Settings contain:
// settings.sps - Sequence Parameter Set
// settings.pps - Picture Parameter Set
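Internally, getStreamSettings has to split the config blob into its two parameter sets. One plausible approach (an assumption about the wire format, not the library's actual code) is to split at the Annex-B start code (0x00000001) that introduces the PPS:

```java
import java.nio.ByteBuffer;

// Sketch: split Annex-B config data (SPS then PPS, each prefixed with a
// 0x00000001 start code) at the second start code. The layout is an
// assumption, not necessarily what VideoPacket.getStreamSettings does.
public class ConfigSplitter {
    public static ByteBuffer[] splitSpsPps(byte[] config) {
        // Skip past the first start code, then scan for the next one.
        for (int i = 4; i + 3 < config.length; i++) {
            if (config[i] == 0 && config[i + 1] == 0
                    && config[i + 2] == 0 && config[i + 3] == 1) {
                ByteBuffer sps = ByteBuffer.wrap(config, 0, i).slice();
                ByteBuffer pps =
                        ByteBuffer.wrap(config, i, config.length - i).slice();
                return new ByteBuffer[] { sps, pps };
            }
        }
        throw new IllegalArgumentException("No PPS start code found");
    }
}
```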
Usage Example
Complete example of setting up and using the video decoder:
public class VideoPlayer {
    private VideoDecoder videoDecoder;
    private Surface surface;

    public void startVideoPlayback() {
        // 1. Initialize decoder
        videoDecoder = new VideoDecoder();
        videoDecoder.start();

        // 2. Wait for configuration packet
        VideoPacket configPacket = receivePacket();
        if (configPacket.flag == VideoPacket.Flag.CONFIG) {
            // Extract SPS/PPS
            byte[] configData = extractData(configPacket);
            VideoPacket.StreamSettings settings =
                    VideoPacket.getStreamSettings(configData);

            // Configure decoder
            videoDecoder.configure(surface, 1920, 1080,
                    settings.sps, settings.pps);
        }

        // 3. Decode frames
        while (isPlaying) {
            VideoPacket packet = receivePacket();
            if (packet.flag == VideoPacket.Flag.KEY_FRAME ||
                    packet.flag == VideoPacket.Flag.FRAME) {
                videoDecoder.decodeSample(
                        packet.data,
                        VideoPacket.getHeadLen(),
                        packet.data.length - VideoPacket.getHeadLen(),
                        packet.presentationTimeStamp,
                        packet.flag.getFlag()
                );
            }
            if (packet.flag == VideoPacket.Flag.END) {
                break;
            }
        }

        // 4. Cleanup
        videoDecoder.stop();
    }
}
Threading Model
The VideoDecoder uses an internal Worker thread:
// From source (VideoDecoder.java:104-125)
@Override
public void run() {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (mIsRunning.get()) {
        if (mIsConfigured.get()) {
            int index = mCodec.dequeueOutputBuffer(info, 0);
            if (index >= 0) {
                // Render frame onto Surface
                mCodec.releaseOutputBuffer(index, true);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM)
                        == MediaCodec.BUFFER_FLAG_END_OF_STREAM) {
                    break;
                }
            }
        } else {
            // Wait for configuration
            try {
                Thread.sleep(5);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
        }
    }
}
The decoder uses hardware acceleration. Ensure that:
- The device supports H.264/AVC hardware decoding
- The Surface is properly initialized before calling configure()
- Frame rates don't exceed device capabilities
Key Points:
- Decoded frames render directly to the Surface (zero-copy)
- Input buffer dequeue timeout is -1 (wait indefinitely)
- Output buffer dequeue timeout is 0 (non-blocking)
- Calling releaseOutputBuffer(index, true) renders the frame to the Surface
See Also