Module Overview
The server module (server/) runs on the remote Android device and handles screen/audio capture, encoding, and input event injection.
Package: org.server.scrcpy
Location: server/src/main/java/org/server/scrcpy/
Deployment: Packaged as an APK, renamed to scrcpy-server.jar, executed via app_process
Entry Point: Server.java
The main entry point for the server process.
File: Server.java (104 lines)
Main Method
public static void main(String... args) throws Exception {
    Thread.setDefaultUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler() {
        @Override
        public void uncaughtException(Thread t, Throwable e) {
            Ln.e("Exception on thread " + t, e);
        }
    });

    // Clean up old server JAR
    try {
        Process cmd = Runtime.getRuntime().exec("rm /data/local/tmp/scrcpy-server.jar");
        cmd.waitFor();
    } catch (IOException | InterruptedException e) {
        e.printStackTrace();
    }

    Options options = createOptions(args);
    scrcpy(options);
}
The server deletes its own JAR file on startup (the code is already loaded into memory by then), which prevents stale versions from being reused on subsequent connections.
Command Line Arguments
The server accepts the following arguments from app_process:
Position  Parameter       Description                          Example
0         IP Address      Client IP for connection validation  /127.0.0.1
1         Max Size        Maximum dimension (multiple of 8)    1920
2         Bit Rate        Video encoding bitrate (bps)         8000000
3         Tunnel Forward  Use ADB forward (optional)           true
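A minimal sketch of how these positional arguments could map onto an Options object. The field names here are illustrative assumptions, not the actual Options API:

```java
public class OptionsParser {
    // Hypothetical Options holder; the real class likely uses getters.
    public static final class Options {
        public String ip;
        public int maxSize;
        public int bitRate;
        public boolean tunnelForward;
    }

    // Parse the positional arguments described in the table above.
    public static Options createOptions(String... args) {
        Options options = new Options();
        options.ip = args[0];                        // e.g. "/127.0.0.1"
        options.maxSize = Integer.parseInt(args[1]); // must be a multiple of 8
        options.bitRate = Integer.parseInt(args[2]); // bits per second
        options.tunnelForward = args.length > 3 && Boolean.parseBoolean(args[3]);
        return options;
    }

    public static void main(String[] args) {
        Options o = createOptions("/127.0.0.1", "1920", "8000000", "true");
        System.out.println(o.maxSize + " " + o.bitRate + " " + o.tunnelForward);
    }
}
```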
Initialization Flow
private static void scrcpy(Options options) throws IOException {
    Workarounds.apply(); // Initialize system service wrappers
    final Device device = new Device(options);
    try (DroidConnection connection = DroidConnection.open(ip)) {
        ScreenEncoder screenEncoder = new ScreenEncoder(options.getBitRate());
        // Start event controller (async)
        startEventController(device, connection);
        // Start screen streaming (sync)
        screenEncoder.streamScreen(device, connection.getOutputStream());
    }
}
Core Components
DroidConnection
Manages the TCP socket connection to the client.
File: DroidConnection.java (93 lines)
Socket Setup
private static Socket listenAndAccept() throws IOException {
    ServerSocket serverSocket = new ServerSocket(7007);
    Socket sock = null;
    try {
        sock = serverSocket.accept();
    } finally {
        serverSocket.close();
    }
    return sock;
}

public static DroidConnection open(String ip) throws IOException {
    Socket socket = listenAndAccept();
    // Validate the client IP (warning only; the connection is still accepted)
    if (!socket.getInetAddress().toString().equals(ip)) {
        Ln.w("socket connect address != " + ip);
    }
    return new DroidConnection(socket);
}
Control Event Reception
Reads 20-byte touch/key events from the client:
public int[] NewreceiveControlEvent() throws IOException {
    byte[] buf = new byte[20];
    int n = inputStream.read(buf, 0, 20);
    if (n == -1) {
        throw new EOFException("Event controller socket closed");
    }
    // Convert 20 big-endian bytes to 5 integers
    final int[] array = new int[5];
    for (int i = 0; i < array.length; i++) {
        array[i] = (((int) buf[i * 4]     << 24) & 0xFF000000)
                 | (((int) buf[i * 4 + 1] << 16) & 0x00FF0000)
                 | (((int) buf[i * 4 + 2] << 8)  & 0x0000FF00)
                 | ( (int) buf[i * 4 + 3]        & 0x000000FF);
    }
    return array; // [action, button, x, y, pointerId]
}
ScreenEncoder
Encodes screen content to H.264 video stream.
File: ScreenEncoder.java (268 lines)
Encoding Configuration
private static MediaFormat createFormat(int bitRate, int frameRate,
        int iFrameInterval) {
    MediaFormat format = new MediaFormat();
    format.setString(MediaFormat.KEY_MIME, "video/avc");
    format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate); // 60 fps
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, iFrameInterval); // 10 s
    // Repeat frames to maintain quality during idle periods
    format.setLong(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER,
            MICROSECONDS_IN_ONE_SECOND * REPEAT_FRAME_DELAY / frameRate);
    return format;
}
Stream Pipeline
Send Device Resolution
The stream begins with a header carrying the device width and height as 32-bit integers:
int[] buf = new int[]{screenWidth, screenHeight};
byte[] array = new byte[buf.length * 4];
// Convert to bytes and send
outputStream.write(array, 0, array.length);
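The int-to-byte conversion elided by the comment above can be sketched with ByteBuffer, assuming the header uses network byte order (big-endian) like the control events:

```java
import java.nio.ByteBuffer;

public class ResolutionHeader {
    // Encode width and height as two big-endian 32-bit integers,
    // filling in the conversion the comment above elides.
    static byte[] encode(int screenWidth, int screenHeight) {
        return ByteBuffer.allocate(8)
                .putInt(screenWidth)
                .putInt(screenHeight)
                .array();
    }

    public static void main(String[] args) {
        byte[] header = encode(1080, 1920);
        ByteBuffer bb = ByteBuffer.wrap(header);
        System.out.println(bb.getInt() + "x" + bb.getInt()); // 1080x1920
    }
}
```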
Start Audio Capture
The audio encoder is launched in a separate thread:
private void startAudioCapture(OutputStream outputStream) {
    new Thread(() -> {
        AudioEncoder audioEncoder = new AudioEncoder(128000);
        audioEncoder.streamScreen(outputStream);
    }).start();
}
Create MediaCodec
Configure an H.264 encoder with the screen dimensions:
MediaCodec codec = MediaCodec.createEncoderByType("video/avc");
setSize(format, videoRect.width(), videoRect.height());
codec.configure(format, null, null, CONFIGURE_FLAG_ENCODE);
Setup Screen Capture
Create the encoder's input surface and start capture:
Surface surface = codec.createInputSurface();
capture.start(surface);
codec.start();
Encode Loop
Read encoded frames and send them to the client:
boolean alive = encode(codec, outputStream);
Encoding Loop
private boolean encode(MediaCodec codec, OutputStream outputStream) {
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    boolean eof = false;
    while (!consumeRotationChange() && !eof) {
        int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1);
        eof = (bufferInfo.flags & BUFFER_FLAG_END_OF_STREAM) != 0;
        if (outputBufferId >= 0) {
            ByteBuffer outputBuffer = codec.getOutputBuffer(outputBufferId);
            if (bufferInfo.size > 0 && outputBuffer != null) {
                byte[] data = new byte[outputBuffer.remaining()];
                outputBuffer.get(data);
                // Determine frame type
                VideoPacket.Flag flag;
                if ((bufferInfo.flags & BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    flag = VideoPacket.Flag.CONFIG; // SPS/PPS
                } else if ((bufferInfo.flags & BUFFER_FLAG_KEY_FRAME) != 0) {
                    flag = VideoPacket.Flag.KEY_FRAME;
                } else {
                    flag = VideoPacket.Flag.FRAME;
                }
                VideoPacket packet = new VideoPacket(
                        MediaPacket.Type.VIDEO, flag,
                        bufferInfo.presentationTimeUs, data);
                outputStream.write(packet.toByteArray());
            }
            codec.releaseOutputBuffer(outputBufferId, false);
        }
    }
    return !eof;
}
AudioEncoder
Encodes device audio to AAC stream.
File: AudioEncoder.java (247 lines)
Audio Capture
Uses direct audio capture from system output:
private final AudioCapture capture =
        new AudioDirectCapture(AudioSource.OUTPUT);

private static MediaFormat createFormat(int bitRate) {
    MediaFormat format = new MediaFormat();
    format.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
    format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate); // 128 kbps
    format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 2);
    format.setInteger(MediaFormat.KEY_SAMPLE_RATE, 48000);
    return format;
}
Callback-Based Encoding
Uses MediaCodec.Callback for efficient asynchronous encoding (API 23+):
private class EncoderCallback extends MediaCodec.Callback {
    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        ByteBuffer buffer = codec.getInputBuffer(index);
        int r = capture.read(buffer, bufferInfo);
        if (r <= 0) {
            end();
            return;
        }
        codec.queueInputBuffer(index, bufferInfo.offset, bufferInfo.size,
                bufferInfo.presentationTimeUs, bufferInfo.flags);
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index,
            MediaCodec.BufferInfo info) {
        ByteBuffer outputBuffer = codec.getOutputBuffer(index);
        byte[] data = new byte[outputBuffer.remaining()];
        outputBuffer.get(data);
        AudioPacket packet = new AudioPacket(
                MediaPacket.Type.AUDIO, flag,
                info.presentationTimeUs, data);
        outputStream.write(packet.toByteArray());
        codec.releaseOutputBuffer(index, false);
    }
}
ScreenCapture
Captures screen content using Android’s display capture APIs.
File: ScreenCapture.java (85 lines)
Virtual Display Creation
public void start(Surface surface) {
    ScreenInfo screenInfo = device.getScreenInfo();
    Rect videoRect = screenInfo.getVideoSize().toRect();
    try {
        // Prefer the DisplayManager API (modern approach)
        virtualDisplay = ServiceManager.getDisplayManager()
                .createVirtualDisplay(
                        "scrcpy",
                        videoRect.width(),
                        videoRect.height(),
                        0, // density
                        surface);
        Ln.d("Display: using DisplayManager API");
    } catch (Exception displayManagerException) {
        // Fall back to SurfaceControl (older method)
        try {
            display = createDisplay();
            setDisplaySurface(display, surface, deviceRect, videoRect);
        } catch (Exception surfaceControlException) {
            throw new AssertionError("Could not create display");
        }
    }
}
On Android 12+, secure displays cannot be created with shell permissions. The implementation handles this by checking SDK version and adapting the display creation strategy.
EventController
Injects touch and keyboard events into the Android system.
File: EventController.java (270 lines)
Control Loop
public void control() throws IOException {
    turnScreenOn(); // Ensure the screen is on
    while (true) {
        int[] buffer = connection.NewreceiveControlEvent();
        if (buffer != null) {
            long now = SystemClock.uptimeMillis();
            // A keycode event is signaled by X == 0 and Y == 0
            if (buffer[2] == 0 && buffer[3] == 0) {
                injectKeycode(buffer[0]);
            } else {
                // Touch event with multi-touch support
                int action = buffer[0];
                int button = buffer[1];
                Point point = new Point(buffer[2], buffer[3]);
                long pointerId = buffer[4];
                Point physicalPoint = device.NewgetPhysicalPoint(point);
                injectTouch(action, pointerId, physicalPoint, button);
            }
        }
    }
}
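The dispatch rule above can be isolated into a small predicate. Note a consequence of this in-band signaling: a touch at exactly (0, 0) is indistinguishable from a keycode event and cannot be represented on the wire:

```java
public class EventDispatch {
    // Mirrors the control loop's dispatch rule: a decoded event with
    // x == 0 and y == 0 is treated as a keycode, anything else as a touch.
    static boolean isKeycodeEvent(int[] buffer) {
        return buffer[2] == 0 && buffer[3] == 0;
    }

    public static void main(String[] args) {
        // buffer layout: [action, button, x, y, pointerId]
        System.out.println(isKeycodeEvent(new int[]{24, 0, 0, 0, 0}));    // keycode: true
        System.out.println(isKeycodeEvent(new int[]{0, 1, 540, 960, 0})); // touch: false
    }
}
```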
Multi-Touch Support
Implements pointer state tracking for simultaneous touches:
private boolean injectTouch(int action, long pointerId,
        Point point, int button) {
    long now = SystemClock.uptimeMillis();
    // Get or create a pointer index for this ID
    int pointerIndex = pointersState.getPointerIndex(pointerId);
    if (pointerIndex == -1) {
        Ln.w("Too many pointers for touch event");
        return false;
    }
    Pointer pointer = pointersState.get(pointerIndex);
    pointer.setPoint(point);
    pointer.setPressure(1.0f);

    // Configure pointer properties
    pointerProperties[pointerIndex].toolType = MotionEvent.TOOL_TYPE_FINGER;
    int source = InputDevice.SOURCE_TOUCHSCREEN;

    // Handle ACTION_POINTER_UP/DOWN for secondary pointers
    boolean pointerUp = action == MotionEvent.ACTION_UP;
    int actionType = action & MotionEvent.ACTION_MASK;
    if (actionType == MotionEvent.ACTION_POINTER_UP) {
        pointerUp = true;
    }
    pointer.setUp(pointerUp);

    int pointerCount = pointersState.update(pointerProperties, pointerCoords);

    // Adjust the action for secondary pointers
    if (pointerCount > 1) {
        if (action == MotionEvent.ACTION_UP
                || actionType == MotionEvent.ACTION_POINTER_UP) {
            action = MotionEvent.ACTION_POINTER_UP
                    | (pointerIndex << MotionEvent.ACTION_POINTER_INDEX_SHIFT);
        } else if (action == MotionEvent.ACTION_DOWN
                || actionType == MotionEvent.ACTION_POINTER_DOWN) {
            action = MotionEvent.ACTION_POINTER_DOWN
                    | (pointerIndex << MotionEvent.ACTION_POINTER_INDEX_SHIFT);
        }
    }

    // Create and inject the event
    MotionEvent event = MotionEvent.obtain(
            lastMouseDown, now, action, pointerCount,
            pointerProperties, pointerCoords,
            0, button, 1f, 1f, 0, 0, source, 0);
    return injectEvent(event);
}
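The action adjustment packs the pointer index into the high byte of the action value. This can be verified in isolation; the constants below mirror the documented values of android.view.MotionEvent so the sketch runs off-device:

```java
public class PointerActions {
    // Constants mirror android.view.MotionEvent's documented values.
    static final int ACTION_POINTER_DOWN = 5;
    static final int ACTION_POINTER_UP = 6;
    static final int ACTION_POINTER_INDEX_SHIFT = 8;
    static final int ACTION_MASK = 0xFF;

    // Encode a secondary-pointer action the way injectTouch does.
    static int pointerAction(int baseAction, int pointerIndex) {
        return baseAction | (pointerIndex << ACTION_POINTER_INDEX_SHIFT);
    }

    public static void main(String[] args) {
        int action = pointerAction(ACTION_POINTER_DOWN, 1);
        System.out.println(Integer.toHexString(action));          // "105"
        System.out.println(action & ACTION_MASK);                 // 5: the base action
        System.out.println(action >> ACTION_POINTER_INDEX_SHIFT); // 1: the pointer index
    }
}
```

Decoding with ACTION_MASK recovers the base action, which is exactly how the `action & MotionEvent.ACTION_MASK` check in injectTouch works.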
Event Injection
private boolean injectEvent(InputEvent event) {
    return device.injectInputEvent(event,
            InputManager.INJECT_INPUT_EVENT_MODE_ASYNC);
}

private boolean injectKeycode(int keyCode) {
    return injectKeyEvent(KeyEvent.ACTION_DOWN, keyCode, 0, 0)
            && injectKeyEvent(KeyEvent.ACTION_UP, keyCode, 0, 0);
}
System Service Wrappers
The server module includes wrappers for hidden Android system APIs:
Available Wrappers
ServiceManager: Access system services via reflection
DisplayManager: Manage displays and virtual displays
InputManager: Inject input events
SurfaceControl: Low-level surface manipulation
WindowManager: Window and display properties
PowerManager: Screen power state control
Location: server/src/main/java/org/server/scrcpy/wrappers/
Server Packaging and Deployment
Build Process
The server module uses a custom Gradle task to package the APK:
tasks.register('copyServer', Copy) {
    def buildType = gradle.startParameter.taskNames.any {
        it.endsWith('Release')
    } ? 'Release' : 'Debug'
    dependsOn 'deleteServer'
    dependsOn("assemble${buildType}")

    def release_file = 'build/outputs/apk/release/server-release-unsigned.apk'
    def debug_file = 'build/outputs/apk/debug/server-debug.apk'
    def file_dest = '../app/src/main/assets/'

    if (buildType == "Debug") {
        from file(debug_file)
    } else {
        from file(release_file)
    }
    into file(file_dest)
    rename { fileName -> 'scrcpy-server.jar' }
}
Deployment Flow
Build
Gradle builds the server module as a standard Android APK.
Copy
The APK is copied to the client's assets and renamed to .jar.
Extract
The client extracts the JAR from its assets at runtime.
Push
The client pushes the JAR to /data/local/tmp/ via ADB.
Execute
The client launches the server using app_process:
CLASSPATH=/data/local/tmp/scrcpy-server.jar \
    app_process / org.server.scrcpy.Server \
    /<client_ip> <max_size> <bitrate>
Cleanup
The server deletes its own JAR on startup.
The .jar extension is used even though the file is an APK because app_process can execute DEX code from any file, and this naming avoids confusion with installable APKs.
Rotation Handling
The server detects screen rotation and signals the client to reconfigure:
public interface RotationListener {
    void onRotationChanged(int rotation);
}

@Override
public void onRotationChanged(int rotation) {
    rotationChanged.set(true);
}

// In the encoding loop
while (!consumeRotationChange() && !eof) {
    // Encode frames...
}
// When a rotation is detected, the loop exits and the encoder is restarted
// When rotation detected, break loop and restart encoder
Frame Rate Control
Default: 60 fps
I-frame interval: 10 seconds
Repeat frame delay: 6 frames (100 ms at 60 fps)
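The repeat-frame delay quoted above follows directly from the KEY_REPEAT_PREVIOUS_FRAME_AFTER formula in createFormat; a quick arithmetic check:

```java
public class RepeatFrameDelay {
    static final long MICROSECONDS_IN_ONE_SECOND = 1_000_000L;
    static final int REPEAT_FRAME_DELAY = 6; // frames

    public static void main(String[] args) {
        int frameRate = 60;
        // Same expression as in createFormat: delay before resubmitting
        // the previous frame when the screen content is static.
        long delayUs = MICROSECONDS_IN_ONE_SECOND * REPEAT_FRAME_DELAY / frameRate;
        System.out.println(delayUs); // 100000 microseconds = 100 ms
    }
}
```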
Bitrate Settings
Video: User-configurable (typically 2-8 Mbps)
Audio: Fixed at 128 kbps
Memory Management
Packet size validation prevents memory exhaustion:
if (size > 4 * 1024 * 1024) { // 4 MB limit
    // Disconnect: packet too large
    serviceCallbacks.errorDisconnect();
}