
Android Screen Sharing + WebSocket: Transmitting Screenshots (Kotlin)

The overall flow: the client uses MediaProjectionManager to capture the screen and take screenshots, sends them to the server over WebSocket, and the server makes them viewable in a browser.
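
Condensed, the data contract is simple: every captured frame travels as a Base64-encoded JPEG string inside a Socket.IO "frame" event. The sketch below distills the pipeline that the full ScreenShotEncoder and ScreenCaptureService code later in this article implements; the helper names here are illustrative only.

    import android.graphics.Bitmap
    import android.util.Base64
    import io.socket.client.Socket
    import java.io.ByteArrayOutputStream

    // Illustrative helper: turn one captured Bitmap into the Base64 JPEG string
    // carried by the "frame" event (the same encoding the full encoder class uses).
    fun bitmapToFramePayload(bitmap: Bitmap, quality: Int = 70): String {
        val out = ByteArrayOutputStream()
        bitmap.compress(Bitmap.CompressFormat.JPEG, quality, out)
        return Base64.encodeToString(out.toByteArray(), Base64.NO_WRAP)
    }

    // Illustrative send: the browser page listens for the same "frame" event and sets
    // "data:image/jpeg;base64," + payload as the source of an <img> element.
    fun sendFrame(socket: Socket, bitmap: Bitmap) {
        socket.emit("frame", bitmapToFramePayload(bitmap))
    }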

The client code has three parts:

  1. ScreenShotShareActivity: the user-facing screen. It has two buttons, one to start screen sharing and one to stop it. Starting launches a foreground service (required by the system; the screen-capture logic must run in a Service).
  2. ScreenShotEncoder: a helper class that performs the actual screen capture, produces the screenshot, and reports the result through a callback; it is used from the Service.
  3. ScreenCaptureService: the foreground service. Once started, it listens for results from ScreenShotEncoder and forwards them to the remote server over WebSocket.

The server code has two parts:
The server is implemented with Node.js and requires socket.io; there are two files:

  1. server.js: the server itself
  2. index.html: the front-end page that displays the screenshots sent by the client

Client implementation

Manifest configuration

Declare ScreenShotShareActivity and ScreenCaptureService:

    <activity android:name=".demo.sharescreen.ScreenShotShareActivity" />

    <service
        android:name=".demo.sharescreen.ScreenCaptureService"
        android:exported="false"
        android:foregroundServiceType="mediaProjection" />

Permission declarations

    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />
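
One addition worth considering, not part of the original setup: on Android 13+ (API 33) the foreground-service notification is only visible if the app holds the POST_NOTIFICATIONS runtime permission (the service itself still runs without it). A minimal sketch of the runtime request, assuming the manifest also declares android.permission.POST_NOTIFICATIONS and the helper is called from the Activity before starting capture:

    import android.Manifest
    import android.content.pm.PackageManager
    import android.os.Build
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.app.ActivityCompat
    import androidx.core.content.ContextCompat

    // Hypothetical helper (not in the original article): request POST_NOTIFICATIONS on
    // Android 13+ so the "Screen sharing active" notification is actually shown.
    fun AppCompatActivity.ensureNotificationPermission(requestCode: Int = 100) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU &&
            ContextCompat.checkSelfPermission(this, Manifest.permission.POST_NOTIFICATIONS)
                != PackageManager.PERMISSION_GRANTED
        ) {
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.POST_NOTIFICATIONS), requestCode
            )
        }
    }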

Add the dependency

app/build.gradle

    implementation('io.socket:socket.io-client:1.0.0')

settings.gradle (project root)

    dependencyResolutionManagement {
        repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
        repositories {
            google()
            mavenCentral()
            maven { url 'https://maven.webrtc.org' } // add this
        }
    }

ScreenShotShareActivity

    class ScreenShotShareActivity : AppCompatActivity() {

        private var isBound = false
        private var binder: ScreenCaptureService.LocalBinder? = null
        private var imageView: ImageView? = null

        private val screenCaptureLauncher = registerForActivityResult(
            ActivityResultContracts.StartActivityForResult()
        ) { result ->
            if (result.resultCode == Activity.RESULT_OK && result.data != null) {
                startScreenEncoding(result.resultCode, result.data!!)
            } else {
                Toast.makeText(this, "Screen capture permission denied", Toast.LENGTH_SHORT).show()
            }
        }

        private val connection = object : ServiceConnection {
            override fun onServiceConnected(name: ComponentName?, service: IBinder?) {
                binder = service as? ScreenCaptureService.LocalBinder
                binder?.setImageCallback(object : ScreenShotCaptureCallback {
                    override fun onJpegImageReady(jpegData: String) {
                        // Local preview on the capture side, useful for testing
                        val decodedBytes = Base64.decode(jpegData, Base64.NO_WRAP)
                        val bitmap = BitmapFactory.decodeByteArray(decodedBytes, 0, decodedBytes.size)
                        runOnUiThread {
                            imageView?.setImageBitmap(bitmap)
                        }
                    }
                })
                isBound = true
            }

            override fun onServiceDisconnected(name: ComponentName?) {
                isBound = false
            }
        }

        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            imageView = ImageView(this).apply {
                scaleType = ImageView.ScaleType.FIT_CENTER
            }
            setContentView(LinearLayout(this).apply {
                orientation = LinearLayout.VERTICAL
                addView(
                    Button(this@ScreenShotShareActivity).apply {
                        text = "Start screen capture"
                        setOnClickListener {
                            requestScreenCapture()
                        }
                    }
                )
                addView(
                    Button(this@ScreenShotShareActivity).apply {
                        text = "Stop screen capture"
                        setOnClickListener {
                            stopScreenCapture()
                        }
                    }
                )
                addView(
                    imageView,
                    LinearLayout.LayoutParams(
                        LinearLayout.LayoutParams.MATCH_PARENT,
                        LinearLayout.LayoutParams.MATCH_PARENT
                    )
                )
            })
        }

        private fun stopScreenCapture() {
            if (isBound) {
                unbindService(connection)
                isBound = false
            }
            val intent = Intent(this@ScreenShotShareActivity, ScreenCaptureService::class.java)
            stopService(intent)
        }

        /**
         * Request the screen capture permission.
         */
        private fun requestScreenCapture() {
            val projectionManager =
                getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
            val intent = projectionManager.createScreenCaptureIntent()
            screenCaptureLauncher.launch(intent)
        }

        /**
         * Start screen sharing by launching the foreground service ScreenCaptureService.
         */
        private fun startScreenEncoding(resultCode: Int, data: Intent) {
            val serviceIntent = Intent(this, ScreenCaptureService::class.java).apply {
                putExtra("resultCode", resultCode)
                putExtra("data", data)
            }
            ContextCompat.startForegroundService(this, serviceIntent)
            bindService(serviceIntent, connection, Context.BIND_AUTO_CREATE)
        }
    }
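
One detail the Activity above omits: if it is destroyed while still bound (for example on a configuration change), the ServiceConnection leaks. A small, hypothetical addition to ScreenShotShareActivity that releases the binding in onDestroy:

    // Hypothetical addition to ScreenShotShareActivity (not in the original listing):
    // release the service binding when the Activity goes away.
    override fun onDestroy() {
        super.onDestroy()
        if (isBound) {
            unbindService(connection)
            isBound = false
        }
    }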

ScreenCaptureService

    /**
     * Foreground service; requires the FOREGROUND_SERVICE and
     * FOREGROUND_SERVICE_MEDIA_PROJECTION permissions declared in the manifest.
     */
    class ScreenCaptureService : Service() {

        private var screenShotEncoder: ScreenShotEncoder? = null
        private var screenShotCallback: ScreenShotCaptureCallback? = null

        private val socket by lazy {
            val opts = IO.Options().apply {
                transports = arrayOf("websocket") // avoids xhr-poll errors
                reconnection = true
                reconnectionAttempts = 5
                timeout = 5000
            }
            IO.socket("http://192.168.28.101:3000", opts)
        }

        private fun startForeground(notification: Notification) {
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
                startForeground(
                    1,
                    notification,
                    ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION
                )
            } else {
                startForeground(1, notification)
            }
        }

        override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
            startForeground(createNotification())
            val resultCode = intent?.getIntExtra("resultCode", Activity.RESULT_CANCELED)
                ?: return START_NOT_STICKY
            val data = intent.getParcelableExtra<Intent>("data") ?: return START_NOT_STICKY

            initSocketIO()

            screenShotEncoder = ScreenShotEncoder(this, resultCode, data)
            screenShotEncoder?.startCapturing { base64Jpeg ->
                // Hand the frame to the bound Activity, useful for local preview or logging
                screenShotCallback?.onJpegImageReady(base64Jpeg)
                // Push the frame to the remote server
                socket.emit("frame", base64Jpeg)
            }
            return START_STICKY
        }

        private fun initSocketIO() {
            try {
                // Connection state listeners
                socket.on(Socket.EVENT_CONNECT) {
                    Log.d("ScreenCaptureService", "Socket connected to server")
                }
                socket.on(Socket.EVENT_CONNECT_ERROR) { args ->
                    Log.e("ScreenCaptureService", "Socket connection failed: ${args.getOrNull(0)}")
                }
                socket.on(Socket.EVENT_DISCONNECT) {
                    Log.w("ScreenCaptureService", "Socket disconnected")
                }
                socket.connect()
            } catch (e: URISyntaxException) {
                Log.e("ScreenCaptureService", "Malformed socket URI", e)
            }
        }

        private fun createNotification(): Notification {
            val channelId = "screen_capture"
            val manager = getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
                val channel = NotificationChannel(
                    channelId, "Screen sharing", NotificationManager.IMPORTANCE_LOW
                )
                manager.createNotificationChannel(channel)
            }
            return NotificationCompat.Builder(this, channelId)
                .setContentTitle("Screen sharing active")
                .setContentText("Capturing screen content")
                .setSmallIcon(android.R.drawable.ic_menu_camera)
                .build()
        }

        override fun onBind(intent: Intent?): IBinder? {
            return LocalBinder()
        }

        override fun onDestroy() {
            super.onDestroy()
            screenShotEncoder?.stopCapturing()
            socket.disconnect()
            Log.d("ScreenCaptureService", "Service destroyed")
        }

        inner class LocalBinder : Binder() {
            fun stopCapture() {
                stopSelf()
            }

            fun setImageCallback(callback: ScreenShotCaptureCallback) {
                screenShotCallback = callback
            }
        }
    }

    interface ScreenShotCaptureCallback {
        fun onJpegImageReady(jpegData: String)
    }
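
A small refinement, not in the original code: the Socket.IO client will typically buffer packets emitted while the connection is down and flush them on reconnect, which for a screenshot stream mostly means delivering stale frames late. Guarding the emit with connected() drops those frames instead; a sketch of the adjusted callback in onStartCommand:

    // Hypothetical tweak to the frame callback in ScreenCaptureService.onStartCommand():
    // only emit while the socket is connected, so stale frames are dropped rather than
    // queued up during a disconnect.
    screenShotEncoder?.startCapturing { base64Jpeg ->
        screenShotCallback?.onJpegImageReady(base64Jpeg)
        if (socket.connected()) {
            socket.emit("frame", base64Jpeg)
        }
    }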

ScreenShotEncoder

    class ScreenShotEncoder(
        private val context: Context,
        private val resultCode: Int,
        private val data: Intent,
        private val width: Int = 720,
        private val height: Int = 1280,
        private val dpi: Int = 320
    ) {
        private lateinit var mediaProjection: MediaProjection
        private var imageReader: ImageReader? = null
        private var handlerThread: HandlerThread? = null
        private var handler: Handler? = null
        private var isCapturing = false

        fun startCapturing(onFrameReady: (String) -> Unit) {
            setupMediaProjection()
            setupImageReader(onFrameReady)
            isCapturing = true
        }

        fun stopCapturing() {
            isCapturing = false
            imageReader?.close()
            mediaProjection.stop()
            handlerThread?.quitSafely()
        }

        private fun setupMediaProjection() {
            val projectionManager =
                context.getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
            mediaProjection = projectionManager.getMediaProjection(resultCode, data)
        }

        private fun setupImageReader(onFrameReady: (String) -> Unit) {
            imageReader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2)
            handlerThread = HandlerThread("ScreenShotThread").also { it.start() }
            handler = Handler(handlerThread!!.looper)

            mediaProjection.createVirtualDisplay(
                "ScreenShotCapture",
                width, height, dpi, 0,
                imageReader!!.surface, null, handler
            )

            handler?.post(object : Runnable {
                override fun run() {
                    if (!isCapturing) return
                    val image = imageReader?.acquireLatestImage()
                    if (image != null) {
                        // RGBA_8888 rows may be padded; account for the stride when sizing the Bitmap
                        val planes = image.planes
                        val buffer = planes[0].buffer
                        val pixelStride = planes[0].pixelStride
                        val rowStride = planes[0].rowStride
                        val rowPadding = rowStride - pixelStride * width
                        val bitmap = Bitmap.createBitmap(
                            width + rowPadding / pixelStride, height, Bitmap.Config.ARGB_8888
                        )
                        bitmap.copyPixelsFromBuffer(buffer)
                        image.close()

                        // Compress to JPEG and Base64-encode for the "frame" event
                        val outputStream = ByteArrayOutputStream()
                        bitmap.compress(Bitmap.CompressFormat.JPEG, 70, outputStream)
                        val byteArray = outputStream.toByteArray()
                        val base64Image = Base64.encodeToString(byteArray, Base64.NO_WRAP)
                        onFrameReady(base64Image)
                        bitmap.recycle()
                    }
                    handler?.postDelayed(this, 100) // capture one frame every 100 ms (about 10 fps)
                }
            })
        }
    }
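
One compatibility note on this class: starting with Android 14 (API 34), the system requires a MediaProjection.Callback to be registered before createVirtualDisplay() is called, otherwise the call throws. The original code predates that requirement; a minimal sketch of the extra registration, assuming it is added at the end of setupMediaProjection():

    // Hypothetical addition at the end of ScreenShotEncoder.setupMediaProjection():
    // Android 14+ rejects createVirtualDisplay() unless a MediaProjection.Callback
    // has been registered first.
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE) {
        mediaProjection.registerCallback(object : MediaProjection.Callback() {
            override fun onStop() {
                // Projection revoked by the user or the system; stop capturing cleanly.
                stopCapturing()
            }
        }, Handler(Looper.getMainLooper()))
    }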

After launching the app, the main screen shows the two buttons and an empty preview area.

Tapping "Start screen capture" brings up the system screen-capture permission dialog.
Once you allow it, capture starts and the local preview begins updating.

Server implementation

Step 1: Prepare the environment

Install Node.js.
If you have not installed it yet, download and install the LTS version from https://nodejs.org/.
After installation, open a terminal and run:

    node -v
    npm -v

Both should print a version number, confirming the installation succeeded.

Step 2: Create the project folder

    mkdir webrtc-server
    cd webrtc-server
    npm init -y
    npm install express socket.io

Make sure the installed version is socket.io@2.4.1; if it is not, uninstall it and reinstall the right version:

    npm uninstall socket.io
    npm install socket.io@2.4.1

socket.io@2.4.1 is the server version that is most reliably compatible with the Android socket.io-client used here (the 1.x Java client speaks the Socket.IO 2.x protocol).
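
If the Android project uses the Gradle Kotlin DSL instead of Groovy, the client dependency shown earlier would look like the sketch below; the comment just restates the compatibility pairing (1.x Java client against a 2.x Socket.IO server).

    // app/build.gradle.kts (Kotlin DSL) equivalent of the Groovy dependency line above
    dependencies {
        implementation("io.socket:socket.io-client:1.0.0") // 1.x client <-> Socket.IO 2.x server
    }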

Step 3: Create the server (server.js)

Create a server.js file in the webrtc-server folder with the following code:

    // server.js
    const express = require('express');
    const app = express();
    const http = require('http').createServer(app);
    const io = require('socket.io')(http); // v2.4.1 needs no CORS configuration

    app.use(express.static(__dirname + '/public'));

    io.on('connection', socket => {
      console.log('Client connected:', socket.id);

      // Relay screenshot frames from the capturing client to every other client
      socket.on('frame', (data) => {
        console.log(`Received frame, length=${data.length}`);
        socket.broadcast.emit('frame', data);
      });

      // WebRTC signaling relays (not used by this screenshot demo)
      socket.on('offer', (data) => {
        socket.broadcast.emit('offer', data);
      });
      socket.on('answer', (data) => {
        socket.broadcast.emit('answer', data);
      });
      socket.on('ice-candidate', (data) => {
        socket.broadcast.emit('ice-candidate', data);
      });

      socket.on('disconnect', () => {
        console.log('Client disconnected:', socket.id);
      });
    });

    const PORT = 3000;
    http.listen(PORT, '0.0.0.0', () => {
      console.log(`Server running at http://0.0.0.0:${PORT}`);
    });

Step 4: Create the receiving front-end page

Create a public/index.html file in the webrtc-server directory:

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="UTF-8">
      <title>Screen share view</title>
      <style>
        #log {
          background: #f4f4f4;
          border: 1px solid #ccc;
          padding: 10px;
          margin-top: 10px;
          height: 100px;
          overflow-y: auto;
          font-family: monospace;
          font-size: 14px;
        }
      </style>
    </head>
    <body>
      <h1>Received screen share image</h1>
      <img id="screenImg" alt="screen share frame" />
      <div id="log"></div>

      <script src="/socket.io/socket.io.js"></script>
      <script>
        const socket = io();
        const img = document.getElementById("screenImg");
        const logEl = document.getElementById("log");

        function log(message) {
          const time = new Date().toLocaleTimeString();
          const entry = `[${time}] ${message}`;
          console.log(entry);
          logEl.innerText += entry + "\n";
          logEl.scrollTop = logEl.scrollHeight;
        }

        socket.on("connect", () => {
          log("Connected to server");
        });

        socket.on("disconnect", () => {
          log("Disconnected from server");
        });

        socket.on("frame", (base64) => {
          log("Received a frame");
          img.src = "data:image/jpeg;base64," + base64;
        });

        socket.on("connect_error", (err) => {
          log("Connection error: " + err.message);
        });
      </script>
    </body>
    </html>

Step 5: Run the server

    node server.js

You should see the startup log (Server running at http://0.0.0.0:3000). Once the Android client starts sharing, a "Client connected" line appears as well; that is our capture client joining.

Verifying the result

Open a browser and go to http://192.168.28.101:3000/index.html.
Note: this is a LAN address, the address of my development machine, which is on the same local network as the phone. Use your own machine's LAN IP here and in the socket URL inside ScreenCaptureService. Since the demo uses plain http://, the app may also need android:usesCleartextTraffic="true" on its <application> element (Android 9+ blocks cleartext traffic by default).

If everything works, the page shows the content shared from the client.