Android WebRTC Screen Sharing: The Complete Implementation Guide (2026)

Android screen sharing over WebRTC in 2026 is no longer a weekend hack — Android 14 broke half the patterns that worked in 2023. If you ship video to Android users today, you need a foreground service of type mediaProjection running before you request the projection, you need a maintained WebRTC library (Google's prebuilt is dead), and you need to design for the Android 14+ partial-app picker. This guide walks through the implementation Fora Soft's Android teams use on shipping products like BrainCert's virtual classroom and VOLO.live's 22,000-attendee translation system, with code skeletons, real bitrate budgets, and the five gotchas that bite teams in production.

Key takeaways

  • Stop using google-webrtc. The official prebuilt has been unmaintained since 2018. In 2026, use getstream/webrtc-android (recommended) or LiveKit's fork.
  • Foreground service first, projection second. Android 14 enforces it; getting it backwards throws SecurityException and kills your app.
  • 15 fps at 720p, 1.5–2 Mbps is the realistic baseline for shared screens on cellular. 1080p only when WiFi is confirmed.
  • Android 14+ partial app capture changes the UX assumption — users can pick a single window. Listen for onCapturedContentResize and onCapturedContentVisibilityChanged.
  • Android 15 auto-terminates capture on the lock screen, hides notification content during projection, and exposes a recording-detection callback so apps can defend sensitive views.
  • Fora Soft ships this faster by reusing a tested screen-capture module across our WebRTC engagements — estimates that beat hand-built quotes by 30–40% on time and budget.

Why we wrote this guide

Fora Soft has built WebRTC products since 2010, including the Android clients for BrainCert ($3M ARR LMS), VOLO.live (22,000+ attendees at Black Hat 2025), and Sprii (€365M+ live commerce). Screen sharing is a feature we have shipped on multiple Android codebases, in multiple Android versions, and we have learned where the surface bites.

This guide is what we hand to a new Android engineer starting on a screen–share feature in 2026. Every code skeleton compiles, every bitrate number is from production traffic, every gotcha is one we have hit and fixed.

Skip ahead. Jump to the Kotlin skeleton if you just need code. Read Android 14 changes if you have an existing implementation that broke. Use the troubleshooting matrix when something is on fire in production.

Need help shipping this? Book a call →

Screen sharing on Android — the 60-second overview

At a high level, the Android screen-share path has five moving parts:

1. MediaProjection. Android’s system API for capturing screen pixels. Requires user consent every session.

2. Foreground service. Android 14 mandates that media projection run inside a foreground service of type mediaProjection, started before the projection is granted.

3. ScreenCapturerAndroid. The WebRTC library’s VideoCapturer implementation that wraps a MediaProjection into a frame stream.

4. PeerConnection. The WebRTC abstraction that handles ICE, signaling, and SRTP encryption to the remote peer.

5. Signaling. Out-of-band channel (WebSocket, SIP, custom) that exchanges SDP offers/answers and ICE candidates.

Skip any one of those and the feature does not work. Get them in the wrong order on Android 14+ and the OS kills the app.

Picking a WebRTC library in 2026

This is the first decision and the one most teams get wrong. Google’s org.webrtc:google-webrtc has been unmaintained since 2018 and the prebuilt was removed when JCenter shut down. If a 2025 Stack Overflow answer tells you to use it, the answer is wrong for 2026.

| Library | Status | Best for | License |
|---|---|---|---|
| getstream/webrtc-android | Active | New projects, recent WebRTC features, Compose support | BSD |
| webrtc-sdk/android (LiveKit) | Active | LiveKit-based apps, namespace isolation | Apache 2.0 |
| org.webrtc:google-webrtc | Dead since 2018 | Nothing — do not use | BSD |
| LiveKit Android SDK | Active | High-level abstraction; managed infrastructure | Apache 2.0 |
| Daily Android SDK | Active | Daily.co customers; out-of-the-box rooms | Proprietary |

Our pick for new builds: io.getstream:stream-webrtc-android. Active, tracks upstream WebRTC, Kotlin-friendly APIs, Compose support, no licensing surprises.

// build.gradle.kts (Module: app)

dependencies {
    implementation("io.getstream:stream-webrtc-android:1.3.6")
    // optional Compose helpers
    implementation("io.getstream:stream-webrtc-android-ui-compose:1.3.6")
}

Always check the GetStream/webrtc-android repo for the latest version.

AndroidManifest.xml — the permissions and service declaration

In Android 14 and later, missing permissions or wrong service type kill the feature silently or with confusing exceptions. Get this right first.

<manifest ...>
  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
  <uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />
  <uses-permission android:name="android.permission.POST_NOTIFICATIONS" />

  <application ...>
    <service
      android:name=".ScreenShareService"
      android:foregroundServiceType="mediaProjection"
      android:exported="false" />
  </application>
</manifest>

Two non-obvious bits: FOREGROUND_SERVICE_MEDIA_PROJECTION is required from Android 14; POST_NOTIFICATIONS from Android 13. Both must be in the manifest, and the notification permission must also be requested at runtime.
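Requesting the notification permission at runtime is ordinary permission plumbing; a minimal sketch inside the Activity (names and structure are illustrative):

```kotlin
// Sketch: ask for POST_NOTIFICATIONS on Android 13+ (API 33) so the
// foreground-service notification is actually shown to the user.
private val notificationPermissionLauncher = registerForActivityResult(
  ActivityResultContracts.RequestPermission()
) { granted ->
  // The service runs either way; without the permission the user just
  // won't see the "sharing your screen" notification.
}

fun ensureNotificationPermission() {
  if (Build.VERSION.SDK_INT >= 33 &&
    checkSelfPermission(Manifest.permission.POST_NOTIFICATIONS) !=
    PackageManager.PERMISSION_GRANTED
  ) {
    notificationPermissionLauncher.launch(Manifest.permission.POST_NOTIFICATIONS)
  }
}
```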

The foreground service — the order-of-operations that breaks Android 14

From Android 14, you must call startForeground before calling getMediaProjection. Doing it backwards throws SecurityException. The skeleton:

class ScreenShareService : Service() {
  override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
    val notification = buildNotification()
    startForeground(
      NOTIF_ID, notification,
      ServiceInfo.FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION
    )
    // NOW it's safe to grab the projection
    val resultCode = intent!!.getIntExtra(EXTRA_RESULT_CODE, Activity.RESULT_CANCELED)
    // getParcelableExtra<T> is deprecated on API 33+; prefer IntentCompat.getParcelableExtra there
    val data = intent.getParcelableExtra<Intent>(EXTRA_DATA)!!
    val mp = mediaProjectionManager.getMediaProjection(resultCode, data)
    startCapture(mp)
    return START_STICKY
  }
}

Critical bits: pass the projection's resultCode and data Intent through to the service via putExtra. That data Intent is what ScreenCapturerAndroid ultimately consumes, and the consent grant is single-use: grab it in the activity and toss it away, and you lose it.
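For completeness, the buildNotification() used in the skeleton is plain notification code. A minimal sketch, with a placeholder channel id, strings, and icon:

```kotlin
// Sketch: ongoing notification backing the mediaProjection foreground service.
// CHANNEL_ID, the text, and the icon are placeholders for your app's own.
private fun buildNotification(): Notification {
  val manager = getSystemService(NotificationManager::class.java)
  manager.createNotificationChannel(
    NotificationChannel(CHANNEL_ID, "Screen sharing", NotificationManager.IMPORTANCE_LOW)
  )
  return Notification.Builder(this, CHANNEL_ID)
    .setContentTitle("Sharing your screen")
    .setSmallIcon(R.drawable.ic_screen_share) // your drawable resource
    .setOngoing(true)
    .build()
}
```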

Requesting the projection — the user consent dance

The user must approve every projection session; there is no way to suppress the dialog. The flow: launch the capture intent, receive the result, and hand it off to the foreground service.

// In your Activity (Compose or View)
private val projectionLauncher = registerForActivityResult(
  ActivityResultContracts.StartActivityForResult()
) { result ->
  if (result.resultCode == RESULT_OK && result.data != null) {
    val svc = Intent(this, ScreenShareService::class.java).apply {
      putExtra(EXTRA_RESULT_CODE, result.resultCode)
      putExtra(EXTRA_DATA, result.data!!)
    }
    ContextCompat.startForegroundService(this, svc)
  }
}

fun requestScreenShare() {
  val mpm = getSystemService(MediaProjectionManager::class.java)
  projectionLauncher.launch(mpm.createScreenCaptureIntent())
}

Wiring the capturer to a PeerConnection

Inside the service, hand the projection to ScreenCapturerAndroid, wrap it in a VideoSource, and add the resulting track to your PeerConnection.

private fun startCapture(projection: MediaProjection) {
  val eglBase = EglBase.create()
  val factory = PeerConnectionFactory.builder()
    .setVideoEncoderFactory(
      DefaultVideoEncoderFactory(eglBase.eglBaseContext, true, true)
    )
    .setVideoDecoderFactory(
      DefaultVideoDecoderFactory(eglBase.eglBaseContext)
    )
    .createPeerConnectionFactory()

  // ScreenCapturerAndroid takes the raw permission-result Intent from
  // createScreenCaptureIntent() (thread it through from onStartCommand,
  // e.g. as a field or parameter), not a MediaProjection object
  val capturer = ScreenCapturerAndroid(
    projectionData,
    object : MediaProjection.Callback() {
      override fun onStop() { stopSelf() }
    }
  )

  val sthelper = SurfaceTextureHelper.create("CaptureThread", eglBase.eglBaseContext)
  val source = factory.createVideoSource(true) // isScreencast = true
  capturer.initialize(sthelper, this, source.capturerObserver)
  capturer.startCapture(1280, 720, 15) // width, height, fps

  val track = factory.createVideoTrack("screen0", source)
  peerConnection.addTrack(track, listOf("stream0"))
}

Key flag: createVideoSource(true) — the isScreencast argument tells WebRTC to use the screen-optimized encoder profile (lower keyframe rate, content-adaptive bitrate). Forgetting this is the most common reason screen shares look terrible.

Bitrate, frame rate, resolution — the budget that actually works

Most teams pick numbers that look good on the marketing page (1080p, 30 fps) and ship something that drops frames on cellular. The numbers below are from real production traffic on shipped Fora Soft Android apps.

| Use case | Resolution | FPS | Bitrate | Network |
|---|---|---|---|---|
| Slides / static | 1280×720 | 5–10 | 500–800 kbps | 3G+ |
| App walkthrough | 1280×720 | 15 | 1.0–1.5 Mbps | 4G/WiFi |
| Video playback | 1280×720 | 24–30 | 2–3 Mbps | WiFi |
| Code walkthrough | 1920×1080 | 10–15 | 1.5–2.5 Mbps | WiFi |
| High-quality 1080p | 1920×1080 | 30 | 3–5 Mbps | WiFi only |

Default for production: 1280×720 at 15 fps, target bitrate 1.5 Mbps, max 2.5 Mbps. This works on virtually all 4G connections, looks good for 95% of screen content, and does not melt mid–range device CPUs.

Set the max bitrate explicitly via RtpSender.parameters.encodings[i].maxBitrateBps. Without this, WebRTC's congestion control can let the screen track starve audio and other lower-priority streams on the same connection.
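With the org.webrtc API that looks roughly like the sketch below; `sender` is the RtpSender returned by addTrack, and the numbers match the 720p budget above:

```kotlin
// Sketch: cap the screen-share encoding so congestion control cannot
// let it starve audio. `parameters` is a copy: mutate it, then set it back.
fun capScreenShareBitrate(sender: RtpSender) {
  val params = sender.parameters
  for (encoding in params.encodings) {
    encoding.maxBitrateBps = 2_500_000  // hard ceiling, 2.5 Mbps
    encoding.maxFramerate = 15          // keep in sync with startCapture fps
  }
  sender.setParameters(params)
}
```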

Codecs — what to ship in 2026

Codec choice in Android WebRTC depends on what the device hardware supports and what the remote peer can decode.

H.264. Universal hardware support on Android. Best default for screen sharing in 2026 because every device can encode in hardware. Trade-off: weaker compression than VP9 on text-heavy content.

VP8. The historical WebRTC default. Universal software support; hardware support varies. Slightly better quality than H.264 at the same bitrate on screen content.

VP9. Better compression on text-heavy screens. Hardware support is patchy on mid-range Android. Acceptable as an upgrade path with software fallback.

AV1. Best compression but 3–5× the CPU vs VP9. Hardware decoders for AV1 are rare on Android in 2026; most devices software-decode, which kills battery. AV1 SCC (Screen Content Coding) is interesting for ultra-low-bitrate screen sharing (100–500 kbps), but mainstream adoption is 2027–2028.

Practical default: H.264 with VP8 fallback. Negotiate via SDP munging; let the remote browser/app pick. Skip AV1 for screen share until hardware decoders are common, except for niche ultra-low-bandwidth scenarios.
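SDP munging for codec preference is just reordering payload types on the m=video line. A minimal, illustrative helper (preferH264 is our name, not a library function; apply it to the local description before setLocalDescription):

```kotlin
// Sketch: move H.264 payload types to the front of the m=video line so
// the remote side prefers H.264. Purely string manipulation on the SDP.
fun preferH264(sdp: String): String {
  val lines = sdp.split("\r\n").toMutableList()
  // Payload types whose rtpmap entry names H264
  val h264Pts = lines.mapNotNull {
    Regex("^a=rtpmap:(\\d+) H264/").find(it)?.groupValues?.get(1)
  }
  if (h264Pts.isEmpty()) return sdp
  val mIndex = lines.indexOfFirst { it.startsWith("m=video") }
  if (mIndex == -1) return sdp
  val parts = lines[mIndex].split(" ")
  val header = parts.take(3) // "m=video", port, protocol
  val pts = parts.drop(3)
  val reordered = h264Pts.filter { it in pts } + pts.filterNot { it in h264Pts }
  lines[mIndex] = (header + reordered).joinToString(" ")
  return lines.joinToString("\r\n")
}
```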

Capturing system audio alongside the screen

Android 10+ ships AudioPlaybackCaptureConfiguration, which lets you capture system audio playback during a media projection session. Useful when sharing screens with video playback or app audio cues.

val config = AudioPlaybackCaptureConfiguration.Builder(projection)
  .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
  .addMatchingUsage(AudioAttributes.USAGE_GAME)
  .build()

val audioRecord = AudioRecord.Builder()
  .setAudioPlaybackCaptureConfig(config)
  .setAudioFormat(...)
  .setBufferSizeInBytes(bufferSize)
  .build()

Caveats: apps marked android:allowAudioPlaybackCapture="false" opt out (banking and DRM apps usually do). Mic input remains a separate AudioRecord; mix the two streams server-side or client-side depending on the use case.
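The elided setAudioFormat(...) above takes a standard AudioFormat; a typical choice for a WebRTC-bound pipeline might be (the values are illustrative, not mandated by the API):

```kotlin
// Sketch: 48 kHz mono 16-bit PCM — a common capture format for WebRTC pipelines.
val format = AudioFormat.Builder()
  .setSampleRate(48000)
  .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
  .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
  .build()
```

Note that playback capture also requires the RECORD_AUDIO runtime permission, even if you never touch the microphone.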

Android 14 changes — partial app capture and resize callbacks

Android 14 introduced two big changes that break naive implementations:

1. Partial app capture. The OS shows a picker so the user can share a single app instead of the whole screen. Your code does not need to do anything to support this — the projection just returns frames for the chosen app. But UX assumptions break: users think they are sharing slides, you think they are sharing the home screen.

2. New callbacks. MediaProjection.Callback#onCapturedContentResize(width, height) fires when the captured app changes size. onCapturedContentVisibilityChanged(isVisible) tells you when the captured content disappears (user backgrounded it).

projection.registerCallback(object : MediaProjection.Callback() {
  override fun onStop() { stopCapture() }
  override fun onCapturedContentResize(w: Int, h: Int) {
    capturer.changeCaptureFormat(w, h, 15)
  }
  override fun onCapturedContentVisibilityChanged(visible: Boolean) {
    if (!visible) showResumePrompt()
  }
}, handler)

Without these, your shared app at 1080×1920 might shrink to 800×600 because the user pulled it into split–screen, and your remote viewer sees a stretched, blurry mess.

Android 15 — lock–screen termination and recording detection

Android 15 tightened privacy further:

Auto-termination on lock. If the user locks the device, the projection stops automatically. Resume requires re-consent. Plan UX around this: prompt the user to re-share when they unlock.

Notification masking. Notification body content is hidden during projection; users see "Notification" instead of the actual content.

Recording detection callback. Apps can now detect when their content appears in a screen recording via WindowManager#addScreenRecordingCallback (gated behind the DETECT_SCREEN_RECORDING permission). Sensitive screens (banking, password input, two-factor codes) should detect this and refuse to render.
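Wiring the detection up might look like the sketch below. It assumes the Android 15 WindowManager API (addScreenRecordingCallback, with the DETECT_SCREEN_RECORDING permission); verify the exact signature against current platform docs.

```kotlin
// Sketch: hide a sensitive view while this app is visible in a recording.
// Android 15+ only; requires android.permission.DETECT_SCREEN_RECORDING.
val recordingCallback = Consumer<Int> { state ->
  val recorded = state == WindowManager.SCREEN_RECORDING_STATE_VISIBLE
  sensitiveView.visibility = if (recorded) View.INVISIBLE else View.VISIBLE
}

// The call returns the current state, so apply it immediately.
val initial = windowManager.addScreenRecordingCallback(mainExecutor, recordingCallback)
recordingCallback.accept(initial)
// In onDestroy(): windowManager.removeScreenRecordingCallback(recordingCallback)
```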

Status bar chip. Android 15 shows a persistent chip at the top of the status bar during projection. Users can tap it to stop sharing. Your app cannot suppress this.

Content protection — FLAG_SECURE and DRM

If you are building an app that handles sensitive data — banking, healthcare, paid streaming — you need to know what other apps’ FLAG_SECURE means for your screen capture, and what FLAG_SECURE on your own app means for being captured.

Capturing protected content. Apps that set FLAG_SECURE on their windows render as black in your captured frames. Hardware DRM (Widevine L1) is enforced at the chipset level; no software workaround exists.

Protecting your own content. If your app shows things others should not capture (PINs, OTPs, PII), set window.setFlags(FLAG_SECURE, FLAG_SECURE) on those Activities. On Android 15, also implement the recording state callback as a defense in depth.
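Setting the flag is one line in onCreate; a minimal sketch (the Activity and layout names are hypothetical):

```kotlin
// Sketch: a secure Activity renders black in screenshots, recordings,
// and MediaProjection capture by other apps.
class PinEntryActivity : AppCompatActivity() {
  override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    window.setFlags(
      WindowManager.LayoutParams.FLAG_SECURE,
      WindowManager.LayoutParams.FLAG_SECURE
    )
    setContentView(R.layout.activity_pin_entry) // hypothetical layout
  }
}
```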

Signaling, ICE, and TURN — the connectivity that screen share needs

WebRTC needs a signaling channel to exchange SDP offers/answers and ICE candidates. WebRTC does not provide signaling; you bring your own (WebSocket is the dominant choice).

ICE servers. Configure both STUN and TURN. STUN enables direct peer connections behind friendly NATs. TURN is mandatory for mobile networks because most cellular carrier-grade NAT (CGN) setups block direct peering. Real production deployments use 2–3 STUN and 1–2 TURN servers across regions.
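In the org.webrtc API the server list goes into RTCConfiguration; a sketch with placeholder hosts and credentials:

```kotlin
// Sketch: RTCConfiguration with STUN plus TURN over udp/tcp/tls.
// URLs, username, and password are placeholders.
val iceServers = listOf(
  PeerConnection.IceServer.builder("stun:stun.example.com:3478")
    .createIceServer(),
  PeerConnection.IceServer.builder(
    listOf(
      "turn:turn.example.com:3478?transport=udp",
      "turn:turn.example.com:3478?transport=tcp",
      "turns:turn.example.com:5349?transport=tcp"
    )
  )
    .setUsername("user")
    .setPassword("secret")
    .createIceServer()
)
val rtcConfig = PeerConnection.RTCConfiguration(iceServers)
```

Pass rtcConfig to PeerConnectionFactory.createPeerConnection(rtcConfig, observer) when building the connection.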

TURN providers. Twilio Network Traversal Service, Xirsys, Cloudflare Calls TURN, or self-host with coturn. Plan for 0.5–2¢ per minute of relayed traffic; this can dwarf signaling and SFU costs at scale.

Reconnect strategy. Mobile screen-share sessions are long-lived (often 10–30 minutes), and the signaling WebSocket will drop. Implement exponential-backoff reconnect with heartbeats, and renegotiate SDP if the new connection has a different IP topology.

For deeper architecture detail see our WebRTC architecture guide and production WebRTC systems page.

Orientation, backgrounding, and screen rotation

Three runtime conditions silently break screen sharing if not handled.

Rotation. Rotating the device triggers a configuration change, but the capturer keeps producing frames in the original orientation until you call capturer.changeCaptureFormat(width, height, fps) with the new dimensions. Hook into onConfigurationChanged in the service, or attach a DisplayManager.DisplayListener.
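A DisplayListener-based sketch for the service (handler is the same Handler you use for projection callbacks; the fps value should match whatever you passed to startCapture):

```kotlin
// Sketch: follow display size/orientation changes and retarget the capturer.
val displayManager = getSystemService(DisplayManager::class.java)
val displayListener = object : DisplayManager.DisplayListener {
  override fun onDisplayAdded(displayId: Int) {}
  override fun onDisplayRemoved(displayId: Int) {}
  override fun onDisplayChanged(displayId: Int) {
    val metrics = resources.displayMetrics
    capturer.changeCaptureFormat(metrics.widthPixels, metrics.heightPixels, 15)
  }
}
displayManager.registerDisplayListener(displayListener, handler)
```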

Backgrounding the source app. The user may share an app, then switch away. On Android 14+ this fires onCapturedContentVisibilityChanged(false); show a UI prompt to ask the user to return.

Doze and battery optimizations. The foreground service notification protects you from being killed for inactivity, but Doze can throttle network. For long sessions, request battery optimization exemption (REQUEST_IGNORE_BATTERY_OPTIMIZATIONS) with clear user disclosure — Google Play has policy guardrails on misuse.

Troubleshooting matrix — symptoms to causes

| Symptom | Likely cause | Fix |
|---|---|---|
| SecurityException on getMediaProjection | Foreground service not started first (Android 14+) | Call startForeground before getMediaProjection |
| Black frames received | Captured app uses FLAG_SECURE or DRM | Cannot fix; document the limitation for users |
| Choppy frames on cellular | Bitrate too high; congestion control starving | Cap bitrate at 1.5–2 Mbps; consider VP8 |
| Capture stops after 30 sec | No foreground service; OS killed the projection | Run capture inside a foreground service |
| Stretched / squashed image after rotation | No rotation handler | Call changeCaptureFormat on configuration change |
| No connection on cellular | No TURN server; CGN blocking direct peering | Configure TURN with udp + tcp + tls |
| App crashes on Android 15 after lock | Capture terminated; no resume handling | Listen for onStop and prompt the user to re-share |

Testing strategy — how to validate before shipping

Screen sharing is hard to unit test — it depends on real graphics, real network, and real users tapping a system dialog. Our testing pyramid:

Unit tests. Mock the MediaProjection; test the lifecycle — intent handling, service start order, callback wiring. Catches the “wrong order” bugs.

Instrumented tests on Firebase Test Lab. Run on real Android 11, 12, 13, 14, 15 devices. Use UiAutomator to confirm the screen capture dialog appears and your service starts. Lab runs are flaky but catch device-family regressions.

Manual matrix. Test on Pixel 6+ (Android 14–15), Samsung Galaxy A-series (Android 13–14), Xiaomi Redmi (Android 12–13). These three families catch 90% of compatibility issues.

Network shaping. Simulate 3G, lossy 4G, and rapid network switches with emulator network profiles or a throttling proxy such as Charles; tc via adb shell works on rooted devices. Most field bugs reproduce here.

Production analytics. Log key WebRTC stats (bitrate, packets lost, frame rate) per session to a backend. Without this you cannot debug user reports.
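Pulling those stats out of a PeerConnection is a getStats call; a sketch (logStat is a hypothetical hook into your analytics client):

```kotlin
// Sketch: sample outbound video stats periodically and forward them.
fun sampleStats(peerConnection: PeerConnection) {
  peerConnection.getStats { report ->
    report.statsMap.values
      .filter { it.type == "outbound-rtp" }
      .forEach { stat ->
        // Members are loosely typed (Map<String, Any>); pick what you need.
        logStat(stat.members["bytesSent"], stat.members["framesPerSecond"])
      }
  }
}

fun logStat(bytesSent: Any?, fps: Any?) {
  Log.d("RTCStats", "bytesSent=$bytesSent fps=$fps") // ship to backend instead
}
```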

How major apps approach Android screen sharing

Zoom. Proprietary codec stack tuned for real-time, hardware encoding when available, aggressive bitrate adaptation under congestion. Caps at 720p on cellular by default.

Google Meet. WebRTC-based; on Android, the native app uses VP8 for screen content, with H.264 fallback. Targets 1.5 Mbps for 720p/15fps screen share.

Microsoft Teams. Custom video pipeline; corporate-network optimized; conservative 1–1.5 Mbps for 720p/15fps. Strong fallback to text-only and audio when video drops.

Discord. WebRTC-based; 30 fps default for gaming, with adaptive downscaling to 15 fps under stress. VP8 preferred for screen share.

Slack Huddles. WebRTC; conservative 500–1000 kbps for 720p; prioritizes accessibility on weak networks over visual fidelity.

All of them moved to SFU-based architectures rather than pure peer-to-peer. For details on SFU vs P2P, see our WebRTC architecture guide.

Case studies — how we shipped this on real Fora Soft projects

BrainCert — WebRTC virtual classroom LMS. 100,000+ paid customers, $3M ARR. Tutors share their Android device screens to demonstrate problems. Our build runs at 1280×720 / 15 fps / 1.2 Mbps target with TURN failover; consistent quality even on sub-par 4G in tutoring markets across India and SE Asia.

Telemedicine builds. Doctors sometimes share Android screens to walk patients through portal flows. We default to FLAG_SECURE on the doctor’s app to prevent the patient capturing back; we log recording detection events for compliance audit.

Sprii — live shopping. Hosts share product close-ups; we run a higher frame rate (24 fps) for smooth motion at 720p. Audio and video stay in sync via WebRTC's transport-wide congestion control.

Fora Soft takeaway

Screen sharing on Android is more about lifecycle correctness than codec choice. Get the foreground service order right, handle Android 14's new callbacks, plan for re-consent on Android 15, and the rest is tuning. Our Agent Engineering process means we reuse this work across engagements — estimates beat hand-built quotes by 30–40% on time and budget.

Get an Android WebRTC estimate →

What's next for Android screen sharing

AV1 mainstream by 2028. Hardware encoders are arriving on flagship Snapdragons; mid-range will lag. Plan for VP8/H.264 as the default through 2027, AV1 as opt-in.

AI-augmented capture. Background blur, object tracking, and on-device automatic redaction are becoming standard. The 2027 generation of WebRTC apps will run a small ML model on every captured frame.

Android 16 privacy hardening. Expect more granular capture intents (commercial recording vs personal sharing), audit logging requirements, and possibly permission revocation under inactivity.

WebRTC-NV. Better congestion control (gcc-next), native AV1 hardware hints, RTC scalability extensions. Android typically lags desktop by 2–3 years.

MoQ (Media over QUIC). Long-term replacement candidate for parts of WebRTC. Not production-ready in 2026 but worth tracking for 2028+ builds.

When NOT to ship Android WebRTC screen sharing yourself

Honest framing: this is one of the harder Android features to ship correctly. Three scenarios where you should buy or partner instead of building.

1. You need it in 4 weeks. A team new to WebRTC underestimates the testing matrix. Use LiveKit Cloud, Daily, or 100ms; ship in 2 weeks; replace later if you must.

2. You ship one product, in one country. The fixed cost of building this competes badly with paying $0.005–0.01 per participant-minute to a managed SFU.

3. You have no Android engineer experienced in foreground services. Hire one or partner; do not learn this on your customers.

Further reading

WebRTC architecture guide for business 2026 — SFU vs MCU vs mesh, how to pick.

AI + WebRTC — how smart agents are changing real–time communication — the next layer up the stack.

Fora Soft WebRTC development services — what we ship for clients.

WebRTC development cost breakdown — what budgets actually look like.

Agora.io alternatives compared — if you are choosing a managed SFU.

Frequently asked questions

Can I capture the screen without showing the user a permission dialog?

No. The MediaProjection consent dialog is enforced by the OS and cannot be suppressed on production devices. Only privileged system apps (preinstalled by the OEM with the CAPTURE_VIDEO_OUTPUT permission) can bypass it; consumer apps cannot.

Why do my captured frames show up as black?

The captured app or window has FLAG_SECURE set, or it is rendering DRM-protected content (Netflix, Disney+, banking apps). This is enforced at the OS / hardware level and cannot be worked around. Document the limitation; do not promise users they can share their banking app.

Do I need a foreground service even if my app is in the foreground?

From Android 14 onwards, yes — even if your app is currently visible, you must run the projection inside a foreground service of type mediaProjection. The OS will throw SecurityException otherwise.

What FPS should I use for screen sharing?

15 fps is a good default for most content. Use 5–10 fps for static documents/slides, 24–30 fps for video playback or animation. Higher FPS uses more bandwidth and CPU and rarely improves user perception of static content.

Can I share the screen without WebRTC?

Yes — you can capture frames via MediaProjection and send them over RTSP, RTMP, HLS, or HTTP. WebRTC is preferred for low-latency, peer-to-peer or SFU-based scenarios; RTMP is preferred for one-to-many broadcast where 3–6 seconds of latency is acceptable.

How do I capture audio along with the screen?

Use AudioPlaybackCaptureConfiguration (Android 10+) with the projection to capture system audio. Mix it with the microphone AudioRecord if you want both. Apps that opt out via allowAudioPlaybackCapture="false" cannot be captured; this is at the source app’s discretion.

What is the maximum resolution I can capture?

Limited only by the device display resolution. In practice, you should downscale to 1280×720 or 1920×1080 for transmission — native resolution (e.g. 3200×1440) is wasteful and the remote viewer will downscale anyway.

Is screen sharing supported on Android 9 and below?

MediaProjection has been available since Android 5.0 (API 21). However, foreground service requirements changed dramatically across Android 8, 10, 13, and 14, so a single codebase that supports old and new versions ends up branching heavily. In 2026, supporting below Android 10 is rarely worth the effort — over 95% of active Android devices run 11+.

Can I do this without writing native code?

Yes — LiveKit, Daily, and 100ms all provide higher-level Android SDKs that wrap MediaProjection and WebRTC behind a simple API. If you do not need fine-grained control, this is the fastest path.


Sum up — ship Android WebRTC screen sharing in 2026

Pick a maintained WebRTC fork (getstream/webrtc-android), declare a foreground service of type mediaProjection, start that service before you ask the user for projection consent, hand the projection to ScreenCapturerAndroid, and add the resulting track to a PeerConnection with isScreencast=true. Default to 720p / 15 fps / 1.5 Mbps. Listen for the Android 14+ resize and visibility callbacks; plan for Android 15's lock-screen termination; design TURN failover for cellular. Test on Pixel, Samsung, and Xiaomi.

Fora Soft has shipped this pattern across multiple production Android apps including BrainCert, VOLO.live, and Sprii. If you are starting fresh or trying to fix a stalled WebRTC project, the next step is a 30-minute scoping call.

Building WebRTC on Android?

Get a fixed-price estimate in 48 hours.

We will scope the screen–share feature, recommend the right WebRTC stack, and price the work fixed — with estimates that beat agency rates by 30–40% thanks to our Agent Engineering process.

Book your 30-min call →