Getting video stream from Android’s display
People have tried to achieve this in various ways. The usual approach is to take screenshots at regular intervals and stitch them together into a video. What I'm doing here is quite different and much better than that.
In this post I present a way to capture video frames from Android's default display and do further processing on them as one pleases.
I'll broadly use two main APIs: MediaCodec (added in API level 16) and DisplayManager (added in API level 19). This limits our app to a minimum API level of 19, i.e. KitKat. Furthermore, if you want to mirror the output of secure windows as well, you'll have to push your APK to /system/priv-app/, which requires root access on your phone.
My logic is:
- Create a video encoder.
- Get an input Surface from the encoder using its createInputSurface() method.
- Pass this surface to the DisplayManager so that it routes its output to this surface.
- Use the dequeueOutputBuffer() method, which returns the H.264 encoded frames of your video.
- If you want raw video frames, you can further pass these AVC encoded frames to a video decoder and get the raw frames. However, I'm not covering that in this blog post.
Let’s start with the code :
First we need to create an encoder, configure it and get an input surface out of it.
```java
MediaFormat mMediaFormat = MediaFormat.createVideoFormat("video/avc", 1280, 720);
// For a Nexus 5, the max bit-rate is 4 Mbps
mMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 3000000);
// For a Nexus 5, the supported frame-rate range is 15 to 30
mMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
mMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);

Log.i(TAG, "Starting encoder");
encoder = MediaCodec.createByCodecName(CodecUtils.selectCodec(CodecUtils.MIME_TYPE).getName());
encoder.configure(mMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface surface = encoder.createInputSurface();
encoder.start();
```
We will then pass the surface created above to the createVirtualDisplay() method.
```java
DisplayManager mDisplayManager =
        (DisplayManager) mContext.getSystemService(Context.DISPLAY_SERVICE);
mDisplayManager.createVirtualDisplay("OpenCV Virtual Display", 960, 1280, 150, surface,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC
                | DisplayManager.VIRTUAL_DISPLAY_FLAG_SECURE);
```
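The 960×1280 size and 150 dpi above are values I picked for one device; if you want the virtual display to match your screen's shape at a smaller size, you can scale the real display dimensions down while preserving the aspect ratio. A minimal sketch in plain Java (the class and method names here are mine, not part of any Android API):

```java
// Hypothetical helper: shrink a display size so neither dimension exceeds
// maxDim, preserving aspect ratio. Results are forced even, which video
// encoders generally prefer. Plain Java, no Android dependencies.
public class DisplayScaler {
    public static int[] scaleToFit(int width, int height, int maxDim) {
        if (width <= maxDim && height <= maxDim) {
            return new int[] {width, height};
        }
        double scale = (double) maxDim / Math.max(width, height);
        int w = ((int) Math.round(width * scale)) & ~1;
        int h = ((int) Math.round(height * scale)) & ~1;
        return new int[] {w, h};
    }

    public static void main(String[] args) {
        // A Nexus 5 screen is 1080x1920; fit it within 1280 pixels.
        int[] scaled = scaleToFit(1080, 1920, 1280);
        System.out.println(scaled[0] + "x" + scaled[1]); // prints "720x1280"
    }
}
```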
The DisplayManager will keep drawing the contents of the Android screen on our virtual display, which in turn feeds them to the video encoder through the surface.
```java
final int TIMEOUT_USEC = 10000;
ByteBuffer[] encoderOutputBuffers = encoder.getOutputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
while (!encoderDone) {
    int encoderStatus = encoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
    if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
        // no output available yet
        if (VERBOSE) Log.d(TAG, "no output from encoder available");
    } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        // not expected for an encoder
        encoderOutputBuffers = encoder.getOutputBuffers();
        if (VERBOSE) Log.d(TAG, "encoder output buffers changed");
    } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        // not expected for an encoder
        MediaFormat newFormat = encoder.getOutputFormat();
        if (VERBOSE) Log.d(TAG, "encoder output format changed: " + newFormat);
    } else {
        ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
        if (encodedData == null) {
            // something's wrong with the encoder
            break;
        }
        // It's usually necessary to adjust the ByteBuffer values to match BufferInfo.
        encodedData.position(info.offset);
        encodedData.limit(info.offset + info.size);
        encoder.releaseOutputBuffer(encoderStatus, false);
    }
}
```
The variable encodedData contains the AVC encoded frame and is updated on every iteration of the loop.
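Each chunk that dequeueOutputBuffer() hands back holds one or more H.264 NAL units in Annex-B form, delimited by a 00 00 01 or 00 00 00 01 start code. If you want to inspect what the encoder produced, you can scan for those start codes; here is a minimal sketch in plain Java (the class name is mine, not from the encoder API):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal Annex-B scanner: returns the byte offsets of every NAL start code.
// Handles both 3-byte (00 00 01) and 4-byte (00 00 00 01) start codes.
public class NalScanner {
    public static List<Integer> findStartCodes(byte[] data) {
        List<Integer> offsets = new ArrayList<>();
        int i = 0;
        while (i + 2 < data.length) {
            if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) {
                offsets.add(i);          // 3-byte start code
                i += 3;
            } else if (i + 3 < data.length && data[i] == 0 && data[i + 1] == 0
                    && data[i + 2] == 0 && data[i + 3] == 1) {
                offsets.add(i);          // 4-byte start code
                i += 4;
            } else {
                i++;
            }
        }
        return offsets;
    }
}
```

For an encoder's first output chunk you would typically see an SPS and a PPS NAL unit back to back, each introduced by its own start code.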
Now it's up to you what you do with this. You can write it to a file, or you can stream it over the network (although that will require a lot more effort).
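Dumping the encoded chunks to a raw .h264 file is the simplest option; such a file can be played back directly with a tool like ffplay. A hedged sketch in plain Java (the class name and file path are mine; it assumes position/limit are already set on the buffer, as in the dequeue loop above):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;

// Appends each encoded chunk to a raw Annex-B .h264 file. The ByteBuffer's
// position and limit are assumed to cover exactly one encoded chunk.
public class RawH264Writer implements AutoCloseable {
    private final FileChannel channel;

    public RawH264Writer(String path) throws IOException {
        this.channel = new FileOutputStream(path).getChannel();
    }

    public void writeChunk(ByteBuffer encodedData) throws IOException {
        // FileChannel.write may not drain the buffer in one call; loop until done.
        while (encodedData.hasRemaining()) {
            channel.write(encodedData);
        }
    }

    @Override
    public void close() throws IOException {
        channel.close();
    }
}
```

In the dequeue loop you would call writeChunk(encodedData) just before releaseOutputBuffer(). Note that a raw stream like this has no container and therefore no seeking or duration metadata; wrapping it in MP4 is what Android's MediaMuxer is for.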
You can also pass it on to a video decoder to get raw frames and convert them to bitmaps or the like. I've implemented passing the encoded frames to a video decoder in one of my projects here:
https://github.com/omerjerk/RemoteDroid/blob/master/app/src/main/java/in/omerjerk/remotedroid/app/ServerService.java#L285