
Take screenshot programmatically without root in Android


Here’s another post related to Android hacking.

You can find a lot of guides on the internet telling you how to take a screenshot of your own app, and that's pretty easy; it's something the Android framework allows. Note that I'm not talking about the screenshots you take (as a user) by pressing Power + Volume down (that screenshot is taken by the SystemUI, which has extra privileges). Let's talk about the case where you want to take a screenshot programmatically from your own app/service and want to cover the whole screen.

Because of Android's security model, if you take a screenshot of the screen programmatically, the only views visible will be the ones created by your own app. It won't contain the views created by any other app.

So, I present a way to take screenshots that include other apps' views, programmatically from your own app. The trick here is to use Android's MediaProjection API. Also, do note that this sets the minimum API level to 21.

What we'll do is render the Android display's contents on a Surface using the MediaProjection API. The toughest part, though, is getting the contents of this Surface and creating a proper Bitmap out of it. There's no direct way of doing that.

Over the past few days, I tried different ways and failed –

  1. Create the Surface from the encoder, then attach a decoder to the encoder output, and create a Bitmap from the decoder’s raw output. (Didn’t work and I wasn’t surprised)
  2. Create an OpenGL texture, create a Surface backed by that texture (so whatever is drawn on the Surface automatically gets passed to the OpenGL texture), and then call glReadPixels() to get raw pixel data to create the Bitmap. (Should've worked, but for some reason I was just getting a green-colored image)
  3. Then I tried Android’s ImageReader API and that finally worked out.

Enough of the bullshit, let’s start with some code.

First of all, we need to create an ImageReader instance and get its Surface.
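Something along these lines should do; mWidth and mHeight are assumptions here (query the real screen size from your DisplayMetrics):

// Sketch: set up the ImageReader that will receive the screen contents.
// mWidth and mHeight are assumed to come from DisplayMetrics.
ImageReader mImageReader = ImageReader.newInstance(
        mWidth,                 // screen width in pixels
        mHeight,                // screen height in pixels
        PixelFormat.RGBA_8888,  // pixel format the virtual display will render in
        2);                     // maximum number of images buffered at once
mImageReader.setOnImageAvailableListener(this, null);
Surface surface = mImageReader.getSurface(); // this is what MediaProjection will draw on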

Now, we need to pass this surface to the MediaProjection API. Please go through this demo code to learn how to create a MediaProjection object.
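Once you have the MediaProjection object, wiring it to the ImageReader's surface looks roughly like this (mMediaProjection, mWidth, mHeight and mDensityDpi are assumed to already be set up):

// Sketch: mirror the default display into the ImageReader's Surface.
VirtualDisplay mVirtualDisplay = mMediaProjection.createVirtualDisplay(
        "screenshot",                                     // debug name of the virtual display
        mWidth, mHeight, mDensityDpi,                     // capture size and density
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,  // mirror the real screen
        mImageReader.getSurface(),                        // the screen gets drawn here
        null, null);                                      // no callback, no handler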

Implement the ImageReader.OnImageAvailableListener in your class and do the following –
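Here's a rough sketch of the listener; the row-padding handling is the part that usually trips people up (mWidth and mHeight are the same values used for the ImageReader):

@Override
public void onImageAvailable(ImageReader reader) {
    Image image = null;
    try {
        image = reader.acquireLatestImage();
        if (image == null) {
            return;
        }
        Image.Plane plane = image.getPlanes()[0];
        ByteBuffer buffer = plane.getBuffer();
        int pixelStride = plane.getPixelStride();
        int rowStride = plane.getRowStride();
        // Rows may be padded, so create a slightly wider bitmap and crop it afterwards.
        int rowPadding = rowStride - pixelStride * mWidth;
        Bitmap padded = Bitmap.createBitmap(
                mWidth + rowPadding / pixelStride, mHeight, Bitmap.Config.ARGB_8888);
        padded.copyPixelsFromBuffer(buffer);
        Bitmap screenshot = Bitmap.createBitmap(padded, 0, 0, mWidth, mHeight);
        // do whatever you want with `screenshot` here
    } finally {
        if (image != null) {
            image.close();
        }
    }
}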

Also, make sure you have the following permission in your manifest.

In case you're too lazy and want everything set up for you, here's a small library I created – https://github.com/omerjerk/Screenshotter

Getting video stream from Android’s display


This is something people have tried to achieve in various ways. What most people do is take screenshots at regular intervals and stitch them together to make a video out of it. What I'm doing here is quite different and much better than that approach.

And so I also present a way to capture video frames from Android’s default display and do further processing as one pleases.

I'll broadly use two main APIs, viz. MediaCodec (added in API level 16) and DisplayManager (added in API level 19). So this will limit our app to a minimum API level of 19, which is KitKat. Furthermore, if you want to mirror the output of secure windows as well, you'll have to push your APK to /system/priv-app/, which requires root access on your phone.

My logic is:

  • Create a video encoder.
  • Get an input Surface from the encoder using its createInputSurface() method.
  • Pass this surface to the DisplayManager so that it routes its output to this surface.
  • Use the dequeueOutputBuffer() method, which will return the H.264 encoded frames of your video.
  • If you want raw video frames, you can further pass these AVC encoded frames to a video decoder and get the raw frames; however, I'm not covering that in this blog post.

Let's start with the code:

First we need to create an encoder, configure it and get an input surface out of it.
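A sketch of that setup is below; the bitrate, frame rate and I-frame interval are placeholder values you'll want to tune, and mEncoder is assumed to be a MediaCodec field on the class:

// Sketch: create and configure an H.264 (AVC) encoder that takes its input from a Surface.
private Surface prepareEncoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", mWidth, mHeight);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // input comes from a Surface
    format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);          // 4 Mbps, placeholder
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);             // placeholder
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);        // one key frame per second

    mEncoder = MediaCodec.createEncoderByType("video/avc");
    mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = mEncoder.createInputSurface(); // must be called between configure() and start()
    mEncoder.start();
    return inputSurface;
}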

We will then pass the surface created above to the createVirtualDisplay() method.
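For reference, the call might look like this (mDensityDpi is an assumption, and the code runs inside a Context such as a Service):

// Sketch: ask the DisplayManager to mirror the default display onto the encoder's surface.
// inputSurface is the Surface returned by prepareEncoder() above.
DisplayManager displayManager =
        (DisplayManager) getSystemService(Context.DISPLAY_SERVICE);
VirtualDisplay virtualDisplay = displayManager.createVirtualDisplay(
        "screen-mirror",                                  // name of the virtual display
        mWidth, mHeight, mDensityDpi,                     // same size as the encoder format
        inputSurface,                                     // the encoder's input surface
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR); // mirror the real screen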

The DisplayManager will keep drawing the contents of the Android screen onto our virtual display, which in turn feeds them to the video encoder through the surface.
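On a separate thread we then keep draining the encoder. A minimal sketch (mStopped is an assumed stop flag; this uses getOutputBuffers() so it stays API 19 compatible):

// Sketch of the drain loop; run this on its own thread.
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
ByteBuffer[] outputBuffers = mEncoder.getOutputBuffers();
while (!mStopped) {
    int index = mEncoder.dequeueOutputBuffer(bufferInfo, 10000); // wait up to 10 ms
    if (index >= 0) {
        ByteBuffer encodedData = outputBuffers[index];
        encodedData.position(bufferInfo.offset);
        encodedData.limit(bufferInfo.offset + bufferInfo.size);
        // encodedData now holds one H.264 (AVC) encoded chunk
        mEncoder.releaseOutputBuffer(index, false);
    } else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        outputBuffers = mEncoder.getOutputBuffers();
    } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat newFormat = mEncoder.getOutputFormat(); // carries the SPS/PPS (csd-0, csd-1)
    }
}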

The variable encodedData contains the AVC encoded frame and is updated in every loop iteration.
Now it's up to you what you do with it. You can write it to a file. You can stream it over the network (although that will take a lot more effort).
You can further pass it to a video decoder to get raw frames and convert them to a bitmap or something. I've implemented the passing of video frames to a video decoder in one of my projects here:
https://github.com/omerjerk/RemoteDroid/blob/master/app/src/main/java/in/omerjerk/remotedroid/app/ServerService.java#L285