
Getting the instance of a system service without an application context in Android

I have been working on a project where I am supposed to execute some Java code, which uses Android APIs, outside the application environment. This means I don’t have access to any Android component, i.e. an Activity, Service, BroadcastReceiver, etc. I needed to get the display’s width and height, hence the need to hack together a way to achieve what the title says: getting the instance of a system service without having an application context.

Normally, you need an instance of the WindowManager service to get the device’s dimensions, and the code would be something like this –
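A minimal sketch of that usual route (any Context, e.g. an Activity, works):

```java
import android.content.Context;
import android.graphics.Point;
import android.view.Display;
import android.view.WindowManager;

public class DisplayUtils {

    // The standard SDK way: requires a Context to reach getSystemService().
    public static Point getDisplaySize(Context context) {
        WindowManager wm =
                (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        Display display = wm.getDefaultDisplay();
        Point size = new Point();
        display.getSize(size); // size.x = width, size.y = height
        return size;
    }
}
```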

This works fine in normal situations. However, if your code is not running inside an Android app, you don’t have an Android Context. This means that you can’t call getSystemService() in the first place.

A large part of the Android OS is written in Java and that Java code also accesses the system services without having the application context. We’ll follow the same approach with our “app”.

There’s a way of doing inter-process communication in Android using AIDL. AIDL is short for Android Interface Definition Language. Basically, you define the same AIDL interface on the server side and the client side, and the client is able to call functions of a server service like a normal procedure call.

Specifically for getting a reference of WindowManager service, we do something like this. First of all, create an aidl folder next to the java folder in Android Studio. Create a package called android.view inside the newly created aidl directory. There, create a file named IWindowManager.aidl which should contain the methods you want to execute. For instance, if you want to get the device’s dimensions it could look like this –
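A sketch of that interface (the exact method signature varies across Android versions; getInitialDisplaySize here is taken from the KitKat-era framework source):

```aidl
// app/src/main/aidl/android/view/IWindowManager.aidl
package android.view;

import android.graphics.Point;

interface IWindowManager {
    // Declare only the methods you actually call; the signatures must match
    // frameworks/base/core/java/android/view/IWindowManager.aidl exactly.
    void getInitialDisplaySize(int displayId, out Point size);
}
```

You may also need an android/graphics/Point.aidl file containing just `parcelable Point;` so the AIDL compiler knows Point is a Parcelable.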

Note: For any other system service, look for the corresponding aidl file for that service in the Android OS source tree.

After this, you need to get the IBinder object exposed by that service for IPC. For this, we need to call some hidden API methods via reflection.
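A sketch of that reflection dance (the hidden android.os.ServiceManager class and the "window" service name come from the AOSP source; IWindowManager.Stub is generated by the AIDL compiler from the interface we defined):

```java
import java.lang.reflect.Method;

import android.os.IBinder;
import android.view.IWindowManager;

public class WindowManagerGrabber {

    public static IWindowManager getIWindowManager() throws Exception {
        // android.os.ServiceManager is hidden from the SDK, so go via reflection.
        Class<?> smClass = Class.forName("android.os.ServiceManager");
        Method getService = smClass.getDeclaredMethod("getService", String.class);
        IBinder binder = (IBinder) getService.invoke(null, "window");
        // Stub.asInterface() turns the raw binder into a typed proxy.
        return IWindowManager.Stub.asInterface(binder);
    }
}
```

After that, getIWindowManager().getInitialDisplaySize(0, size) (or whichever method you declared) is an ordinary call.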

After getting the IWindowManager instance, we can easily call its methods like a normal procedure call.

Lastly, the API exposed by the service through the AIDL interface is not exactly the same as the one you get the normal way. However, you can easily find the corresponding methods for doing one thing or another, and sometimes even extra methods the public API doesn’t offer.

Create touch events programmatically in Android

Creating a touch event in Android is pretty easy. But is the easy way the best? Let’s find out!
The simplest way is to use the input binary, which can create touch events anywhere on the screen (via the shell user). Creating a simple tap is as easy as – adb shell input tap x y.

However, there’s a latency of more than a second when executing this command. If we’re building an automation-related app, we definitely don’t want that.

Another approach is to write values directly to the input device node files under /dev. This approach has slightly lower latency, but file writes are expensive anyway. On top of that, some devices/OEMs have different locations or names for the device nodes, which makes this approach a nightmare to implement.

Can we do better both in terms of consistency and latency? Definitely!

Luckily, there’s a class called InputManager in Android which has some interesting methods, although the methods of our interest are either private or hidden.

Now’s when Java’s Reflection API comes to the rescue. We’ll follow a series of hacky reflection API calls to get our input injection working.

First of all, we need the instance of InputManager class. For that, we’ll just invoke the getInstance method via reflection.
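A sketch of that first step (class and method names are from the AOSP source; this only works where the hidden API is present):

```java
import java.lang.reflect.Method;

import android.hardware.input.InputManager;

public class InputManagerGrabber {

    public static InputManager grab() throws Exception {
        // InputManager.getInstance() is a hidden static method.
        Method getInstance = InputManager.class.getDeclaredMethod("getInstance");
        getInstance.setAccessible(true);
        return (InputManager) getInstance.invoke(null);
    }
}
```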

Next, we need to make the obtain method of the MotionEvent class accessible. Then we will get a reflective reference to the injectInputEvent method.

We’re all set now, and just need to write the code that actually passes the touch events to the Android system. The code involves creating a MotionEvent object and calling injectInputEvent via reflection.
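Putting it together, a tap at (x, y) can be injected roughly like this (the injection mode constant 0, i.e. INJECT_INPUT_EVENT_MODE_ASYNC, is assumed from the AOSP source):

```java
import java.lang.reflect.Method;

import android.hardware.input.InputManager;
import android.os.SystemClock;
import android.view.InputDevice;
import android.view.InputEvent;
import android.view.MotionEvent;

public class TouchInjector {

    public static void tap(InputManager im, float x, float y) throws Exception {
        Method inject = InputManager.class.getMethod(
                "injectInputEvent", InputEvent.class, int.class);

        long downTime = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(downTime, downTime,
                MotionEvent.ACTION_DOWN, x, y, /* pressure */ 1f, /* size */ 1f,
                /* metaState */ 0, /* xPrecision */ 1f, /* yPrecision */ 1f,
                /* deviceId */ 0, /* edgeFlags */ 0);
        down.setSource(InputDevice.SOURCE_TOUCHSCREEN);
        inject.invoke(im, down, 0); // 0 = async injection mode (assumed)

        MotionEvent up = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(),
                MotionEvent.ACTION_UP, x, y, 1f, 1f, 0, 1f, 1f, 0, 0);
        up.setSource(InputDevice.SOURCE_TOUCHSCREEN);
        inject.invoke(im, up, 0);

        down.recycle();
        up.recycle();
    }
}
```

A swipe is the same idea with ACTION_MOVE events between the down and the up.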

Another awesome thing about this approach is that the framework will itself figure out whether the current event is a normal tap, a swipe, or a long press.

The full implementation of this approach is present in RemoteDroid – https://github.com/omerjerk/RemoteDroid/blob/master/app/src/main/java/in/omerjerk/remotedroid/app/EventInput.java.

Where can you create touch events?

Of course, a random user (every app runs as its own user) can’t just create touch events anywhere on the screen. An app can create touch events only on views owned by that same app.

But there’s a catch: the shell user (uid 2000) can create touch events all over the screen.

Now the question becomes: how would you start your app so that its code is executed by the shell user? Either figure this out yourself or search my blog for something close to that. 😉

Execute Java code as a root user in Android

You can find a lot of ways on StackOverflow to execute shell commands as root in Android, and it goes as follows:
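The usual recipe spawns an su process and writes commands to its stdin, roughly:

```java
import java.io.DataOutputStream;

public class RootShell {

    // Requires a rooted device with an su binary on the PATH.
    public static void run(String command) throws Exception {
        Process su = Runtime.getRuntime().exec("su");
        DataOutputStream stdin = new DataOutputStream(su.getOutputStream());
        stdin.writeBytes(command + "\n");
        stdin.writeBytes("exit\n");
        stdin.flush();
        su.waitFor();
    }
}
```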

What if you want to execute some Java code which uses Android APIs as a root user?

This is not as straightforward as executing a shell command with root.

The idea is to compile the Java class with a static main method just like a normal apk is compiled.

First of all, we need to create a class which is to be executed as the root user. This class can be placed along with the normal classes inside your Android app. It should look like this (of course, the name of the class can be anything):
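Something like this minimal class (the body is an illustrative sketch):

```java
package in.omerjerk.remotedroid.app;

/**
 * Entry point loaded by app_process. There is no Context here --
 * just a plain static main(), exactly like a desktop Java program.
 */
public class Main {

    public static void main(String[] args) {
        // android.os.Process.myUid() reports 0 when started via su.
        System.out.println("Running as uid " + android.os.Process.myUid());
    }
}
```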

Note that the package name of the above class is “in.omerjerk.remotedroid.app” which will be used to identify the above class later.

The hack is to start the above class just like the Android framework starts an app when you tap its launcher icon. We use the app_process binary to load our Java class. app_process is on the shell’s path (on some devices it exists as app_process32/app_process64 for the two ABIs) and can be invoked directly from a shell.
But here’s the trick: we will start app_process as the root user, which will in turn load our class, again with root as the executing user.

The above paragraph boils down to the following command, which is supposed to be executed on a rooted device.
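The apk path below is an example – find your app’s real path with pm path &lt;package&gt;:

```shell
# In a root (su) shell on the device:
export CLASSPATH=/data/app/in.omerjerk.remotedroid.app-1/base.apk
app_process /system/bin in.omerjerk.remotedroid.app.Main
```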

In the above command, replace in.omerjerk.remotedroid.app and Main with your package name and class name respectively.

That’s all you need and your Java class will get executed as the root user.

How to install an app to the /system partition?

Note: This article is for you if you’re making an app for rooted Android devices.

Normally, Android’s package manager installs the apk file to the /data partition. But for accessing hidden APIs and having extra privileges, one may want the app to be installed in the /system partition. For pre-KitKat devices the privileged app folder is /system/app, whereas for KitKat and later devices it is /system/priv-app.

The trick to installing your app to the /system partition is to first install it the normal way, then, on the first run of the app, move the apk to the /system/priv-app folder (/system/app for pre-KitKat devices). The following snippet of code makes this easy.
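The commands would be along these lines (paths are illustrative and version-dependent; the %s placeholders are filled in later with String.format()):

```shell
mount -o rw,remount /system
cat /data/app/%s-1/base.apk > /system/priv-app/%s.apk
chmod 644 /system/priv-app/%s.apk
mount -o ro,remount /system
# relaunch the activity from its new home
am start -n %s/.%s
```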

Use String.format() in your code to fill in the above commands as shown:
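For example (CommandFormatter and the exact template string are mine, not from the original post):

```java
public class CommandFormatter {

    /** Fills the %s placeholders with the apk file name and the package name. */
    public static String buildMoveCommand(String apkName, String packageName) {
        String template = "cat /data/app/%s > /system/priv-app/%s.apk\n"
                        + "chmod 644 /system/priv-app/%s.apk";
        return String.format(template, apkName, packageName, packageName);
    }

    public static void main(String[] args) {
        System.out.println(buildMoveCommand(
                "in.omerjerk.remotedroid.app-1.apk",
                "in.omerjerk.remotedroid.app"));
    }
}
```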

Execute the above formatted String after replacing the apk name, package name and activity name with yours, as shown:

I use libsuperuser to execute the su commands in Android. Executing these commands is trivial and totally up to you.

Working example : https://github.com/omerjerk/RemoteDroid/blob/master/app/src/main/java/in/omerjerk/remotedroid/app/MainActivity.java#L31

Getting video stream from Android’s display

People have tried to achieve this in various ways. What they usually do is take screenshots at regular intervals and stitch them together into a video. What I’m doing here is quite different and much better than that approach.

So here I present a way to capture video frames from Android’s default display and process them further as one pleases.

I’ll broadly use two main APIs, viz. MediaCodec (added in API level 16) and DisplayManager (added in API level 19). This limits our app to a minimum API level of 19, which is KitKat. Further, if you want to mirror the output of secure windows as well, you’ll have to push your apk to /system/priv-app/, which requires root access on your phone.

My logic is:

  • Create a video encoder.
  • Get an input Surface from the encoder using createInputSurface() method of the encoder object.
  • Pass this surface to DisplayManager so that the display manager will route its output to this surface.
  • Use the dequeueOutputBuffer() method, which will return the H.264 encoded frames of your video.
  • And if you want raw video frames, you can further pass these AVC encoded frames to a video decoder and get the raw frames. However, I’m not covering that in this blog post.

Let’s start with the code:

First we need to create an encoder, configure it and get an input surface out of it.
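A sketch of the encoder setup (the resolution, bitrate and frame rate values are arbitrary examples):

```java
import java.io.IOException;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

public class ScreenEncoder {

    public final MediaCodec encoder;
    public final Surface inputSurface;

    public ScreenEncoder(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        // The encoder reads its input from a Surface, not from byte buffers.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Must be called after configure() and before start().
        inputSurface = encoder.createInputSurface();
        encoder.start();
    }
}
```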

We will then pass the above created surface to createVirtualDisplay() method.
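Something like this (the virtual display name is arbitrary; the right flags vary by Android version and by whether the app is privileged):

```java
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;

public class MirrorDisplay {

    public static VirtualDisplay create(Context context, Surface encoderSurface,
                                        int width, int height) {
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        int densityDpi = context.getResources().getDisplayMetrics().densityDpi;
        // width/height should match the encoder's configured size.
        return dm.createVirtualDisplay("screen-mirror", width, height,
                densityDpi, encoderSurface, 0 /* flags: version-dependent */);
    }
}
```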

The DisplayManager will keep drawing the contents of the Android screen on our virtual display which in turn will feed the contents to video encoder through the surface.
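The drain loop then looks roughly like this (pre-API-21 buffer-array style, since the post targets KitKat):

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;

public class EncoderDrain {

    public static void drain(MediaCodec encoder) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        ByteBuffer[] outputBuffers = encoder.getOutputBuffers();
        while (true) {
            int index = encoder.dequeueOutputBuffer(info, 10_000 /* µs timeout */);
            if (index >= 0) {
                ByteBuffer encodedData = outputBuffers[index];
                encodedData.position(info.offset);
                encodedData.limit(info.offset + info.size);
                // encodedData now holds one H.264-encoded chunk:
                // write it to a file, a socket, or feed it to a decoder.
                encoder.releaseOutputBuffer(index, false);
            } else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputBuffers = encoder.getOutputBuffers();
            }
            // INFO_TRY_AGAIN_LATER (-1): no output yet, just loop again.
        }
    }
}
```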

The variable encodedData contains the AVC encoded frame and will be updated in every loop.
Now it’s up to you what you do with this. You can write it to a file, or stream it over the network (although that will require a lot more effort).
You can further pass it to a video decoder to get raw frames and convert them to bitmaps or whatever you like. I’ve implemented the passing of video frames to a video decoder in one of my projects here:
https://github.com/omerjerk/RemoteDroid/blob/master/app/src/main/java/in/omerjerk/remotedroid/app/ServerService.java#L285

My work at Cube26

Cube26 innovates in the areas of image processing, machine vision and machine learning. They basically focus on gesture- and image-based control of the device. At this time Cube26 has partnerships with 6 Indian phone manufacturers, including Micromax, Intex and Spice.

I’m in the Android development team at Cube26. At the time of writing this blog I’m working on the release of the Micromax A290. We currently don’t have access to the whole OS source, only some parts of it.

Below is the list of features that I implemented or partly implemented in the Canvas A290 release:

  1. Low-light reading (flux) mode: On my first day at Cube26 I started working on this flux mode. For the beta release I implemented it in the Launcher; later I implemented it in the SystemUI.
  2. Stamina Mode: Ah, there was a lot of code involved in Stamina mode. Its task was to kill all background apps, remove all apps from recents, and disable GPS, WiFi and Bluetooth. At 14% battery it is enabled automatically, and at 4% it kills the phone’s RF entirely by enabling Airplane mode.
  3. Vault: Vault itself was part of a feature called Eyeris. With Eyeris, your phone scans your retina, and if it matches the owner’s retina the phone is automatically unlocked and decrypted. I implemented the Eyeris functionality in the Gallery to hide images and in the messaging app to hide messages.
  4. Camera: For some of the later days I kept working on features in the Camera app: “say cheese” to capture a selfie, an anti-shake mode, and a front-and-back mode. Implementing all of them took quite some time.
  5. Look away to pause: To tell the truth, I didn’t write its image-processing code; the whole of that was written by some other genius soul. I just integrated it into the Gallery app (it’s one of the best features). If you look away from the screen while watching a video, the video automatically pauses.
  6. Auto Collage Maker: The core code of this feature was also written by someone else. I improved and enhanced it and merged it into the Gallery app.

Other projects:

  1. An app for LAVA tablets to search the internet for education-related content, given a topic as a query.
  2. A launcher to search your whole phone, providing suggestions in realtime. The search was based on Lucene.