Last year I was presented with a pair of Android phones and asked to determine how financial accounts of the phones’ respective owners had been compromised when two-factor authentication (2FA, or multi-factor authentication) was employed by each user for the compromised accounts. Now, there are a number of ways this could potentially have happened depending on how 2FA is implemented including SIM swapping, 2FA bombing, and a compromised “token” (i.e., the phone). To make a long story short, I eventually discovered each phone had been compromised by a banking Trojan, which, among other things, had the ability to nab SMS messages.
During my examination, one of the things I noticed was a lack of information on how things would look should an Android phone be compromised by malware. Sure, examiners know what to look for: excessive battery usage, unusual .apk names, and the permissions an app requests in its AndroidManifest.xml file. What would that look like from an examiner's point of view, though? Googling for information about the particular Trojan I was investigating (and other Android malware) turned up only reverse-engineering (RE) analyses, which I am not knocking in any way. Having an analyst reverse a piece of malware to find out what it does is extremely helpful and provides clues to a device examiner on where they should start looking (yes, start), but an examiner should know how the items in an RE analysis translate to on-device artifacts, be able to interpret them accordingly, and be able to pivot to other parts of the phone as clues are discovered.
A Stat & Setup
Earlier this year Kaspersky released its annual mobile malware report, which is based on information they receive from their users. While there was a downward trend in mobile malware infection volume in 2021, most likely due to workers returning to the office post-COVID lockdowns, the sophistication of attackers appears to have increased (Pegasus, anyone?). Google's malware issues in the Play Store still persist, there are 1-click and zero-click vulnerabilities for both OSes, and the list goes on. Suffice it to say, mobile malware is not going away, and it will continue to increase in complexity as Apple & Google continue to harden their respective OSes. So, it is important that examiners always be on the lookout for malware. It can be the culprit in account compromises, the source of rogue SMS messages and phone calls, excessive phone bills, or worse.
In light of the above, I decided to take a look at a recent piece of Android malware from the perspective of both a user and an examiner. I thought this would be a great exercise for my own personal knowledge and, hopefully, could help others as I think documentation about Android malware from these perspectives is slightly under-represented. 🙂
To accomplish this I used a test Pixel 3 running Android 10 & 12, with the latter being the latest version of 12 that could run on the device. For connectivity, I did not initially apply cellular service to the device; instead, I put it on an isolated VLAN. I did not sign in to Google, so no apps came direct from the Play Store, but I did side-load several apps on the phone to make it look "lived in" and signed into a couple of them (Signal & WhatsApp). All of this happened a few times (more on that later). Also, I used a few dummy accounts to chat with the phone to generate some data.
Most of what is discussed in this post will require a full file system (FFS) extraction, so an examiner's ability to see the artifacts discussed will depend on the extraction capabilities of the tool(s) of choice. During this exercise I did have access to the full file system.
The scenario I had in mind was one that is similar to what I encountered last year. Odd behavior or something malicious had occurred, and the question posed was whether or not the phone was compromised in any fashion.
I do want to go ahead and level-set on a couple of matters. First, there will be no RE in this article as that is not its intent. There are plenty of other talented and knowledgeable people out there who do RE daily, and their work product can easily be found with a quick Google search. So, I tip my hat to those folks, say “thank you,” and will continue on in my lane.
Second, the methods and artifacts described in this post are not the end all be all. The intent here is to give an examiner a few ideas on where to start looking, and how things could potentially look once they do. Each piece of Android malware is different. They have different capabilities and how the malware operates will be dependent on the phone on which it lands (i.e. phone state and settings), how much a user allows it to get away with during operation, and how a threat actor may operate it. Thus, artifacts described in this post will likely look different from infection to infection.
(Pablo) Escobar – Not The One You’re Thinkin’
No, no, not the infamous '70s/'80s drug dealer. In early March a suspicious .apk came to light on VirusTotal and was highlighted by MalwareHunterTeam on Twitter. Eventually, the .apk was analyzed by Cyble. The findings in the report are interesting, but, in short, the app is bad news. Especially the part about being able to steal Google Authenticator codes. Security consultants encourage the use of 2FA as one layer of protection to help prevent account compromises, but Escobar has, apparently, figured out a way to get around that issue where Google Authenticator is concerned.
To see this app (hopefully) in action, I first loaded it on the Pixel running Android 10 by pushing the .apk file to the phone via adb and then attempting an install from there. Prior to installation I had disabled Play Protect from the Settings menu. The video below is a short clip that shows what happened.
There is a lot to unpack in this video clip. First, note that the phone was connected to the isolated VLAN, which means it was able to reach out to the wider Internet. Second, even though I had disabled Play Protect, it turned itself back on right after installation. At the time of my testing, the hash values for the .apk had been published, so it's likely that the hash had been banned by Google on the backend, and Play Protect, which can "phone home," removed it.
But the biggest thing was what happened in the middle. Escobar needed access to Accessibility Services, which it used to grant itself the desired permissions in rapid succession. Not only do Accessibility permissions allow that to occur (allowing Escobar to click "grant" to the permissions), but they also allow for other things such as being able to track screen presses (key logging), interacting with apps without user intervention, and overlaying content on the screen (credential harvesting via a fake login page displayed by the malware). Figure 1, which is a still from the video, shows these warnings as presented to the user.
This type of behavior (rapid succession of granting permissions) may be a memorable event, one that device owners may recall if questioned.
After this failure, I re-flashed the phone with Android 12, repeated the process of side-loading apps and attempting to install the malware, and got the same results. I finally realized that the only way I would get the malware to successfully install on the device was to install it while the phone was offline. See video below.
The behavior was the same as seen in the first video, but Play Protect was slow enough that I could turn it off so the malware could complete its installation. So, from an advice perspective, you can never go wrong by telling an Android user to make sure they keep Play Protect active on their device, since it does provide a level of protection, albeit not complete protection.
Also note the app removed itself from the most recently used apps. Sneaky.
Once the malware was installed, I applied cellular service to the Pixel, slipped it in my back pocket, and started carrying it around, using it periodically over a few weeks.
Shortly after install and some usage I noticed some odd behavior. See Figure 2.
This phone got really hot. Not to the point that I believed it would spontaneously combust, but there was a definite & noticeable increase in temperature, even when the phone was in my back pocket. Eventually, I noticed there was no correlation between my using the phone and the increase in temperature, which led me to believe the phone was working overtime doing…something. My mind immediately went to the battery and battery usage, which is a place examiners typically like to look for indications of activity by a malicious app.
Android 12 has a new feature that Kevin Pagano wrote about: Battery Usage. In short, Battery Usage tracks battery usage, per app, for a 24-hour period. The file that powers Battery Usage is battery-usage-db-v4 (/USERDATA/data/com.google.android.settings.intelligence/databases), and it has a single table of interest: BatteryState. See Figure 3.
The nice thing about this database is that it keeps things tidy when reporting battery usage per app. Since the phone was getting noticeably warm, one of the things we would want to know is what is consuming the battery power. With that in mind, the highlights, to me, are appLabel, packageName, consumePower, foregroundUsageTimeInMs, and backgroundUsageTimeInMs. A quick SQL query can pull it all together:
SELECT
BatteryState.appLabel AS "App Name",
BatteryState.packageName AS "Package Name",
BatteryState.consumePower AS "Power Consumed",
BatteryState.foregroundUsageTimeInMs AS "Foreground Usage Since Last Boot (ms)",
BatteryState.backgroundUsageTimeInMs AS "Background Usage Since Last Boot (ms)"
FROM BatteryState
ORDER BY "Power Consumed" DESC
If you do not want to mess with a SQL query, ALEAPP will easily parse this database for you.
See Figure 4.
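If you prefer to script the triage yourself, the same query runs fine under Python's sqlite3 module. The sketch below is self-contained: it builds a mock BatteryState table using the column names discussed above (the sample rows and power values are invented, not from the test phone) and runs the query against it:

```python
import sqlite3

# Build a mock battery-usage-db-v4 with only the columns used here.
# Column names come from the BatteryState table; the rows are invented.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE BatteryState (
    appLabel TEXT, packageName TEXT, consumePower REAL,
    foregroundUsageTimeInMs INTEGER, backgroundUsageTimeInMs INTEGER)""")
con.executemany(
    "INSERT INTO BatteryState VALUES (?, ?, ?, ?, ?)",
    [("Chrome", "com.android.chrome", 120.5, 5400000, 120000),
     ("McAfee", "com.escobar.pablo", 310.2, 7200000, 0),
     ("Signal", "org.thoughtcrime.securesms", 80.0, 3600000, 60000)])

query = """
SELECT
    BatteryState.appLabel AS "App Name",
    BatteryState.packageName AS "Package Name",
    BatteryState.consumePower AS "Power Consumed",
    BatteryState.foregroundUsageTimeInMs AS "Foreground Usage Since Last Boot (ms)",
    BatteryState.backgroundUsageTimeInMs AS "Background Usage Since Last Boot (ms)"
FROM BatteryState
ORDER BY "Power Consumed" DESC
"""
for row in con.execute(query):
    print(row)
```

Against a real extraction, you would point sqlite3.connect() at a copy of battery-usage-db-v4 and skip the mock-table setup.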
Two things an examiner should expect to see are phone idling and screen usage. There is no package name associated with either, which is, again, expected. Depending on the habits of the user, these items will, most likely, be somewhere near or at the top of the results of this query. During testing I spent a good amount of time messaging and cruising the web using Chrome, with occasional phone calls. Even with that usage, Escobar was consuming enough power to put it in the same league as Phone idle and the screen, which was a nice find. Also note the package name for Escobar is, well…unique, and the associated app label, McAfee, doesn't really go with the package name. This is a strange package name/label combination, a red flag that should be investigated further. Switching back over to the BatteryState table, I filtered the table by the suspicious package name and had a look at the values in foregroundUsageTimeInMs and backgroundUsageTimeInMs. See Figure 5.
Notice that the app is consuming a decent amount of power, but there is absolutely no background usage. One thing I will mention is that after I installed Escobar, I was unable to open the app again, in the traditional sense. Pressing on the app icon in the app tray caused the screen to flash, and then I was taken back to the app tray screen. I was also unable to access the app from Settings > Apps; any attempt at doing so would back me out of Settings to the Home Screen. This behavior is likely a defensive mechanism of the app, since the author clearly does not want the phone user messing with it. From an investigative standpoint, questions about unusual app behavior such as this would be good to pose to the device user.
Just based on the examination of this database, we now have an app label/name (McAfee), an associated package name that does not seem to match the label (com.escobar.pablo), and a decent amount of foreground battery usage. But what if the device being examined is not on Android 12, or simply does not have battery-usage-db-v4? This would describe the majority of Android phones out in the wild at the moment. Usage Stats may be of help, but in this case Escobar only had entries for two days, neither of which was in the past 24 hours as was seen in battery-usage-db-v4.
Again, we are talking about battery usage and history, and Google has a nice developer utility, Battery Historian, that may be able to help. Battery Historian is designed to, among other things, help developers identify issues with their apps causing excessive battery drain. Battery Historian utilizes Docker, so if you are unfamiliar with Docker, not to worry: Google provides a fairly simple tutorial on how to get Docker and Battery Historian set up; there is some light command line work needed, but it is worth it. A few adb commands need to be run on the device being examined in order to extract the type of data Battery Historian can ingest. Again, Google provides those, but they are:
adb shell dumpsys batterystats > %PATH_TO_DESIRED_WORKSTATION_LOCATION%/batterystats.txt
adb bugreport %PATH_TO_DESIRED_WORKSTATION_LOCATION%/bugreport.zip
If the device is on Android 6.x or lower the latter command changes a bit:
adb bugreport %PATH_TO_DESIRED_WORKSTATION_LOCATION%/bugreport.txt
Once these are extracted, you can open them in Battery Historian. Figure 6 shows the main screen once the data is imported.
The screen is super busy, but since we have a package name, we can filter in certain areas in Battery Historian using the name. Figure 7 shows the visual graph using "escobar" as a filter.
The screen practically went blank, but there is another clue. The row App Processor wakeup has multiple entries during the 24-hour period represented in the graph, and they appear to occur at somewhat regular intervals, although there are some definite gaps. Compared to other apps I used heavily (Chrome, Messages, and Signal), this is an outlier, and it warrants the question "Why is this app waking the processor so much?" When comparing it to other apps on the phone, the only others that showed this much wakeup activity were Google itself and the Carrier Services application (com.google.ims). This deserves further scrutiny. Remember this because it will be discussed later when looking at app permissions.
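One way to put a number on that "somewhat regular" impression is to compute the gaps between successive wakeups and check how tightly they cluster around the mean gap. The sketch below is a rough heuristic of my own, not a Battery Historian feature; the timestamps and the 0.25 tolerance are invented for illustration:

```python
from statistics import mean, stdev

def wakeup_intervals(timestamps):
    """Return the gaps, in seconds, between successive wakeup timestamps."""
    ts = sorted(timestamps)
    return [b - a for a, b in zip(ts, ts[1:])]

def looks_periodic(timestamps, tolerance=0.25):
    """Heuristic: intervals are 'regular' if their spread (stdev) is small
    relative to the mean gap. The tolerance value is an invented threshold."""
    gaps = wakeup_intervals(timestamps)
    if len(gaps) < 2:
        return False
    return stdev(gaps) / mean(gaps) < tolerance

# Invented wakeup times, roughly every 300 seconds with a little jitter.
wakeups = [0, 302, 598, 901, 1199, 1502, 1798, 2103]
print(looks_periodic(wakeups))
```

A series driven by human use (bursty, irregular) will fail this check, while a timer-driven beacon tends to pass it. Either way, the numbers only inform the judgment; they don't replace it.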
Below the graph is an area where an examiner can isolate stats on a per-app basis. The red box in Figure 8 shows how Escobar's battery usage looks compared to other apps that are on the device. Looking at Escobar by itself is good, but putting things in context is better. Obviously, this will differ from exam to exam depending on how a user typically uses their phone.
The tables correspond to the tabs on the left in the blue box. Again, since we knew we were dealing with a physically hot phone and Escobar was waking the processor, I first looked at CPU Usage By App in user space and sorted from highest to lowest. See Figure 9.
Escobar had the highest level of usage compared to other apps on the phone, and it also had the highest User Time amongst all the apps even though I did not interact with it once after the initial installation (because I couldn’t – Escobar prevented me from opening it after installation). Again, depending on the user and the capabilities of the malware, this may not always be the case. Examiners will need to know user habits and examine this data contextually with what is on the device.
Battery Historian has many other features that could be useful, so it is worth exploring further. For example, you could isolate Escobar here to see the amount of data the app has transferred via WiFi or cellular, its usage of sensors, and its CPU usage.
Privacy Dashboard & Digital Wellbeing
Android 12 introduced Privacy Dashboard. If you need a primer you can read about it here, but just know that it is designed to keep track of apps that use a phone’s location, microphone, or camera during a 24-hour period. Figure 10 shows Privacy Dashboard as it would appear to a user, and reveals “McAfee” using my location quite often.
I have to applaud Google for including this feature, as it does give a user some insight into what's happening on their phone. As a user, I would ask why my (alleged) anti-virus utility would need my location, and why it is needed so much. I would really freak out if I saw it using my camera or microphone (two things it has permissions to do). During testing I was hoping Privacy Dashboard would catch Escobar using the microphone and/or camera, but it never did.
One thing I noticed was that my location was being used each time my phone was unlocked or when the screen came on while locked, and then, in periods of constant use (e.g. I was chatting with one of my other fake personas so the phone was unlocked and actively being used) it would continue to use my location at various intervals. In Android 12, there are two places to partially confirm my suspicions: Privacy Dashboard and Digital Wellbeing. Digital Wellbeing keeps track of, among other things, device unlocks, so it would be a matter of comparing the data in those two places. For a quick win, I used ALEAPP as I know it is able to parse both. See Figures 11 (Digital Wellbeing) & 12 (Privacy Dashboard).
As can be seen, the location usage roughly coincides with device unlocks. There is also the issue of the app using my location consistently while the phone is actively being used. This data point is concerning, especially when coupled with the information learned from analyzing the battery usage (a lot of battery usage and a potentially mismatched app name/package name). If this was a matter of stalker-ware, I would be even more concerned.
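The comparison itself is easy to script once timestamps are exported from both sources. The sketch below is a hypothetical helper, with invented epoch-seconds values standing in for Digital Wellbeing unlock times and Privacy Dashboard location-access times:

```python
from bisect import bisect_right

def near_unlock(location_times, unlock_times, window=30):
    """Return location-access timestamps that occur within `window`
    seconds at or after any device unlock (all times in epoch seconds)."""
    unlocks = sorted(unlock_times)
    hits = []
    for t in location_times:
        i = bisect_right(unlocks, t)
        # Compare against the unlock at or just before time t.
        if i > 0 and t - unlocks[i - 1] <= window:
            hits.append(t)
    return hits

# Invented timestamps: three unlocks, four location accesses.
unlocks = [1000, 2000, 3000]
locations = [1005, 1500, 2010, 3025]
hits = near_unlock(locations, unlocks)
print(f"{len(hits)} of {len(locations)} location accesses "
      f"followed an unlock within 30s: {hits}")
```

If nearly every unlock is shadowed by a location access from the same suspicious package, that pattern is worth documenting in the report.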
Keeping score, we have highly unusual (bordering on excessive) battery usage and highly suspect location usage by an app whose label does not appear to go with its package name. At this point there should be a lot of Googling happening to see if we can find any reference to the package name associated with this behavior. However, if this is a new piece of mobile malware, we would likely turn up nothing during our searches.
Recall the installation of Escobar seen in the videos. As part of the installation process I had to enable accessibility services so the app could grant itself a litany of permissions. Abuse of Accessibility Services is something that has been associated with Android malware over the past few years, and it is important to keep that in mind when hunting for Android malware artifacts. Information about Accessibility Service settings is kept in settings_secure.xml located in /USERDATA/system/%USER_NUMBER%/. Since Android 12 is being examined, the file will be in the new ABX format, so using Alex Caithness’ Python script is ideal here for another quick win. Alex’s script outputs the file in a single line, so, for purposes of this article, I have formatted the file for easier reading. If the device being examined was Android 11 or below, this file would be in the normal XML format and easily readable.
settings_secure.xml contains several tag values for Accessibility Services, a couple of which are pertinent here. The first is accessibility_enabled. See Figure 13.
Based on my testing, the value is “1” if any portion of Accessibility Services is enabled (red box in Figure 13). Keep in mind that Accessibility Services being enabled isn’t necessarily malicious, and users/apps may have a legitimate purpose for having one or more services enabled. Thus, the enabled functionality needs to be evaluated within context. Here the value is “1” because the service was enabled during Escobar’s installation. If at some point Accessibility Services is turned off, then the value reverts to 0.
The second value of interest is enabled_accessibility_services. See Figure 14.
This tag value keeps track of what apps have Accessibility Services enabled. Escobar’s package name is seen in Figure 14, which makes sense here since I enabled Accessibility Services for it during installation. To further test this tag value, I installed two apps that had Accessibility Services capabilities on a separate phone, enabled the service for both, and examined. I found that both package names were present, so it is possible the enabled_accessibility_services tag value may have multiple package names. As with the previous tag value, this data point should be evaluated in context.
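For the plain-XML flavor of this file (Android 11 and below, or ABX output converted with Alex's script), pulling these two tag values can be scripted. The fragment below is hand-made: the `<setting name=... value=...>` layout mirrors the real file, but attribute details vary by build, and the service component name shown is invented:

```python
import xml.etree.ElementTree as ET

# A trimmed, hand-made settings_secure.xml fragment. The <setting>
# layout mirrors the plain-XML (pre-ABX) file, but attribute details
# can vary by Android build; the Escobar service name is invented.
SAMPLE = """<settings version="210">
  <setting id="41" name="accessibility_enabled" value="1"
           package="com.android.settings"/>
  <setting id="42" name="enabled_accessibility_services"
           value="com.escobar.pablo/com.escobar.pablo.AccService"
           package="com.android.settings"/>
</settings>"""

def accessibility_settings(xml_text):
    """Pull the two Accessibility Services tag values discussed above."""
    root = ET.fromstring(xml_text)
    out = {}
    for setting in root.iter("setting"):
        name = setting.get("name")
        if name in ("accessibility_enabled", "enabled_accessibility_services"):
            out[name] = setting.get("value")
    return out

print(accessibility_settings(SAMPLE))
```

Remember that enabled_accessibility_services can hold multiple package names, so every entry in that value deserves a look, not just the first.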
Going back to the installation again, after enabling Accessibility Services for Escobar, it granted itself a bunch of permissions. Typically, when discussing permissions, examiners think of the AndroidManifest.xml file, and this would make sense for phones running anything below Android 6. Permissions on those phones were granted en masse at the time an app was installed, and, once granted, there was no way to revoke the permissions without uninstalling the app. However, Android 6 introduced a new permission model in which certain permissions were granted at the time of an app’s first run, and this model has continued through Android 12. These certain permissions are called runtime or “Dangerous” permissions, and you can read more about them here. Dangerous permissions are those that Google thinks are the most invasive, and considers them important enough to explicitly ask the user about when an app is first run. They include accessing location, SMS messages, call logs, phone contacts, and files on the phone.
When an app is first run or when a user logs into an app for the first time, a user is asked whether or not they will grant permission to the app to perform certain actions or access certain data. Figure 15 is representative of what a user would see when an app is requesting access to runtime permissions.
The important thing to remember is that there can be a difference between the Dangerous permissions an app requests in its AndroidManifest.xml file and what permissions it actually has on the device. In the case of malware, it is important to know what permission(s) an app actually has in order to understand if it had the ability(ies) to perform its intended function(s). An examiner can determine the Dangerous permissions for each app by looking at the file runtime-permissions.xml. The file’s location will vary depending on what version of Android is being examined:
Android 10 & below: /data/system/users/%USER_NUMBER%/
Android 11 & 12: /data/misc_de/%USER_NUMBER%/apexdata/com.android.permission/
Again, the device being used here was running Android 12, so runtime-permissions.xml was found in the latter file path, and is in the ABX format. Figure 16 shows the permissions Escobar had on the test device.
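As with settings_secure.xml, the plain-XML form of runtime-permissions.xml can be scripted once it is converted out of ABX. The fragment below is hand-made for illustration; the `<pkg>`/`<item>` element names follow the older plain-XML layout and may differ between Android versions:

```python
import xml.etree.ElementTree as ET

# Hand-made runtime-permissions.xml fragment. Element names follow the
# older plain-XML layout and may differ between Android versions; the
# fingerprint and the specific grant states are invented.
SAMPLE = """<runtime-permissions fingerprint="google/blueline">
  <pkg name="com.escobar.pablo">
    <item name="android.permission.RECORD_AUDIO" granted="true"/>
    <item name="android.permission.READ_SMS" granted="true"/>
    <item name="android.permission.CAMERA" granted="false"/>
  </pkg>
</runtime-permissions>"""

def granted_permissions(xml_text, package):
    """List the runtime permissions actually granted to one package."""
    root = ET.fromstring(xml_text)
    for pkg in root.iter("pkg"):
        if pkg.get("name") == package:
            return [item.get("name") for item in pkg.iter("item")
                    if item.get("granted") == "true"]
    return []

print(granted_permissions(SAMPLE, "com.escobar.pablo"))
```

The point of scripting this is the same point made above: what an app requests in its manifest and what it actually holds on the device can differ, and only the latter tells you what the malware could really do.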
There are two permissions I want to highlight. The first is DISABLE_KEYGUARD. This permission allows an app to disable a phone’s passcode. A good example of this is when a locked phone receives a phone call. The phone app is able to disable the phone passcode in order to present the user with the phone interface so that they can answer the call. It is able to do so because it has the DISABLE_KEYGUARD permission. Here, Escobar (“McAfee”) has that permission. Why it has this permission is a good question.
The second permission is WAKE_LOCK. This permission allows an app to keep the phone’s CPU awake. Normally, an app would use this permission to keep the CPU awake so that it can complete a particular operation, but would then allow the CPU to go back to sleep. Google stresses in its developer documentation that this permission should be used sparingly. If you recall from Battery Historian (Figure 7), Escobar woke the processor at what appeared to be regular intervals, and now we know how it was able to do it; it has the WAKE_LOCK permission. Regardless, the permission coupled with the behavior seen in Battery Historian is highly unusual.
The remaining permissions are somewhat self-explanatory. The app has access to just about everything important to a user (i.e., they're "Dangerous"). It can use the microphone to record audio, use the camera, read & send SMS messages, get the phone's location, get information about accounts on the phone, access the call logs, make phone calls, and read & write to the phone's storage. The videos at the beginning of this post show Escobar granting itself all of these permissions in rapid succession.
And The App
I was hoping that Escobar would actually do something during this exercise, but I ended up slightly disappointed. There could be any number of reasons why it never did. One thing I will note is that, according to its "author," it is currently in beta, so that is one possible explanation.
Another may be that I did not sign in to any Google account. There could be numerous others. Malware researchers will tell you how frustrating it can be sometimes, and after this exercise, I fully understand what they mean. Regardless, there wasn't much to look at.
The app data…if you can call it that…for this particular piece of malware resides in /USERDATA/data/com.escobar.pablo. The file structure is rather simple. See Figures 17, 18 & 19.
There was not much to look at, but there are a few files that could be helpful. The first was pref_store, which was located in the ~/app_webview/ directory path. See Figure 20.
The timestamp seen in Figure 20 is the time I installed Escobar. This could be helpful when attempting to determine the window of time a device was compromised and when any sensitive data was potentially at risk.
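When converting timestamps like this one, keep in mind that Chromium-style files can mix formats: some values are Unix epoch milliseconds, while others use the WebKit convention of microseconds since 1601-01-01. A small converter for both (the sample values below are invented, not taken from the phone):

```python
from datetime import datetime, timedelta, timezone

def from_unix_ms(ms):
    """Unix epoch milliseconds -> timezone-aware datetime (UTC)."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

def from_webkit(us):
    """Chromium/WebKit timestamp (microseconds since 1601-01-01) -> UTC."""
    return datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=us)

# Invented sample values, roughly spring 2022.
print(from_unix_ms(1648400000000))
print(from_webkit(13293540000000000))
```

Converting a candidate value both ways and seeing which result lands in a plausible window is a quick sanity check before committing a time to a report.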
The next file is sharedPrefs.xml, which is located in ~/shared_prefs. See Figure 21.
Based on what is known about Escobar, I suspect this file contains the status of certain settings. Escobar has the ability to disable Play Protect (remember I had disabled it) and uninstall itself as a protection mechanism (red box) should the operator decide it's necessary.
The last file is actually a database, Web Data, which is located in ~/app_webview/Default/. See Figure 22.
During the exercise I ran FFS extractions on the infected device multiple times, and each time this database was empty. But the table names in the database are notable. Many of these tables are the same as those from the Web Data database seen in Chromium-based web browsers. The same goes for another database in the same location, Cookies (not pictured, but seen in Figure 18). The presence of these browser-style databases inside the malware's private directory is suspicious. However, since it is known Escobar is designed to steal sensitive data, stealing data from a web browser makes sense.
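Comparing a recovered database's table names against known Chromium schemas is easy to automate. The sketch below uses a short, hand-picked list of Web Data table names and a throwaway demo database; a real comparison would use a fuller list:

```python
import os
import sqlite3
import tempfile

# Table names commonly seen in a Chromium "Web Data" database (a
# partial list). Finding them in an app's private data directory,
# outside any browser, is a red flag worth chasing.
CHROMIUM_WEB_DATA_TABLES = {
    "autofill", "credit_cards", "keywords",
    "token_service", "masked_credit_cards",
}

def suspicious_tables(db_path):
    """Return table names in db_path that match known Chromium tables."""
    con = sqlite3.connect(db_path)
    names = {row[0] for row in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")}
    con.close()
    return sorted(names & CHROMIUM_WEB_DATA_TABLES)

# Demo against a throwaway database mimicking the structure (not the
# content) of the empty "Web Data" file recovered from the phone.
path = os.path.join(tempfile.mkdtemp(), "Web Data")
con = sqlite3.connect(path)
for name in ("autofill", "credit_cards", "meta"):
    con.execute(f'CREATE TABLE "{name}" (id INTEGER)')
con.commit()
con.close()
print(suspicious_tables(path))
```

Even an empty database, as here, tells you something: the schema reveals what the malware intended to collect, if not what it actually got.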
The files in this package and discussed here are unique to Escobar, but there is a point to showing them here. Examiners should fully examine the contents of any potential malware discovered in an attempt to find any artifacts it may have left behind. You never know what will be found.
At this point in the exam I would be uploading the .apk (recovered from the phone) to an online sandbox (Joe Sandbox, AnyRun), uploading the hash of the .apk to VirusTotal, or calling for help from a reverse engineer. Probably all three.
Android malware is not going away, and its complexity is increasing. Examiners should be diligent during their examinations since they could encounter malware in any exam they perform.
Android malware can be difficult to detect, but it is not impossible. Remember the Malware Paradox: malware can hide, but it must run. When it runs on Android, there are several places examiners can check to detect its presence. These artifacts, individually, don’t mean too much, but examining them collectively and within context can reveal the presence of malware.
Determining if the malware was successful in its mission is another matter altogether.