Venmo. The App for Virtual Ballers.

I recently went on a trip which required hanging out in a couple of airport terminals. While waiting on my flights I saw the usual scene: a sea of people staring down at their phones. I am not going to delve into the obvious security concerns (whole different topic), but I was able to see many home screens from my vantage point, and I noticed a few consistent apps. One was a blue square/circle with a white ‘V’ in it: Venmo, the social media payment app.

Full disclosure: I am not a Venmo person. I live in Apple’s walled garden and like to use Apple Pay whenever I can. I am reluctant to provide any merchant or service any financial information as I have been caught up in a couple of data breaches, specifically, Target and Home Depot. While those two incidents involved POS weaknesses, the pain they caused made me extremely paranoid when it comes to my financial information.

Not everyone has that attitude, however. While conducting testing for this post I was amazed at how many people in my contact list use Venmo. My spouse even uses it (she had to show me the ropes when I first fired up the app). As of April of 2019, Venmo reported 40 million active users with quarterly payment volume around $21 billion; payment volume is expected to hit $100 billion for the entirety of 2019. Comparatively, it sits behind its parent company, PayPal, and Amazon Pay. Third isn’t too shabby in the mobile payment space.

Because Venmo is owned by PayPal, users can, at many merchants, use Venmo where PayPal is accepted, but using it at merchants isn’t Venmo’s main draw. The main draw is being able to pay people for whatever reason without having to use cash…especially since most individuals can’t accept cards and a majority of people, I would argue, don’t carry cash around. Venmo is also a social media app. It has a feed, similar to other social media feeds, that shows you what your “friends” are buying and selling. For example, in my Venmo social feed I saw payments for “schnacks,” daycare, date nights, booze, haircuts, groceries, prescription medication pickups, pizza, and a cornhole tournament. These were person-to-person transactions, not person-to-merchant. Paying people just by clicking a few buttons can be a lot easier than paying with cash (again, who does that nowadays???) or trying to figure out who accepts cards and who doesn’t. For some small merchants, it may also be cheaper to use Venmo than to accept cards and pay bank/card transaction fees.

There is also a public feed that shows what people (who you are not friends with) are buying and selling. This is notable as the default setting for transactions is “public” (more on that later).

This idea of a social feed is important. File it away for now because I will bring it up later when discussing iOS.

Like everything else, people have found a way to use Venmo for questionable activities. I found a site, Vicemo, that uses Venmo’s public API to track what it thinks are payments for booze, strippers, and narcotics. In 2017 there was a study that found a third of Venmo users have paid for weed, cocaine, and Adderall with Venmo, and that 21 percent of Venmo users have used it to make bets on sporting events.

Is This Useful?

Generally speaking, having access to this data can be beneficial as it can be considered pattern-of-life (POL) data. A lot can be gleaned from knowing what a person purchases or sells, and from/to whom they are purchasing or selling. Just think about the entry in my social feed regarding the prescription medication. Based on that entry, I may be able to deduce that person has a particular medical condition if the payment note has enough information in it. I could also look in my public feed and find similar bits of information.

This data can be a gold mine for investigators, and I would argue an investigator’s imagination is the limit for its application. For example, think of the drug dealer who is using Venmo for payments. Want to know who the buyers are or who the source is? Just pull the Venmo data, do some analysis, and start slinging subpoenas/court orders/search warrants for Venmo account information. What about the local fence who operates in your jurisdiction? Want to know who the fence sold some stolen property to? Same scenario.

My point is people do not randomly give money away. They are paying for things they need, want, or are significant to them in some fashion. Understanding this allows an investigator to draw conclusions by taking this data and applying some deductive reasoning.

POL data is extremely useful. There has been a lot of research during the past couple of years that has shown just how much data about our lives is collected by our mobile devices. This data, when interpreted correctly, can be powerful.

The Setup

As usual, just a few notes on the hardware/software used.

Pixel 3:   Android 10 (patch level 9/5/19), rooted, Venmo version 7.40.0 (1539)

iPhone SE:  iOS 13.2, non-jailbroken, Venmo version 7.40.1 (1)

For testing I used a couple of pre-paid cards that I picked up at a Wal-Mart. I had to register the cards first with their respective “banks”, and then load them into Venmo; if you were to use a card issued to you the registration part would not be necessary. Also, you have the option to link an actual bank account to your Venmo account, which is something I did NOT do.

I will add that during my analysis, I did not find where either OS stored the actual card information, even though I could see it within the UI. It may be there, encoded somehow, but I didn’t see it.


Before we go down the forensic road, I want to show what a transaction screen looks like in the Android UI. It will help make things clearer.

Figure 1 shows the payment screen. In the red box is the transaction recipient. While I didn’t do it during testing, you can add more than one recipient here. The green box is obvious…the amount of money I am sending. The blue box shows the note that is attached to the payment. Venmo requires that there be something in this field, even if it is an emoji. The orange box is the share-level of the transaction. It can be set to Public (everyone on Venmo sees it), Friends Only, or Private (just you and the recipient see the transaction). The default is Public, but you can change the default setting from the Settings menu.

Figure 1.png
Figure 1.  Android payment UI.

The purple box dictates what happens to this transaction. If you hit “Pay” the money is sent. If you hit “Request” a request goes to the transaction recipient who can either pay or decline to pay the requested amount. More on those in a bit.

There is one other way to send someone money. An intended transaction recipient can display a QR code on their device, which the sender can scan, fill out the note and amount fields, and then hit “Pay” or “Request.”

Venmo data in Android resides in /data/data/com.venmo. Because of new technical hurdles with Android 10 on the Pixel 3, I had to copy the folder over to the /sdcard/Download folder via shell and then run an ADB pull from there.

Unlike the databases in my previous post about Wickr, Venmo’s are pretty straightforward. There are three files of interest, with the biggie being venmo.sqlite. The database can be found in the ./databases folder. See Figure 2.

Figure 2.png
Figure 2.  Available databases in Android.

There is an additional database of interest here, too, named mparticle.db, which I will discuss shortly.

The venmo.sqlite database contains all of the transaction information along with any comments about transactions and any responses to those comments. Take a look at the screenshot in Figure 3, which shows a completed transaction. The iOS device paid the Android device $5.00 on 10/31/2019 at 9:56 PM (EDT). The note sent with the payment was “For the testing stuff. My first payment.” The Android device responded to the comment (no timestamp is seen).


Figure 3.png
Figure 3. Android screenshot of a completed payment.

Figure 4 shows the table marvin_stories. You will notice transactions are called stories, and that timestamps are stored in Zulu time in human-readable format.

Figure 4.png
Figure 4.  The marvin_stories table.

The columns of interest here are story_data_created, story_note, and story_audience. However, if you want to get all of that information and more in one place, you need to look at the contents of the last column, story_blob. See Figure 5.

Figure 5
Figure 5.  Transaction information from the story_blob column.
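Since the story_blob column holds plain JSON, pulling it apart only takes a few lines of Python. The sketch below mocks up the table and a sample blob first so it runs on its own; the column names come from the figures, but the exact JSON key names (actor, transaction, target, etc.) are assumptions based on what is visible in Figure 5, not Venmo’s actual schema.

```python
import json
import sqlite3

# Stand-in for venmo.sqlite; only the columns discussed here are mocked up.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE marvin_stories (story_id TEXT, story_note TEXT, story_blob TEXT)"
)

# Hypothetical story_blob payload mirroring the actor/transaction/target layout.
blob = {
    "actor": {"username": "Josh-Hickman-19", "display_name": "Josh Hickman"},
    "transaction": {
        "amount": 5.0,
        "note": "For the testing stuff. My first payment.",
        "status": "settled",
    },
    "target": {"username": "ThisIs-DFIR", "display_name": "This DFIR"},
}
con.execute(
    "INSERT INTO marvin_stories VALUES (?, ?, ?)",
    ("2779533752", blob["transaction"]["note"], json.dumps(blob)),
)

# Walk every story and summarize who paid whom, for what, and how much.
for story_id, note, raw in con.execute(
    "SELECT story_id, story_note, story_blob FROM marvin_stories"
):
    data = json.loads(raw)
    print(
        f"{data['actor']['username']} -> {data['target']['username']}: "
        f"${data['transaction']['amount']} ({note})"
    )
```

Against a real extraction you would point sqlite3.connect() at venmo.sqlite instead and adjust the key names to whatever the blobs in your copy actually contain.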

The contents of these cells are broken up into three parts: actor (transaction sender), transaction information, and target (transaction recipient). If you recall from Figure 3, the iOS device sent the Android device $5.00. So the actor, the iOS device, is seen in the green box. There you can see the action (“pay” the Android device), the display name, friend status, the user ID for that account, last name, how many mutual friends we may have, and the actual user name (Josh-Hickman-19).

In the purple box you find the transaction information: the amount of money (5.0 – Venmo drops trailing zeros occasionally), the date the transaction was created and completed, the transaction ID (more on that in a moment), the note that came with the payment, and the fact that it was “settled.” The status of “settled” is the only status I have seen, and I have not been able to change the value in this field.

The orange box has the transaction recipient, or the target, which is the Android device (ThisIs-DFIR). It has the same information as the green box.

When a transaction is completed in Venmo, both parties get an email notification. The transaction ID seen in the purple box in Figure 5 also appears in the email (Figure 6, red box).

Figure 6
Figure 6.  Email with transaction ID.

The column story_likes_blob is simple, and is seen in the red box in Figure 7. If there is a value present in this cell it means someone “liked” the transaction. An example of how that appears in the UI is seen in Figure 8.

Figure 7.png
Figure 7.  story_likes_blob column.


Figure 8.png
Figure 8.  A “liked” transaction.

Values in the column are also stored in JSON. The “like” for the transaction in Figure 8 is displayed in Figure 9. As you can see, much of the information in the transaction JSON data is also present here.

Figure 9.png
Figure 9.  A “like” in the database.

A note about the privacy level of transactions. After a transaction is completed, it goes into a social feed. The feed it goes into is dependent upon what privacy level setting was chosen at the time the transaction was sent; however, the transaction privacy level can be changed by any party that was part of the transaction at any time. If the privacy level is changed, the transaction is removed from the original feed and placed in the new feed.

The reason privacy level is important is that people not part of the transaction may be able to comment on it (because they can see it). Their ability to do so is dependent upon the privacy level. For example, if I set the privacy level of a transaction to “Private,” only the other person (or persons if there is more than one party) can comment on the transaction. If I set the privacy level to “Friends Only,” then my “friends” in Venmo can comment on the transaction along with the parties involved. If I set the privacy level to “Public,” anyone on Venmo can comment on the transaction.

Figure 10 shows a transaction that has a comment. As you can see in the blue box, I set the privacy level of this particular transaction to “Friends Only,” which means any of my Venmo “friends” could add a comment to this transaction if they chose to do so.

Figure 10.png
Figure 10.  A transaction with a comment.

Getting back to the database, the table comments contains information about comments made on transactions. See Figure 11.

Figure 11
Figure 11.  The comments table.

Four columns are important here. The first is obvious: the comment_message column (orange box), which contains all of the comments made. The column next to it, created_time, is the time the comment was left (blue box). The column in the red box, comment_story_owner, is important as it links the comment back to the transaction; the values in this column correspond to the values in the story_id column of the marvin_stories table. Note that a transaction will not have an entry in this table if no one comments on it.

The column seen in the green box, comment_user_blob, contains much of the same JSON data seen in previous figures. It documents who made the comment. If you are in a hurry and don’t want to parse the JSON data, you can clearly see the username value of the user who made the comment in the cell. 🙂
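Tying a comment back to its transaction is just a join on those two columns. Here is a hedged sketch against a mocked-up copy of the two tables; the column names are taken from the figures, but everything else about the schema (and the sample values) is assumed for illustration.

```python
import sqlite3

# Mock up only the columns discussed above; the rest of the schema is assumed.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE marvin_stories (story_id TEXT, story_note TEXT);
CREATE TABLE comments (comment_message TEXT, created_time TEXT,
                       comment_story_owner TEXT);
INSERT INTO marvin_stories VALUES ('abc123', 'For the testing stuff.');
INSERT INTO comments VALUES ('You are welcome.', '2019-11-01T01:58:00Z', 'abc123');
""")

# comment_story_owner in comments corresponds to story_id in marvin_stories.
rows = con.execute("""
    SELECT s.story_note, c.comment_message, c.created_time
    FROM comments AS c
    JOIN marvin_stories AS s ON s.story_id = c.comment_story_owner
""").fetchall()

for note, message, when in rows:
    print(f"[{when}] comment on '{note}': {message}")
```

The same JOIN works unchanged when run directly against venmo.sqlite in your SQLite browser of choice.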

The last table in this database is table_person, which contains data about anyone with whom you have conducted a transaction. The data is simple. See Figure 12.

Figure 12
Figure 12.  table_person.

The next file of interest is also a database: mparticle.db. It resides in the same location as venmo.sqlite. The only data I was able to generate in this database appears in the attributes table. See Figure 13.

Figure 13
Figure 13.  Attributes from the mparticle.db.

The columns attribute_key, attribute_value, and created_time (stored in Unix epoch) are straightforward. One note about the time, though. Users are able to change their display names and phone numbers, and if they do, the created_time values will also change. I changed my first and last name, but left the phone number. The created_time for the phone number in Figure 13 corresponds to when I created the account, but the other values correspond to the time I last changed my first and last name.
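Decoding created_time is a one-liner once you know the unit. The only wrinkle is whether the app stored seconds or milliseconds; the magnitude check below is an assumption to cover both, so verify against a timestamp you know (such as account creation).

```python
from datetime import datetime, timezone

def decode_epoch(value):
    """Convert a Unix epoch value to UTC, treating 13-digit values
    as milliseconds and 10-digit values as seconds."""
    if value > 1e12:  # magnitude suggests milliseconds
        value = value / 1000.0
    return datetime.fromtimestamp(value, tz=timezone.utc)

# Both of these decode to the same instant: 2019-10-31 16:00:00 UTC.
print(decode_epoch(1572537600))
print(decode_epoch(1572537600000))
```

Remember the decoded value is UTC; convert to the device’s local time zone before comparing against what the UI showed.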

The last file is venmo.xml, and it resides in the ./shared_prefs folder. This file contains some basic information along with an additional nugget or two. Figure 14 shows the first part of the file.

Figure 14.png
Figure 14.  venmo.xml, Part 1.

In the green box is the userID for my “top friend.” Seeing how I had only one friend for this particular account, the value relates back to the account on the iOS device. The blue box has the timestamp for the last time I synced my contacts, the yellow box has the email account associated with the Android account, and the red box contains the userID and last name associated with the Android account.

Figure 15 has three items, the full name (display name) and phone number associated with the account (red box), and the display first name (blue box). The names seen here correspond to the values seen in the attributes table in the mparticle.db.

Figure 15.png
Figure 15.  venmo.xml, Part 2.

Figure 16 has two items. The first is seen in the green box, and is the user name on the Android account. The other, the user_balance_string value, corresponds to the amount of money this account has in Venmo, which can be seen in the blue box in Figure 17.

Figure 16.png
Figure 16.  venmo.xml, Part 3.
Figure 17.png
Figure 17.  Venmo balance.

When a transaction is completed in Venmo, the money goes into what I like to think of as a staging area. This staging area holds the funds, and you can either use them in other transactions, or transfer them to a bank account if you have one linked. You can also have the balance sent to a credit card, but the credit card has to be a certain type, and my pre-paid card didn’t meet the criteria.

Paying someone via “Pay” or scanning a QR code is not the only way a user can get paid. A user can request payment. Remember the “Request” button in Figure 1? Well, there is a way to tell if that was the method by which funds were transferred. See Figure 18.

Figure 18.png
Figure 18.  The “charge” action.

Figure 18 has the same data in the same green, purple, and orange boxes as seen in Figure 5. The only difference here is the “action” field at the top of the green box has changed from “pay” to “charge” (red arrow). This indicates the transaction occurred because the Actor (the transaction sender in the green box) requested payment, and the Target (the transaction recipient in the orange box) granted the request. As a side note, you can see that there are more digits in the “amount” field this time (purple box).

I tried requesting payment and declining the request three times to see what was left behind. Interestingly enough, there is no record of those three requests anywhere. Venmo may keep this data server side, but you will not get it from the device.

That’s it for Android. I had to triple check everything because I thought I was missing something, again, because of the amount of work involved in the Wickr post. That being said, there are other capabilities of Venmo I did not test, such as Facebook, Snapchat, and Twitter integration, along with linking my bank account. As I previously mentioned, I was surprised I did not find any references to the bank card used for transactions; it may be there, but I did not find it.


Good news: Venmo data is available in an encrypted iOS backup in iOS 13.2, and I assume the same is true for previous iOS versions. This is great because all an examiner needs to do is create a backup, or use their preferred forensic extraction tool, to get this data. Nothing else was necessary.

iOS contains the same data as the Android version, just in a different format. There were also a few additional bits of data that could be useful in certain situations, and I will review those as well.

Transaction information is stored in the net.kortina.labs.Venmo/Documents folder. Remember when I mentioned the idea of a social feed being important? Well, iOS is why. Unlike Android, iOS stores Venmo transactions in three different files. Which file it stores a transaction in depends on the privacy level set at the time of the transaction. See Figure 19.

Figure 19.png
Figure 19.  Contents of the net.kortina.labs.Venmo/Documents folder.

The files FriendsFeed, PrivateFeed, and PublicFeed should look familiar. These are the three privacy level settings for Venmo transactions. These three files are binary plist files. Figure 20 shows the same transaction as seen in Figure 5, and comes from the PrivateFeed file.

Figure 20.png
Figure 20.  Same transaction as Figure 5.
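Because the feed files are ordinary binary plists, Python’s plistlib will open them. The sketch below writes a stand-in PrivateFeed file first so it runs on its own; the internal layout (a top-level array of story dictionaries with these key names) is an assumption pieced together from the figures, and the real files may nest things differently.

```python
import plistlib

# A hypothetical feed entry mirroring the actor/transaction/target grouping.
sample = [{
    "actor": {"username": "Josh-Hickman-19"},
    "target": {"username": "ThisIs-DFIR"},
    "amount": 5.0,
    "note": "For the testing stuff. My first payment.",
}]

# Write a stand-in binary plist, then read it back the same way you
# would read the real PrivateFeed file from an extraction.
with open("PrivateFeed", "wb") as f:
    plistlib.dump(sample, f, fmt=plistlib.FMT_BINARY)

with open("PrivateFeed", "rb") as f:
    feed = plistlib.load(f)

# The file stores entries bottom-up chronologically, so reverse for oldest-first.
for story in reversed(feed):
    print(f"{story['actor']['username']} -> {story['target']['username']}: "
          f"${story['amount']} ({story['note']})")
```

If plistlib chokes on a real file, a plist viewer (or a tool like plistutil) is a good sanity check before assuming the structure.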

Reading this is a little tricky as the data is not as neatly grouped as it is in the Android database. First, the file stores data from the bottom up, chronologically speaking. Second, this transaction was found in the PrivateFeed file (remember the privacy level setting seen in the orange box in Figure 1). To help see the comparison to Android, I grouped the actor, transaction information, and target data using the same green, purple, and orange boxes. In the red box is the comment the Android account left for this transaction, complete with the timestamp; see Figure 21 for how this looks in the iOS UI.

Figure 21.PNG
Figure 21.  Comment from Figure 20.

One thing you will notice is that the “action” and status fields are missing. The only time these fields appeared in this file was in the most recent transaction that occurred with the privacy level set to Private (at the top of the PrivateFeed file). The rest of the transactions in this feed did not have an action or status associated with them. I may not necessarily be reading this bplist file correctly, so if someone else knows, please let me know so I can update this post.

Figure 22 was the most recent “Private” transaction on the iOS account. Figure 23 shows the data in the bplist file with the usual groupings (green box = actor, purple box = transaction information, orange box = target), and Figure 24 shows that same transaction in the venmo.sqlite database from the Android device. Notice the “action” and “status” fields are present in Figure 23.

Figure 22.png
Figure 22.  Last “private” transaction on the iOS account.
Figure 23.png
Figure 23.  bplist entry for the transaction in Figure 22.
Figure 24.png
Figure 24.  Android entry for the comparison.

Figure 25 shows the “Request” transaction from Figure 18. Again, I used the same colored boxes to group the data. Note the red box at the bottom; it contains the privacy level setting for the transaction.

Figure 25.png
Figure 25.  The “request” transaction from the iOS point of view.

Since we are on transactions I will head over to the “FriendsFeed” for just a moment. This is an additional bit of information that I saw in iOS that was not present in Android. This feed contains a lot of information about your Venmo friends’ transactions. The data is not as verbose as the data about my transactions, but you can see when transactions occur, the transaction information (time, note, and transaction ID – purple box), the actor (green box) and target (orange box). Notice the action and status fields are missing. Because the target and actor are not aware of this blog post, I have redacted pertinent information. See Figure 26.

Figure 26
Figure 26.  A transaction by one of my Venmo friends.

If you recall from Figure 19, there is a database present named Model.sqlite. This database contains information about a user’s Venmo friends. The table ZMANAGEDUSER contains friend information. See Figure 27.

Figure 27.png
Figure 27.  Venmo friends.

These columns are fairly self-explanatory. ZCREATEDAT is the time the user account was created. ZDISPLAYNAME is the display name for the account. ZFAMILYNAME is the user’s last name. ZGIVENNAME is the user’s first name. ZIDENTIFIER is the user’s userID, and ZUSERNAME is the user’s user name.
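One caveat on ZCREATEDAT: Core Data tables like this one commonly store dates as seconds since the Core Data reference date (2001-01-01 00:00:00 UTC) rather than the Unix epoch. Treat that as an assumption here and verify against an account with a known creation date, but if it holds, the conversion is simple:

```python
from datetime import datetime, timedelta, timezone

# Core Data's reference date (an assumption for this table; verify it).
COREDATA_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)

def coredata_to_utc(seconds):
    """Convert a Core Data timestamp (seconds since 2001-01-01 UTC)."""
    return COREDATA_EPOCH + timedelta(seconds=seconds)

print(coredata_to_utc(0))          # the reference date itself
print(coredata_to_utc(594050000))  # a value of this magnitude lands in 2019
```

A quick sanity check: a ZCREATEDAT value interpreted as Unix epoch that lands in the 1980s or 1990s is a strong hint the Core Data reference date is in play.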

The PublicFeed file contains the same data as the Private and Friends versions, but includes data about people with whom the user is not friends.

The next interesting file is net.kortina.labs.Venmo.plist which is found in the ./Library/Preferences folder. The top part of the file is seen in Figure 28.

Figure 28.png
Figure 28.  net.kortina.labs.Venmo.plist, Part 1.

The data in the red square is the iOS equivalent to the Android data seen in Figure 13. The blue square references Venmo friends. I expanded it in Figure 29.

Figure 29.png
Figure 29. plist friends from net.kortina.labs.Venmo.plist.

The data in Figure 29 comes from the user’s contact list on device.

Figure 30 is the rest of the net.kortina.labs.Venmo.plist. The data in the red squares are the display name, the email address associated with the account, and the phone number associated with the account. You can see, based on the data in the blue square, that I changed my default privacy level to “Friends.”

Figure 30.png
Figure 30. net.kortina.labs.Venmo.plist, Part 2.

Figure 31 shows the file found in the same directory as the previous file. The app version and the first time the app was opened can be found in the red and blue squares, respectively.

Figure 31.png
Figure 31.

That’s it for iOS. The data is readily available, and includes data that is not necessarily present in Android. As with Android, I did not find any information on the card attached to the account, and I did not test any of the linking or integration abilities.


Virtual payments are a thing, no doubt about it. Amazon Pay, PayPal, Venmo, Zelle and many others along with the platform-specific pay services like Apple Pay and Google Pay are all trying to help people pay for things. Venmo is ranked 3rd in that group, and stands out from the crowd by mixing payment services with social media capabilities.

Venmo contains POL data, which can be valuable, and the data is versatile. It can be used to extrapolate information about the user which can be useful for investigators.

Ridin’ With Apple CarPlay

I have been picking on Google lately.  In fact, all of my blog posts thus far have focused on Google things.  Earlier this year I wrote a blog about Android Auto, Google’s solution for unifying telematic user interfaces (UIs), and in it I mentioned that I am a daily CarPlay driver.  So, in the interest of being fair, I thought I would pick on Apple for a bit and take a look under the hood of CarPlay, Apple’s foray into automotive telematics.

Worldwide, 62 different auto manufacturers make over 500 models that support CarPlay.  Additionally, 6 after-market radio manufacturers (think Pioneer, Kenwood, Clarion, etc.) support CarPlay.  In comparison, 41 auto manufacturers (again, over 500 models – this is an increase since my earlier post) and 19 after-market radio manufacturers support Android Auto.  CarPlay runs on iPhone 5 and later.  It has been part of iOS since CarPlay’s arrival in iOS 7.1, so there is no additional app to download (unlike Android Auto).  A driver simply plugs the phone into the car (or wirelessly pairs it if the car supports it) and drives off; a wired connection negates the need for a Bluetooth connection.  The toughest thing about CarPlay setup is deciding how to arrange the apps on the home screen.

In roughly 5 years’ time CarPlay support has grown from 3 to 62 different auto manufacturers.  I can remember shopping for my 2009 Honda (in 2012) and not seeing anything mentioned about hands-free options.  Nowadays, support for CarPlay is a feature item in a lot of car sales advertisements.  With more and more states enacting distracted driving legislation, I believe using these hands-free systems will eventually become mandatory.

Before we get started, let’s take a look at CarPlay’s history.

Looking in the Rearview Mirror

The concept of using an iOS device in a car goes back further than most people realize.  In 2010 BMW announced support for iPod Out, which allowed a driver to use their iPod via an infotainment console in select BMW & Mini models.

iPod Out-1
Figure 1.  iPod Out.  The great-grandparent of CarPlay.
iPod Out-2
Figure 2.  iPod Out (Playback).

The iPod connected to the car via the 30-pin to USB cable, and it would project a UI to the screen in the car.  iPod Out was baked in to iOS 4, so the iPhone 3G, 3GS, 4, and the 2nd and 3rd generation iPod Touches all supported it.  While BMW was the only manufacturer to support iPod Out, any auto manufacturer could have supported it; however, it just wasn’t widely advertised or adopted.

In 2012 Siri Eyes Free was announced at WWDC as part of iOS 6.  Siri Eyes Free would allow a user to summon Siri (then a year old in iOS) via buttons on a steering wheel and issue any command that one could normally issue to Siri.  This differed from iPod Out in that there was no need for a wired-connection.  The car and iOS device (probably a phone at this point) utilized Bluetooth to communicate.  The upside to Siri Eyes Free, beyond the obvious safety feature, was that it could work with any in-car system that could utilize the correct version of the Bluetooth Hands-Free Profile (HFP).  No infotainment center/screen was necessary since it did not need to project a UI.  A handful of auto manufacturers signed on, but widespread uptake was still absent.

At the 2013 WWDC, Siri Eyes Free morphed into iOS in the Car, which was part of iOS 7.  iOS in the Car can be thought of as the parent of CarPlay, and closely resembles what we have today.  There were, however, some aesthetic differences, which can be seen below.

Figure 3.  Apple’s Eddy Cue presenting iOS in the Car (Home screen).
Figure 4.  Phone call in iOS in the Car.
Figure 5.  Music playback in iOS in the Car.
Screen Shot 2013-06-10 at 12.59.52 PM
Figure 6.  Getting directions.
Screen Shot 2013-06-10 at 2.09.12 PM
Figure 7.  Navigation in iOS in the Car.

iOS in the Car needed a wired connection to the vehicle, or so was the general thought at the time.  During the iOS 7 beta, switches were found indicating that iOS in the Car could, potentially, operate over a wireless connection, and there was even mention of it possibly leveraging AirPlay (more on that later in this post).  Unfortunately, iOS in the Car was not present when iOS 7 was initially released.

The following spring Apple presented CarPlay, and it was later released in iOS 7.1.  At launch there were three auto manufacturers that supported it:  Ferrari, Mercedes-Benz, and Volvo.  Personally, I cannot afford cars from any of those companies, so I am glad more manufacturers have added support.

CarPlay has changed very little since its release.  iOS 9 brought wireless pairing capabilities to car models that could support it, iOS 10.3 added recently used apps to the upper left part of the screen, and iOS 12 opened up CarPlay to third party navigation applications (e.g. Google Maps and Waze).  Otherwise, CarPlay’s functionality has stayed the same.

With the history lesson now over, there are a couple of things to mention.  First, this research was conducted using my personal phone, an iPhone XS (model A1920) running iOS 12.2 (build 16E227).  So, while I do have data sets, I will not be posting them online as I did with the Android Auto data.  If you are interested in the test data, contact me through the blog site and we’ll talk.

Second, at least one of the files discussed (the cache file in the locationd path) is in a protected area of iPhone, so there are two ways you can get to it:  jailbreaking iPhone or using a “key” with a color intermediate between black and white. The Springboard and audio data should be present in an iTunes backup or in an extraction from your favorite mobile forensic tool.

Let’s have a look around.

Test Drive

I have been using CarPlay for the past two and a half years.  A majority of that time was with an after-market radio from Pioneer (installed in a 2009 Honda), and the last six months have been with a factory-installed display unit in a 2019 Nissan.  One thing I discovered is that there are some slight aesthetic differences in how each auto manufacturer/after-market radio manufacturer visually implements CarPlay, so your visual mileage may vary.  However, the functionality is the same across the board.  CarPlay works just like iPhone.

Figure 8 shows the home screen of CarPlay.

IMG_0769 2
Figure 8.  CarPlay’s home screen.

The home screen looks and operates just like iPhone, which was probably the idea.  Apple did not want users to have a large learning curve when trying to use CarPlay.  Each icon represents an app, and the apps are arranged in rows and columns.  Unlike iPhone, creating folders is not an option, so it is easy to have multiple home screens.  The icons are large enough that not much fine motor skill is necessary to press one, which means you probably won’t be hunting for or pressing the wrong app icon very often.

The button in the orange box is the home button.  It is persistent across the UI, and it works like the iPhone home button:  press it while anywhere and you are taken back to the home screen.  The area in the blue box indicates there are two home screens available, and the area in the red box shows the most recently used apps.

Most of the apps should be familiar to iPhone users, but there is one that is not seen on iPhone:  the Now Playing app.  This thing is not actually an app…it can be thought of more like a shortcut.  Pressing it will bring up whatever app currently has control of the virtual sound interface of CoreAudio (i.e. whatever app is currently playing or last played audio if that app is suspended in iPhone’s background).

Swiping left shows my second home screen (Figure 9).  The area in the red box is the OEM app.  If I were to press it, I would exit the CarPlay UI and return to Nissan Connect (Nissan’s telematic system); however, CarPlay would still be running in the background.  The OEM app icon will change depending on the auto maker.  So, for example, if you were driving a Honda, this icon would be different.

IMG_0771 1.jpg
Figure 9.  The second batch of apps on the second home screen.

A user can arrange the apps any way they choose and there are two ways of doing this, both of which are like iPhone.  The first way is to press and hold an app on the car display unit, and then drag it to its desired location.  The second way is done from the screen seen in Figure 10.

Figure 10.  CarPlay settings screen.

The screen in Figure 10 can be found on iPhone by navigating to Settings > General > CarPlay and selecting the CarPlay unit (or units – you can have multiple)…mine is “NissanConnect.”  Moving apps around is the same here as it is on the display unit (instructions are present midway down the screen).  Apps that have a minus sign badge can be removed from the CarPlay home screen.  When an app is removed it is relegated to the area just below the CarPlay screen; in Figure 10 that area holds the MLB AtBat app, AudioBooks (iBooks), and WhatsApp.  If I wanted to add a relegated app back to the CarPlay home screen, I could do so by pressing its plus sign badge.  Some apps cannot be relegated:  Phone, Messages, Maps, Now Playing, Music, and the OEM app.  Everything else can be.

One thing to note here:  iOS considers the car to be a USB accessory, so CarPlay does have to abide by the USB Restricted Mode setting on iPhone (if enabled), regardless of whether the Allow CarPlay While Locked toggle switch is set to the on position.

The following screenshots show music playback (Figure 11), navigation (Figure 12), and podcast playback (Figure 13).

Figure 11.  Music playback.
Figure 12.  Navigation in CarPlay.
Figure 13.  Podcast playback.

Messages in CarPlay is a stripped-down version of Messages on iPhone.  The app will display a list of conversations (see Figure 14), but it will not display text of the conversations (Apple obviously doesn’t want a driver reading while driving).  Instead, Siri is used for both reading and dictating messages.

Figure 14.  Messages conversation list.

Phone is seen in Figure 15; specifically, the Favorites tab.  The tabs at the top of the screen mirror those seen at the bottom of the Phone app on iPhone (Favorites, Recents, Contacts, Keypad, and Voicemail), and they look just like their iPhone counterparts.

Figure 15.  Phone favorites.
Figure 16.  The keypad in Phone.

If I receive a phone call, I can answer it in two ways:  pressing the green accept button (seen in Figure 17) or pushing the telephone button on my steering wheel.  Answering the call changes the screen to the one seen in Figure 18.  Some of the items in Figure 18 look similar to those seen in iOS in the Car (Figure 4).

Figure 17.  An incoming call.
Figure 18.  An active phone call.

Most apps will appear like those pictured above, although there may be some slight visual/functional differences depending on the app’s purpose, and, again, some further visual differences depending on what car or after-market radio you are using.

Speaking of purpose, CarPlay is designed to do three things:  voice communication, audio playback, and navigation.  These things can be done fairly well through CarPlay, and done safely, which, I believe, is the main purpose.  Obviously, some popular apps, such as Twitter or Facebook, don’t work well in a car, so I don’t expect true social media apps to be in CarPlay any time soon if at all (I could be wrong).

Now that we have had a tour, let’s take a look under the hood and see what artifacts, if any, can be found.

Under the Hood

After snooping around in iOS for a bit, I came to the realization that CarPlay is forensically similar to Android Auto:  it merely projects compatible apps onto the car’s display unit, so the individual apps contain a majority of the user-generated data.  Also like Android Auto, CarPlay does leave behind some artifacts that may be valuable to forensic examiners/investigators, and, just like any other artifacts an examiner may find, these can be used in conjunction with other data sources to get a holistic picture of a device.

One of the first artifacts that I found is the cache.plist file under locationd.  It can be found in the private > var > root > Library > Caches > locationd path.  cache.plist contains the times of last connect and last disconnect.  I did not expect to find connection times in the cache file of the location daemon, so this was a pleasant surprise.  See Figure 19.

Figure 19.  Last connect and last disconnect times.

There are actually three timestamps here, two of which I have identified.  The timestamp in the red box is the last time I connected to my car. It is stored in CF Absolute Time (aka Mac Absolute Time), which is the number of seconds since January 1, 2001 00:00:00 UTC.  The time, 576763615.86389804, converts to April 12, 2019 at 8:06:56 AM (EDT).  I had stopped at my favorite coffee shop on the way to work and when I hopped back in the car, I plugged in my iPhone and CarPlay initialized.  See Figure 20.

Figure 20.  Time of last connect.
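Decoding CF Absolute Time values like this one is simple arithmetic from the 2001-01-01 UTC epoch.  A minimal Python sketch, using the last-connect value from cache.plist:

```python
from datetime import datetime, timedelta, timezone

# CF Absolute Time (Mac Absolute Time) counts seconds from 2001-01-01 00:00:00 UTC
CF_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)

def cf_absolute_to_datetime(seconds: float) -> datetime:
    """Convert a CF Absolute Time value to a timezone-aware UTC datetime."""
    return CF_EPOCH + timedelta(seconds=seconds)

# The last-connect value from cache.plist
print(cf_absolute_to_datetime(576763615.86389804))
# → 2019-04-12 12:06:55.863898+00:00, i.e. 8:06 AM EDT
```
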

The timestamp in the green box, just under the string CarKit NissanConnect, is a bit deceptive.  It is the time I disconnected from my car, and it decodes to April 12, 2019 at 8:26:18 AM (EDT).  Here, I disconnected from my car, walked into work, and badged in at 8:27:14 AM (EDT).  See Figure 21.

Figure 21.  Time of last disconnect.

The time in the middle, 576764725.40157998, is just under a minute before the timestamp in the green box.  Based on my notes, it is the time I stopped playback on a podcast that I was listening to at the time I parked.  I also checked KnowledgeC.db (via DB Browser for SQLite) and found an entry in it for “Cached Locations,” with the GPS coordinates being where I parked in my employer’s parking lot.  Whether the middle timestamp represents the time the last action was taken in CarPlay is a good question and requires more testing.
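For the KnowledgeC.db check, a quick first step is surveying which event streams the database holds before querying any one of them.  The table and column names below (ZOBJECT, ZSTREAMNAME) are the Core Data names generally seen in KnowledgeC.db; treat the sketch as a starting point, not a finished query:

```python
import sqlite3

def list_streams(db_path: str) -> list:
    """Return the distinct event stream names recorded in a KnowledgeC.db copy."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute("SELECT DISTINCT ZSTREAMNAME FROM ZOBJECT ORDER BY 1")
        return [name for (name,) in rows]
    finally:
        con.close()

# e.g. list_streams("KnowledgeC.db") -- locate the stream holding the
# "Cached Locations" entries, then examine that stream's rows.
```
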

The next file of interest is carplay.plist.  It can be found by navigating to the private > var > mobile > Library > Preferences path.  See Figure 22.

Figure 22.  carplay.plist

The area in the red box is of interest.  Here the name of the car that was paired is seen (NissanConnect) along with a GUID.  The fact that the term “pairings” (plural) is there along with a GUID leads me to believe that multiple cars can be paired with the same iPhone, but I wasn’t able to test this as I am the only person I know that has a CarPlay capable car.  Remember the GUID because it is seen again in discussing the next artifact.  For now, see Figure 23.

Figure 23.  Main CarPlay setting page in iOS.

Figure 23 shows the settings page just above the one seen in Figure 10.  I show this merely to show that my car is labeled “NissanConnect.”

The next file is 10310139-130B-44F2-A862-7095C7AAE059-CarDisplayIconState.plist.  It can be found in the private > var > mobile > Library > Springboard path.  The first part of the file name should look familiar…it is the GUID seen in carplay.plist.  This file describes the layout of the home screen (or screens, if you have more than one).  I found other files in the same path with the CarDisplayIconState string in their file names, but with different GUIDs, which causes me to further speculate that multiple CarPlay units can be synced with one iPhone.  See Figure 24.

Figure 24.  CarPlay Display Icon State.
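Since each pairing appears to get its own GUID-named icon-state file, enumerating those files is one quick way to see how many CarPlay units a phone may have been synced with.  A sketch (the extraction mount point is hypothetical; adjust it to wherever your file system dump lives):

```python
import glob
import os

# Hypothetical mount point of the file system extraction
SPRINGBOARD = "extraction/private/var/mobile/Library/Springboard"

def carplay_icon_state_files(root: str) -> dict:
    """Map each pairing GUID to its CarDisplayIconState plist path."""
    pattern = os.path.join(root, "*-CarDisplayIconState.plist")
    return {
        os.path.basename(p).split("-CarDisplayIconState")[0]: p
        for p in glob.glob(pattern)
    }

# e.g. {"10310139-130B-44F2-A862-7095C7AAE059": ".../10310139-...-CarDisplayIconState.plist"}
```
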

The areas in the red and blue boxes represent my home screens.  The top-level item in the red box, Item 0, represents my first home screen, and the sub-item numbers represent the location of each icon on that screen.  See Figure 25 for the translation.

Figure 25.  Home screen # 1 layout.

The area in the blue box in Figure 24 represents my second home screen, and, again, the sub-item numbers represent the location of each icon on the screen.  See Figure 26 for the translation.

Figure 26.  Home screen # 2 layout.
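The item-number-to-position translation is simple arithmetic:  given the column count, each index maps to a (row, column) slot.  A sketch, with example bundle IDs standing in for the actual sub-item values:

```python
def icon_positions(bundle_ids, num_columns=4):
    """Map an ordered icon list to (row, column) positions on a home screen."""
    return {bid: divmod(i, num_columns) for i, bid in enumerate(bundle_ids)}

# Example ordering only -- substitute the sub-item values from the plist
screen = ["com.apple.mobilephone", "com.apple.MobileSMS",
          "com.apple.Maps", "com.apple.Music", "com.apple.podcasts"]
print(icon_positions(screen))
# The fifth icon lands at (1, 0): first slot of the second row
```
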

The entry below the blue box in Figure 24 is labeled “metadata.”  Figure 27 shows it in an expanded format.

Figure 27.  Icon state “metadata.”

The areas in the green and purple boxes indicate that the OEM app icon is displayed and that it is “Nissan” (seen in Figure 26).  The areas in the orange and blue boxes describe the app icon layout (four columns and two rows).  The area in the red box is labeled “hiddenIcons,” and refers to the relegated apps previously seen in Figure 10.  As it turns out, the item numbers also describe their positions.  See Figure 28.

Figure 28.  Hidden icon layout.

Notice that this file does not describe the location of the most recently used apps in CarPlay (the area in the upper left portion of the display screen).  That information is described in a separate plist file found in the same path.  See Figure 29.

Figure 29.  Springboard and most recently used apps.

Just like the app icon layout previously discussed, the item numbers for each most recently used app translate to positions on the display screen.  See Figure 30 for the translation.

Figure 30.  Most recently used apps positions.

The next file is the Celestial plist, which is found in the private > var > mobile > Library > Preferences path.  This file has a bunch of data in it, but there are three values relevant to CarPlay.  See Figure 31.

Figure 31.  Celestial.

The string in the green box represents the app that had last played audio within CarPlay prior to iPhone being disconnected from the car.  The area in the blue box is self-explanatory (I had stopped my podcast when I parked my car).  The item in the red box is interesting.  I had been playing a podcast when I parked the car and had stopped playback.  Before I disconnected my iPhone, I brought the Music app to the foreground, but did not have it play any music, so it never took control of the virtual sound interface in CoreAudio.  By doing this, the string in the red box was generated.  Just to confirm, I tested this scenario a second time, but did not bring the Music app to the foreground; the nowPlayingAppDisplayIDUponCarPlayDisconnect value was not present in the second plist file.  I am sure this key has some operational value, although I am not sure what that value is.  If anyone has any idea, please let me know.
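Checking these values across extractions is easy to script.  Only the nowPlayingAppDisplayIDUponCarPlayDisconnect key name appears above, so the helper below takes key names as input rather than assuming the others; the example path is a placeholder:

```python
import plistlib

def read_plist_keys(path: str, keys) -> dict:
    """Load a plist (binary or XML) and return just the requested top-level keys."""
    with open(path, "rb") as f:
        data = plistlib.load(f)
    return {k: data[k] for k in keys if k in data}

# e.g. read_plist_keys("path/to/celestial/plist",
#                      ["nowPlayingAppDisplayIDUponCarPlayDisconnect"])
```
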

As I mentioned earlier in this post, Siri does a lot of the heavy lifting in CarPlay because Apple doesn’t want you messing with your phone while you’re driving.  So, I decided to look for anything Siri-related, and I did find one thing…although I will say that this is probably not exclusive to CarPlay; I think it may be present regardless of whether the session occurs in CarPlay or not (more testing is needed).  In the path private > var > mobile > Library > Assistant there is a plist file named PreviousConversation (there is no file extension, but the file header indicates it is a bplist).  Let me provide some context.

When I pick up my child from daycare in the afternoons, I will ask Siri to send a message, via CarPlay, to my spouse indicating that my child and I are on the way home, and she usually acknowledges.  The afternoon before I extracted the data from my iPhone (04/11/2019), I had done just that, and, after a delay, my spouse had replied “Ok.”

PreviousConversation contains the last conversation I had with Siri during this session.  When I received the message, I tapped the notification at the top of the CarPlay screen, which triggered Siri.  The session went like so:

Siri:  “[Spouse’s name] said Ok.  Would you like to reply?”

Me:  “No.”

Siri:  “Ok.”

See Figure 32.

Figure 32.  Session with Siri.

The area in the red box is the name of the sender, in this case, my spouse’s (redacted) name.  The orange box was spoken by Siri, and the blue box is the actual iMessage I received from my spouse.  The purple box is what was read to me, minus the actual iMessage.  Siri’s inquiry (about my desire to reply) is seen in Figure 33.

Figure 33.  Would you like to reply?

Figure 34 contains the values for the message sender (my spouse).  Inside the red box, the “data” field contains the iMessage identifier…in this case, my spouse’s phone number.  The “displayText” field is my spouse’s name (presumably pulled from my Contacts list).  Figure 35 has the message recipient information:  me.

Figure 34.  Message sender.
Figure 35.  Message recipient (me) plus timestamp.

Figure 35 also has the timestamp of when the message was received (orange box), along with my spouse’s chat identifier (blue box).

Figure 36.  Siri’s response.

Figure 36 shows Siri’s last response to me before the session ended.

One more note:  this plist file had other interesting data in it.  I noticed that each possible response to the inquiry “Would you like to reply?” had an entry here:  “Call” (the message sender), “Yes” (I’d like to reply), and “No” (I would not like to reply).  It might make a good research project for someone.  🙂
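Because the internal layout of PreviousConversation isn’t documented, a recursive search for the field names visible in Figures 34 and 35 (“data”, “displayText”) is safer than hard-coding a path into the structure.  This is illustrative only; the nesting on any given device may differ:

```python
import plistlib

def find_fields(node, wanted=("data", "displayText")):
    """Recursively collect (key, value) pairs for the fields of interest."""
    hits = []
    if isinstance(node, dict):
        for key, value in node.items():
            if key in wanted and isinstance(value, str):
                hits.append((key, value))
            hits.extend(find_fields(value, wanted))
    elif isinstance(node, list):
        for item in node:
            hits.extend(find_fields(item, wanted))
    return hits

# Usage (plistlib reads the bplist despite the missing extension):
# with open("PreviousConversation", "rb") as f:
#     print(find_fields(plistlib.load(f)))
```
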

The next artifact actually comes from a file previously discussed.  While examining it I found something interesting that bears mentioning in this post.  My iPhone has never been paired via Bluetooth with my 2019 Nissan.  When I purchased the car, I immediately started using CarPlay, so there has been no need to use Bluetooth (other than testing Android Auto).  Under the endpointTypeInfo key I found the area seen in Figure 37.

Figure 37.  What is this doing here?

The keys in the red box contain the Bluetooth MAC address for my car.  I double-checked my Bluetooth settings on the phone and the car, and the car Bluetooth radio was turned off, but the phone’s radio was on (due to my AppleWatch).  So, how does my iPhone have the Bluetooth MAC address for my car?  I do have a theory, so stay with me for just a second.  See Figure 38.

Figure 38.  AirPlay indicator.

Figure 38 shows the home screen of my iPhone while CarPlay is running.  Notice that the AirPlay/Bluetooth indicator is enabled (red box).  Thanks to some great reverse engineering work, it is known that any device using the AirPlay service identifies itself by its MAC address (deviceid).  Now, see Figure 39.

Figure 39. Virtual Audio Interfaces for AirPlay and CarPlay.

Figure 39 shows two files, both of which are in the Library > Audio > Plugins > HAL path.  The file on the left is the info.plist file for the Halogen driver (the virtual audio interface) for AirPlay, and the file on the right is the info.plist file for the Halogen driver for CarPlay.  The plug-in identifiers for each (both starting with EEA5773D) are the same.  My theory is that CarPlay may be utilizing AirPlay protocols in order to function, at least for audio.  I know this is a stretch, as those of us who use AirPlay know that it is typically done over a wireless connection, but I think there is a small argument to be made here.  Obviously, this requires more research and testing, and it is beyond the scope of this post.


CarPlay is Apple’s attempt at (safely) getting into your car.  It provides a singular screen experience between iPhone and the car, and it encourages safe driving.  While a majority of the user-generated artifacts are kept by the individual apps that are used, there are artifacts specific to CarPlay left behind.  The app icon layout, times of last connect and disconnect, and last used app can all be found in these artifacts.  There are also some ancillary artifacts that may be useful to examiners/investigators.

It has been a long time since I really dug around in iOS, and I saw a lot of interesting things that I think would be great to research, so I may be picking on Apple again in the near future.