
Follow-on to DFIR Summit Talk: Lucky (iOS) 13: Time To Press Your Bets (via @bizzybarney)

Facial Recognition in Photos

One facet of my DFIR Summit talk I want to expand upon is a look into the Photos application, and a few of the derivative pieces of that research.  While trying to focus on the topic of facial recognition, it seemed prudent to include a brief progression from snapping a photo through to a person's name being placed beside their face in the Photos application.

When you use the native Camera app and snap a photo, depending on user options, at least a few standard things occur.  The newly taken photo is written to /private/var/mobile/Media/DCIM/1**APPLE/IMG_0001.HEIC / .JPG.  As the photo is taken, the Photos.sqlite database is updated to reflect much of the metadata about the photo, which will be covered a bit later.  Additionally, the “PreviewWellImage.tiff” is created; it is the preview of the most recent image you see when you open your Photos application - in this instance, the photo just taken by the camera.

The user’s photos begin in the ../100APPLE/ directory, but this directory iterates upward (101APPLE, 102APPLE, etc.) as more and more photos and videos are saved.  If iCloud syncing is turned on by the user, then several other behaviors occur - but that is for another time.

Let’s focus on the analysis and intelligence built into the Photos application.  I’m able to type text strings and my photos are immediately searched for matching objects within them.  There is a section of my Photos dedicated to “People” where a name has been associated with a face that Apple has analyzed.  

Some of the analysis occurring with Photos happens in the mediaanalysis.db file.  This database analyzes and scores the media files, producing results that seem to feed into other pieces of the analysis.  Some scoring results to highlight are the ones that focus on humans, faces, and pets.

Path: /private/var/mobile/Media/MediaAnalysis/mediaanalysis.db

The ‘Results’ table of the mediaanalysis.db file contains ‘assetId’, which represents a media file; ‘resultsType’, which is the specific analytical type and score value found in that media file; and a BLOB (binary large object), which is a binary plist (bplist).  In the image below, ‘assetId’ 3 has numerous ‘resultsType’ values associated with it; the BLOB for ‘resultsType’ 1 is selected, and on the right you can see the bplist.

That bplist can be printed to a text file by saving the BLOB as a file and then using ‘plutil’ to print it.  As you can see below beside the red star, the printed text is a clean presentation of that bplist, and it tells us that ‘resultsType’ 1 is associated with faces based on the scoring.

I repeated that process for the remaining pieces of data and wrote a brief description of each results type, although my device still had a few types that did not have any results.
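
If you would rather script that export than save each BLOB by hand, here is a minimal Python sketch of the same idea; it assumes you are working against a copy of mediaanalysis.db pulled from a file system extraction (the path is yours to adjust) and simply parses each results BLOB with plistlib:

import plistlib
import sqlite3

# Work on a copy of /private/var/mobile/Media/MediaAnalysis/mediaanalysis.db
conn = sqlite3.connect("mediaanalysis.db")

# Each row in 'results' ties an assetId to a resultsType and a binary plist BLOB
rows = conn.execute("SELECT assetId, resultsType, results FROM results WHERE results IS NOT NULL")

for asset_id, results_type, blob in rows:
    try:
        parsed = plistlib.loads(blob)  # plistlib handles bplist00 data directly
    except Exception:
        continue  # skip any BLOB that is not a parseable plist
    print(f"assetId={asset_id} resultsType={results_type}")
    print(parsed)

conn.close()

The printed dictionaries are the same content plutil shows, just without the intermediate files.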

After running a SQL query against this database, you can sort the results to see just the files that have the results types for ‘humanBounds’ and ‘humanConfidence’.  The ‘localIdentifier’ column from the ‘assets’ table is a UUID which matches up to the ZGENERICASSET table of Photos.sqlite.

Here is the query for the mediaanalysis.db file.  It’s big and ugly, but please test it out if you’re interested.  This piece just seems to build into what we will see later in Photos.sqlite, where it all comes together.  (The +978307200 in the date conversions shifts Apple’s Mac absolute time, counted in seconds from January 1, 2001, to the Unix epoch.)

select
	a.id,
	a.localIdentifier as "Local Identifier",
	a.analysisTypes as "Analysis Types",
	datetime(a.dateModified+978307200, 'unixepoch') as "Date Modified (UTC)",
	datetime(a.dateAnalyzed+978307200, 'unixepoch') as "Date Analyzed (UTC)",
CASE
	when results.resultsType = 1 then "Face Bounds / Position / Quality"
	when results.resultsType = 2 then "Shot Type"
	when results.resultsType = 3 then "Duration / Quality / Start"
	when results.resultsType = 4 then "Duration / Quality / Start"
	when results.resultsType = 5 then "Duration / Quality / Start"
	when results.resultsType = 6 then "Duration / Flags / Start"
	when results.resultsType = 7 then "Duration / Flags / Start"
	when results.resultsType = 15 then "Duration / Quality / Start"
	when results.resultsType = 19 then "Duration / Quality / Start"
	when results.resultsType = 22 then "Duration / Quality / Start"
	when results.resultsType = 23 then "Duration / Quality / Start"
	when results.resultsType = 24 then "Duration / Quality / Start"
	when results.resultsType = 25 then "Duration / Quality / Start"
	when results.resultsType = 27 then "Duration / Quality / Start"
	when results.resultsType = 36 then "Duration / Quality / Start"
	when results.resultsType = 37 then "Duration / Quality / Start"
	when results.resultsType = 38 then "Duration / Quality / Start"
	when results.resultsType = 39 then "Duration / Quality / Start"
	when results.resultsType = 48 then "Duration / Quality / Start"
	when results.resultsType = 8 then "UNK"
	when results.resultsType = 11 then "UNK"
	when results.resultsType = 13 then "UNK"
	when results.resultsType = 21 then "UNK"
	when results.resultsType = 26 then "UNK"
	when results.resultsType = 31 then "UNK"
	when results.resultsType = 42 then "UNK"
	when results.resultsType = 45 then "UNK"
	when results.resultsType = 49 then "UNK"
	when results.resultsType = 9 then "Attributes - junk"
	when results.resultsType = 10 then 'Attributes - sharpness'
	when results.resultsType = 12 then "Attributes - featureVector"
	when results.resultsType = 14 then "Attributes - Data"
	when results.resultsType = 16 then "Attributes - orientation"
	when results.resultsType = 17 then 'Quality'
	when results.resultsType = 18 then "Attributes - objectBounds"
	when results.resultsType = 20 then "Saliency Bounds and Confidence"
	when results.resultsType = 28 then "Attributes - faceId / facePrint"
	when results.resultsType = 29 then "Attributes - petsBounds and Confidence"
	when results.resultsType = 30 then "Various Scoring Values"
	when results.resultsType = 32 then "Attributes - bestPlaybackCrop"
	when results.resultsType = 33 then "Attributes - keyFrameScore / keyFrameTime"
	when results.resultsType = 34 then "Attributes - underExpose"
	when results.resultsType = 35 then "Attributes - longExposureSuggestionState / loopSuggestionState"
	when results.resultsType = 40 then "Attributes - petBounds and Confidence"
	when results.resultsType = 41 then "Attributes - humanBounds and Confidence"
	when results.resultsType = 43 then "Attributes - absoluteScore/ humanScore/ relativeScore"
	when results.resultsType = 44 then "Attributes - energyValues/ peakValues"
	when results.resultsType = 46 then "Attributes - sceneprint/ EspressoModelImagePrint"
	when results.resultsType = 47 then "Attributes - flashFired, sharpness, stillTime, texture"
end as "Results Type",
	hex(results.results) as "Results BLOB"
from assets a
left join results on results.assetId=a.id

Before diving into the Photos.sqlite file, I first want to point out the file recording the text strings that are an apparent result of Apple’s object analysis of the photos.  This file stores the text results that power our ability to search text strings in the Photos application and return results.  The text strings are not necessarily a result of any specific user activity, but instead an output from analysis automatically being run by Apple against the user’s media files.

Path: /private/var/mobile/Media/PhotoData/Caches/search/psi.sqlite

The ‘word_embedding’ table within psi.sqlite contains columns ‘word’ and ‘extended_word’, which are just strings stored in BLOBs.  Using DB Browser for SQLite you can export the table to a CSV, and it prints the strings from the BLOBs pretty cleanly.  Separately, there is also a table named ‘collections’ with ‘title’ and ‘subtitle’ columns that appear to be a history of the Memories and Categories that have been used, or ones that will be used.
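
If you want that same output without the CSV round trip, a small Python sketch like the one below should work; it assumes the BLOBs hold plain UTF-8 text (which they appear to on my data) and that you are pointing it at a copy of psi.sqlite:

import sqlite3

# Copy of /private/var/mobile/Media/PhotoData/Caches/search/psi.sqlite
conn = sqlite3.connect("psi.sqlite")

for word, extended in conn.execute("SELECT word, extended_word FROM word_embedding"):
    # The BLOBs appear to be plain text; decode defensively in case of stray bytes
    word_text = word.decode("utf-8", errors="replace") if isinstance(word, bytes) else word
    extended_text = extended.decode("utf-8", errors="replace") if isinstance(extended, bytes) else extended
    print(word_text, "|", extended_text)

conn.close()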

The last table in psi.sqlite to mention for this piece is the ‘groups’ table.  Within the groups table, the ‘content_string’ column contains some really interesting data.  I initially set out to find just the words “Green Bay” as it was something populated in my text search for the letter “g”.  What I found was far more interesting.  I did find “Green Bay”, but I additionally found, in one of the other BLOBs, “Green Bay Packers vs. Miami Dolphins”.  That BLOB has a little extra flavor added by Apple for me.  Whether they simply used the geo coordinates baked into my photos from being at Lambeau Field, or analyzed the content of the photos and found the various Miami Dolphins jerseys - I’m not sure.  But it is a very interesting artifact that is absolutely accurate, dropped in there for me.  Thanks Apple!

Now let’s tackle Photos.sqlite, but only as it pertains to facial recognition and associating photos of people with an actual name.  Quite honestly, this single file would be nearly a full-time job if someone wanted to parse every inch of it and maintain that support.

Path: /private/var/mobile/Media/PhotoData/Photos.sqlite

My instance of Photos.sqlite is a beast, weighing in at over 300MB and containing 67 tables packed full of data about my Photos.  We are going to focus on two tables - ZDETECTEDFACE and ZPERSON.  

ZDETECTEDFACE 

This table contains values that indicate features about faces, including an estimate of age, hair color, baldness, gender, eyeglasses, and facial hair.  Additionally, there are indicators for whether the left or right eye was closed, and X and Y axis measurements for the left eye, right eye, mouth, and center.  The data in this table is extremely granular, and it was quite fun to work through.  Who doesn’t like looking at old photos?

ZPERSON

This table contains a count of the number of times Apple has been able to identify a certain face in the media files.  On my device, I am recognized by name in hundreds of photos, but there are also photos of me where it hasn’t associated my name with my face.  For each face identified, a UUID (universally unique identifier) is assigned.  So although the analytics piece may not be able to connect a face with a name, it can group all identified instances of an unknown face as being the same person.

If there is an association made between the person’s face and a saved contact, the ZCONTACTMATCHINGDICTIONARY column’s BLOB data can possibly reveal a full name and the phone number.  This again can be achieved by printing the bplist to a .txt file.

select
	zga.z_pk,
	zga.ZDIRECTORY as "Directory",
	zga.ZFILENAME as "File Name",
CASE
	when zga.ZFACEAREAPOINTS > 0 then "Yes"
	else "N/A"
	end as "Face Detected in Photo",
CASE 
	when zdf.ZAGETYPE = 1 then "Baby / Toddler"
	when zdf.ZAGETYPE = 2 then "Baby / Toddler"
	when zdf.ZAGETYPE = 3 then "Child / Young Adult"
	when zdf.ZAGETYPE = 4 then "Young Adult / Adult"
	when zdf.ZAGETYPE = 5 then "Adult"
end as "Age Type Estimate",
case
	when zdf.ZGENDERTYPE = 1 then "Male"
	when zdf.ZGENDERTYPE = 2 then "Female"
	else "UNK"
end as "Gender",
	zp.ZDISPLAYNAME as "Display Name", 
	zp.ZFULLNAME as "Full Name",
	zp.ZFACECOUNT as "Face Count",
CASE	
	when zdf.ZGLASSESTYPE = 3 then "None"
	when zdf.ZGLASSESTYPE = 2 then "Sun"
	when zdf.ZGLASSESTYPE = 1 then "Eye"
	else "UNK"
end as "Glasses Type",
CASE
	when zdf.ZFACIALHAIRTYPE = 1 then "None"
	when zdf.ZFACIALHAIRTYPE = 2 then "Beard / Mustache"
	when zdf.ZFACIALHAIRTYPE = 3 then "Goatee"
	when zdf.ZFACIALHAIRTYPE = 5 then "Stubble"
	else "UNK"
end as "Facial Hair Type",
CASE	
	when zdf.ZBALDTYPE = 2 then "Bald"
	when zdf.ZBALDTYPE = 3 then "Not Bald"
end as "Baldness",
CASE
	when zga.zlatitude = -180
	then 'N/A'
	else zga.ZLATITUDE
end as "Latitude",
CASE
	when zga.ZLONGITUDE = -180 
	then 'N/A' 
	else zga.ZLONGITUDE
end as "Longitude",
	datetime(zga.zaddeddate+978307200, 'unixepoch') as "Date Added (UTC)",
	ZMOMENT.ztitle as "Location Title"
from zgenericasset zga
left join zmoment on zmoment.Z_PK=zga.ZMOMENT
left join ZDETECTEDFACE zdf on zdf.ZASSET=zga.Z_PK
left join ZPERSON zp on zp.Z_PK=zdf.ZPERSON
where zga.ZFACEAREAPOINTS > 0

Below is a sample of the output of this analysis, paired with the photo the metadata came from.  You can see it is able to identify me and my two daughters by name, and accurately assess our genders, my sunglasses and facial hair.  

Please test, verify, and give me a shout on Twitter @bizzybarney with any questions or concerns.  

Socially Distant but Still Interacting! New and Improved Updates to macOS/iOS CoreDuet interactionC.db APOLLO Modules

The interactionC.db database certainly does not get as much attention as its CoreDuet partner in crime, knowledgeC.db. However, I think it has quite a bit of investigative potential. I’ve written about it before in a prior blog; however, I’d like to give it more attention here.

I spent this weekend updating the APOLLO modules to have more contextual support and better backwards compatibility with older iOS versions. This database was also introduced to the macOS side with 10.15. 

I’ve added a new query for this database for the ZKEYWORDS table. This table appears to capture keywords that are contained in various Calendar (com.apple.mobilecal) events. It seems to select only certain events, as not all of my calendar events have keywords in this table.

The main interactionC.db query has many new updates, including attachments and sender/recipient correlation. In general, this database keeps track of “recent” contact interactions. As an example, I used this query on the iOS database that I copied off of my iPhone a few days ago and it shows ~28,000 entries going all the way back to January of this year, 6+ months!

In the screenshot below is a Messages (com.apple.MobileSMS) conversation between Heather Mahalik and me. Items that are blurred are our phone numbers. The GUIDs are our Contact Person IDs; these can be correlated with information in the iOS Address Book database. Why some recipient information is blank, I’m not sure. I scrolled way back in our conversation history and the timestamps are spot on. The content of this conversation could (and should) be correlated with the Messages database (sms.db).

This data is not just for Messages and may include other application bundle IDs. Some that I’ve seen in my data include:

  • com.apple.InCallService – Phone Calls

  • com.apple.MobileSMS - Messages

  • com.apple.Preferences - Settings

  • com.apple.ScreenshotServicesService - Screenshots

  • com.apple.mobilecal - Calendar

  • com.apple.mobilemail - Mail

  • com.apple.mobilesafari - Safari

  • com.apple.mobileslideshow - Photos

Contact Interactions & Attachments

For another example, let’s take a look at some interactions for the Photos app (com.apple.mobileslideshow). Not every interaction will have attachments associated with it. The screenshot below contains some AirDrop activity from this device. Some images were AirDropped from the Photos app itself, while one image was AirDropped from within the Messages application, as shown in the Target Bundle ID column. It also shows the contact information for the person it was sent to - helpful!

Some of these attachments have a type associated with them in the UTI column (HEIC/PNG); the other has an associated attachment ID, shown in hex (0x DCCB49C2FC74461EAD90DAB0C537DBF7). This is a hex representation of the UUID for the image. It seems not every image will have this attachment ID.

This image UUID can be searched for in the Photos database, as shown below, to determine exactly which image was AirDropped.
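
If you want to script that lookup, here is a minimal Python sketch; it assumes the Photos.sqlite ZGENERICASSET table stores this value, dashed and uppercased, in its ZUUID column (that held true on my data, but verify on yours):

import sqlite3
import uuid

# Attachment ID from interactionC.db ZATTACHMENTS, as shown in hex
attachment_hex = "DCCB49C2FC74461EAD90DAB0C537DBF7"

# Re-insert the dashes to get the canonical UUID string format
asset_uuid = str(uuid.UUID(hex=attachment_hex)).upper()
print(asset_uuid)  # DCCB49C2-FC74-461E-AD90-DAB0C537DBF7

# Copy of /private/var/mobile/Media/PhotoData/Photos.sqlite
conn = sqlite3.connect("Photos.sqlite")
for directory, filename in conn.execute(
        "SELECT ZDIRECTORY, ZFILENAME FROM ZGENERICASSET WHERE ZUUID = ?", (asset_uuid,)):
    print(directory, filename)
conn.close()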

This might be another good artifact to look for in addition to some of the unified logs I’ve highlighted previously in AirDropping some Knowledge. This is what Elwood’s Phone looks like after lots of AirDropping of various items for that blog. 

Disassociated but not Forgotten Attachments

It seems some attachments are no longer associated with entries in the ZINTERACTIONS table but are not necessarily removed from the ZATTACHMENTS table. This has potential forensic use as well. APOLLO will not extract these, as there don’t appear to be timestamps associated with them; however, I HIGHLY recommend at least taking a brief look at this table when doing an investigation.

The example below shows that many attachment files of different types were associated with a contact interaction.

  • PDF

  • Images (PNG, HEIC, JPG)

  • Contact VCard

  • Text

  • Archives

  • Movies

  • Documents

Unfortunately, we lose the contact context here. However, we can still find useful tidbits of information like text, file types, URLs, image UUIDs, etc.

Other examples include text that was copy/pasted, or URLs like a Google search link or a YouTube video that was sent.

A Note about Accounts & Recipient Counts

One more example of Mail (com.apple.mobilemail) shows the ACCOUNT column. This GUID can be used to tie interactions to a specific email account in the Accounts databases (Accounts3.sqlite/Accounts4.sqlite).

The Recipient Count columns can be a bit misleading. The column I’ve labeled ‘Recipient Count’ is the number of recipients on an interaction. This example shows 2; however, that does not include myself. This is an example of an email thread between Heather, Lee, Phil, and myself. I would have thought there would be at least a 3 in this column, however that doesn’t appear to be the case. A good example of why not to make assumptions!

The incoming/outgoing sender/recipient count columns are even more misleading – I haven’t quite figured those out, but I believe they might be the total number of interactions versus the number of “recipients” on those interactions.

InteractionC.db can really be a useful database, especially when it is used along with all the other data that APOLLO extracts – context can be everything!

Extensive knowledgeC APOLLO Updates!

While helping some investigators out, I realized that some of my APOLLO knowledgeC modules needed a bit of updating. Naturally I thought it would be quick, but it turned into quite an extensive update. I’ve included lots of brand-new modules as well as updates to ones that I’ve had before.

Most of the updates to the older ones provided better backwards compatibility with older versions of macOS and iOS as well as adding additional contextual items to some of the queries from ZSTRUCTUREDMETADATA. Regression testing was performed on iOS 11, 12, and 13 and macOS 10.13, 10.14, and 10.15. Of course, please let me know if you run into a knowledgeC “stream” that I’ve not created a module for, or any issues that you might come across. 

I’ve highlighted a few modules below using my iOS 13.5 device. However, they may also apply to macOS and older iOS versions – review the modules for more documentation.

New Modules:

  • knowledge_activity_level_feedback.txt

  • knowledge_airplay_prediction.txt

  • knowledge_calendar_event_title.txt

  • knowledge_charging_smart_topoff_checkpoint.txt

  • knowledge_dasd_battery_temperature.txt

  • knowledge_device_locked_imputed.txt

  • knowledge_discoverability_usage.txt

  • knowledge_event_tombstone.txt

  • knowledge_inferred_microlocation_visit.txt

  • knowledge_knowledge_sync_addition_window.txt

  • knowledge_photos_edit_all.txt

  • knowledge_photos_deletes_all.txt

  • knowledge_photos_deletes_recent.txt

  • knowledge_photos_engagement.txt

  • knowledge_photos_share_airdrop.txt

  • knowledge_photos_share_all.txt

  • knowledge_photos_share_extension.txt

  • knowledge_segment_monitor.txt

  • knowledge_siri_activites.txt

  • knowledge_siri_flow_activity.txt

  • knowledge_sync_addition_window.txt

  • knowledge_sync_deletion_bookmark.txt

  • knowledge_user_first_backlight_after_wakeup.txt

The knowledge_app_activity_passbook.txt module was added to conveniently look for Apple Wallet (com.apple.Passbook) activity. Shown below I’m switching between my Apple Cash card and my Apple Card (yes, I got one for “research”).

The knowledge_photos_deletes_all.txt module appears to keep track of when I deleted a photo from the Photos app. This output is fairly vague. However, it could be useful in evidence destruction cases. The output of this one is similar to the other knowledge_photos_* modules.

Want to know if a thing was AirDrop’ed, copied, searched for, or otherwise interacted with from the iOS ShareSheet? The knowledge_sharesheet_feedback.txt module will help with that! Shown below, this module is keeping track of:

  • Photo Markups (com.apple.MarkupUI.Markup.MarkupPhotoExtension) via Camera App (com.apple.camera)

  • File Copies (com.apple.UIKit.activity.CopyToPasteboard) in Photos (com.apple.mobileslideshow)

  • Sending a photo in Messages (com.apple.MobileSMS) via Photos app (com.apple.mobileslideshow)

  • Finding text in a webpage (com.apple.mobilesafari.activity.findOnPage) in Safari (com.apple.mobilesafari)

  • Airdrop Activity (com.apple.UIKit.activity.AirDrop) 

Some modules are fairly self-explanatory. The knowledge_system_airplane_mode.txt module keeps track of whether Airplane Mode on the device is enabled or not.

The next two are associated with the iOS low power mode functionality. The first, knowledge_device_battery_saver.txt, shows that I activated Low Power Mode via the Control Center, while knowledge_device_low_power_mode.txt shows that it was turned on about two seconds later.


Updated Modules:

  • knowledge_activity_level.txt

  • knowledge_app_activity.txt

  • knowledge_app_activity_calendar.txt

  • knowledge_app_activity_clock.txt

  • knowledge_app_activity_mail.txt

  • knowledge_app_activity_maps.txt

  • knowledge_app_activity_notes.txt

  • knowledge_app_activity_photos.txt

  • knowledge_app_activity_safari.txt

  • knowledge_app_activity_weather.txt

  • knowledge_app_install.txt

  • knowledge_app_intents.txt

  • knowledge_app_location_activity.txt

  • knowledge_audio_bluetooth_connected.txt

  • knowledge_audio_output_route.txt

  • knowledge_device_batterylevel.txt

  • knowledge_device_inferred_motion.txt

  • knowledge_device_is_backlit.txt

  • knowledge_device_locked.txt

  • knowledge_device_pluggedin.txt

  • knowledge_discoverability_signals.txt

  • knowledge_notification_usage.txt

  • knowledge_paired_device_nearby.txt

  • knowledge_portrait_entity.txt

  • knowledge_portrait_topic.txt

  • knowledge_app_relevantshortcuts.txt

  • knowledge_safari_browsing.txt

  • knowledge_settings_doNotDisturb.txt

  • knowledge_siri.txt

  • knowledge_standby_timer.txt

  • knowledge_widgets_viewed.txt

The module knowledge_app_inFocus.txt has added extension context. The extensions below show a location sign-in alert (com.apple.AuthKitUI.AKLocationSignInAlert) via the Springboard (com.apple.springboard), access to the Camera (com.apple.camera) via Messages (com.apple.MobileSMS), and access to Photos (com.apple.mobileslideshow) via Messages. All the while, I was playing around with the Unc0ver Jailbreak (science.xnu.undecimus).

New with knowledge_app_webusage.txt are the “Digital Health” columns. These will show website visits and associated URLs on various apps (not just Safari or Chrome!). 

In this example I was using Twitter (via Safari) on a device whose macOS hardware UUID (or iOS UDID) appears in the Device ID column - let’s say my laptop. On my iPhone, I was also on Twitter, but this time using the iOS application (com.atebits.Tweetie2), ordering a new t-shirt from Jailbreak Brewery.

Additions to knowledge_audio_media_nowplaying.txt include:

  • Is AirPlay Video

  • Playing – values likely for Stopped, Playing, and Paused – I will test and document these in a future update.

  • Duration

  • Elapsed

  • Identifier

  • Media Type – Audio, Music, Video, Podcast

  • Output Device IDs (Binary plist in hex)

This is only a small slice of knowledgeC examples (and a very small part of APOLLO) so I hope this gives you some incentive to give it a try!

Guest Post by @bizzybarney! A Peek Inside the PPSQLDatabase.db Personalization Portrait Database

The DFIR Twitter-sphere exploded this morning when @mattiaep mentioned /private/var/mobile/Library/PersonalizationPortrait/PPSQLDatabase.db. I’ve been doing some research work on this file and plan to present pieces of it during my talk at the upcoming SANS DFIR Summit. I reached out to @iamevltwin and asked if she would host a quick blog post and she graciously agreed - but I now owe her gin, and steak and cheese egg rolls. In all seriousness, a huge thank you to my good friend and mentor!

Check out the upcoming Summit agenda here.

The PersonalizationPortrait directory is native on iOS 13 and contains a few interesting files. One specifically, PPSQLDatabase.db, is loaded with data. From my research, the directory existed on iOS 12 but this database did not.  Some of the data is repetitive from other native locations, but there’s a lot of context that can be correlated from this database and pieces of this data might exist longer here than they do in other places. So what is the purpose of this database? Don’t forget as you poke at it, that it has ‘Personalization’ in its title. It resides in the native /Library directory, so it’s Apple. What are you up to Apple? 

My research on this is still ongoing, but one thing I know is Apple wants me to like my device and to feel comfortable while I’m constantly immersed in their operating system.  When I open News I want to see certain things of interest.  When I select a photo to send to a friend, Apple gives me a short list of probable recipients.  When I plug my phone into CarPlay, I appreciate when Apple weirdly predicts where I’m headed.  Is this ‘Personalization’ just for Apple, or are pieces of this filtering out to other parties for better advertisement tailoring?  The Widgets screen to the left of my home screen on my iPhone is a dynamic representation of so many pieces of personal data: what’s next on my Calendar and where it’s occurring, which apps I recently used, News, Weather based on my location, where I last parked my car when using CarPlay, and Screen Time totals I can hardly believe.

I want to highlight a few pieces of the PPSQLDatabase.db file to show its potential, and pass along an appropriate amount of caution as well.  This database is aggregating data from many sources, and attribution must be done carefully.

The first table to peek into is loc_records.  It is absolutely loaded with location data - mine had 1000 entries.  The 1000 is curious, and I wonder if anyone else’s table is limited at 1000 or if mine just landed on that nice even number.  @iamevltwin did a quick check and found 1000 as well on Mac and iOS, so it seems that is the limit for the table.  In reviewing just the locations from this table, without joining any other table data, I quickly recognize many of them as places I have definitely been with this device in my pocket.  But…then there are those locations I definitely haven’t been, ever.  So while having 1000 location records to play with sounds like good fun, don’t forget to be diligent with attribution.

Most of these location records have a column named “clp_location”, which is stored as BLOB data.  On closer inspection, the BLOB data is a binary plist (.bplist).

Using plutil -p produces a friendlier output format to inspect, reminiscent of Cloud-V2.sqlite and other “Significant Location” files.  But that doesn’t mean my device was at that place at that time, because I wasn’t in Las Vegas on June 2nd, 2020, which is what that timestamp converts to.  I checked my photos based on the ‘Places’ feature, and I did take a photo at that location - but it was June 4th, 2019 at 5:35PM.  Weird.

So why is a photo from a year ago popping up?  Making a few joins and gathering a bit more data produces more context for this specific item.  After hammering out a query to join it over to the ‘sources’ table, I am able to see that this entry is associated with ‘com.apple.mobileslideshow’ and, more specifically, ‘com.apple.proactive.PersonalizationPortrait.PhotosGraphDonation’.  One word really jumps out there - ‘proactive.’  That makes me think Apple is doing something for me here, without asking.  I went to my Photos section and found a categorization of Las Vegas photos for June 2-7 in my ‘Months’ category, and the photo is in that album.  As I scanned the other places in these ‘proactive’ listings, I was also able to see photo groupings from a trip to Jamaica (Kingston) and to Huntington Beach, CA in their own categories.

So you may be asking yourself at this point, “Why did I just read this? It’s a location in a photo.” Stick with me, and remember we are trying to figure out what this newly discovered file is doing and how far reaching it might be. 

A few lines down from the photos we have an address in New Jersey being attributed to com.apple.mobilemail. I tweaked the query to ‘localtime’ for the sake of presentation, but I received an email on May 31, 2020 at 11:17AM EDT regarding the June 2nd release of Cellebrite Physical Analyzer version 7.34. In the signature of that email was the US headquarters address for Cellebrite - 7 Campus Drive, Suite 210, Parsippany, NJ. 

This is just scratching the surface of a few items from one table with basic table joins made. Scanning other bundles represented here I can see some of my favorite places from Apple Maps, locations recently mentioned in iMessage conversations, and Calendar records of my standing 12:30PM Zoom meeting for Life has No Ctrl+Alt+Del. 

If you don’t have time to research but would like to hear more about it, tune in to my talk at the SANS DFIR Summit on July 16th!  If you do, try out the query below that I used for this blog post to pry around this one table from the database, and let me know how it works on your data via Twitter @bizzybarney.

select
 loc_records.id,
 	sources.bundle_id as "Bundle ID",
 	sources.group_id as "Group ID",
 	datetime(sources.seconds_from_1970, 'unixepoch') as "Source Time",
 	loc_records.cll_latitude_degrees || ", "|| loc_records.cll_longitude_degrees as "Coordinates",
 	loc_records.clp_name as "Name",
 	loc_records.clp_thoroughfare as "Road",
 	loc_records.clp_subThoroughfare as "Address #",
 	loc_records.clp_locality as "City",
 	loc_records.clp_subLocality as "Sub-locality",
 	loc_records.clp_administrativeArea as "Admin Area",
 	loc_records.clp_subAdministrativeArea as "Sub Admin Area",
 	loc_records.clp_postalCode as "Postal Code",
 	loc_records.clp_ISOcountryCode as "Country Code",
 	loc_records.clp_country as "Country",
 	hex(loc_records.clp_location) as "Location BLOB (hex)",
 	loc_records.extraction_os_build as "iOS Build Version",
 	loc_records.category as "Category",
 	loc_records.algorithm as "Algorithm",
 	loc_records.initial_score as "Initial Score",
 	CASE
 	WHEN loc_records.is_sync_eligible = 1 then "Yes"
	WHEN loc_records.is_sync_eligible = 0 then "No"
	end as "Sync Eligible"
from loc_records  
left join sources on loc_records.source_id=sources.id