Pentesting primer: The modern holes in iOS’s walled garden.
Hey everyone, let's jump headfirst into some mobile red team writing this week!
What began as a small project became a nostalgia trip and a pentest idea list. Craft is a part of red team work, and knowing the hardware, and where the tech stack trickles down through the cracks, is just as important as knowing what encompasses each application. I hope parts of this guide are helpful as you explore the edges of the new walled garden boundaries Apple has set for iOS. Do your best to prod away at the weaknesses until they're explored, reported, and fixed properly.
We'll go over a comparison of iOS and Android from a historical perspective. Starting off, we have a recognizable pair of features Apple added to its iOS platform in iOS 4 and iOS 8 respectively: folders and widgets. User-made folders for app categories came to Android in 4.0, while widgets have technically been in Android since its outset at 1.0. The folder itself is interesting, as it lets users cut through home screen clutter quickly.
Widgets are self-contained microservices or gadgets. They connect to data sources to pull in various information, or they can act as simple functions. They can tell you the weather, give you an email preview, or turn your flashlight on and off. They can be as large as a full-page calendar or as small as a button, saving you an app click.
The most interesting change happened in iOS 14's update: apps are no longer represented by a single home screen icon. This means apps no longer live and die on your device's home screen, since the icon can simply be removed while the application persists in the phone's storage. If an app is installed, you can rename it on your home screen, move it, change its size, and delete it. And if it's "deleted", it might still be on your device in the App Library if you only removed the shortcut.
The most obvious security implication is forgetting you removed the shortcut and not the app. Since it still lives on your phone, it can still track usage and report back to the developers under whatever permissions it was granted. With widgets, the APIs they connect to need to be secured to prevent an attacker from spoofing or replaying the widget's network requests. And this is just the tip of the iceberg when it comes to Android features being mirrored in iOS.
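Securing those widget endpoints is mostly a backend job: if the server only accepts signed, fresh requests, a replayed or hand-crafted widget call gets rejected. Here's a minimal Python sketch of HMAC request signing; the secret, path, and header names are all hypothetical, not any real widget API.

```python
import hashlib
import hmac
import time

# Hypothetical per-device secret provisioned to the app at login.
SECRET = b"per-device-secret-from-login"

def sign_widget_request(path: str, body: bytes, secret: bytes = SECRET) -> dict:
    """Client side: build headers the widget backend can verify."""
    timestamp = str(int(time.time()))
    message = timestamp.encode() + path.encode() + body
    signature = hmac.new(secret, message, hashlib.sha256).hexdigest()
    return {"X-Timestamp": timestamp, "X-Signature": signature}

def verify_widget_request(path: str, body: bytes, headers: dict,
                          secret: bytes = SECRET, max_skew: int = 300) -> bool:
    """Server side: reject forged, tampered, or stale (replayed) requests."""
    if abs(time.time() - int(headers["X-Timestamp"])) > max_skew:
        return False  # too old, likely a replay
    message = headers["X-Timestamp"].encode() + path.encode() + body
    expected = hmac.new(secret, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, headers["X-Signature"])
```

An attacker who captures the widget's traffic can't re-send it later (timestamp check) or modify the body (signature check) without the secret.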
Control Center was an idea from a few interesting sources we'll talk about below. With Control Center, you can use a simple swipe to access a panel that toggles phone features without tapping through Settings. This solves the problem of UI complexity, to put it geekily. Apple adopted it in iOS 7 and has kept it ever since.
Android did it before the homebrew scene did. With Android 1.6 you had access to a menu featuring ways to enable and disable things like antennas, Wi-Fi, and audio levels. It can be argued this sort of automation goes all the way back into ancient BlackBerry and even computer history, but we're sticking with Android for the sake of congruity.
Aside from Android, it could be found in iOS through alternative means, namely the homebrew scene of the era. The first brush with a control center came from an app by 'BigBoss', which required a jailbroken device as its precedent. Jailbreaking itself has been the source of several areas where Apple has opened up its walls to more flexibility as iOS has aged and grown. [https://en.wikipedia.org/wiki/IOS_jailbreaking]
I'll assume you know what jailbreaking is. The next question to answer is 'where does jailbreaking take place?' Everywhere. It's another great example of the fundamental practice of reverse engineering. Take the engine out of one car, swap it with another, and see what happens. At its core, this is the concept of reversing. Even in walled garden environments, breaking things can be as easy as locating where software bugs exist, exploring the unwritten rules, and finding a way to leverage them in unexpected ways.
For the security piece, we'll quickly look at the implications of jailbreaking and why it's a solution set as well as an issue set. Fundamentally, gaining access to all the toggles in a device is a topic of ownership, liability, and control of the platform. iOS aimed to be a closed development sphere, with control lying in the hands of Apple. As our phones have progressed and gained new features, that control has gradually loosened, and developers have gained more experience in providing solutions with software. Reverse engineering drives all kinds of innovation, despite its legal opposition. Aside from competition, I'd argue it's the biggest driver of innovation because of the need for a single word: security.
By having control over app features, user security is improved, developers are held accountable for their code, and applications get enough time and funding to be looked at by larger enterprises instead of users who may have a different idea of where app revenue should come from. Underhandedly charging users behind the scenes: bad. Making users aware of upfront charges: good. Jailbreaking your phone may open up greater risk for users, and security is a big driver for why I'm writing this blog in the first place.
Moving on, a feature universal to both Android and iOS, and even some non-smartphones, is test menus with telecom test codes. The purpose of this hidden functionality is to provide technical dashboards to check hardware functionality. Such test menus exist in a plethora of forms on IoT and TV devices as well. Functionality here ranges from phone audio checks, to antenna toggles, to even factory-resetting the phone if a certain code is entered.
What are the security risks here? Well, to the folks making the phones, it's you. You, the user, are the risk in this area. This information dashboard can be used to change reception information, read RF signal strength, and find additional phone state info.
Also, accessing your voice inbox can be as easy as dialing your own number and hitting pound. You've put a password on that, right?
The next area I will cover is the permission screen. In short, Apple’s allowed permissions mirror Android’s pretty much 1–1 in 2021.
With iOS 13, users can manage an app's access to the device: photos, location, etc. The screen can also be used to clear cache and web storage content, much like Android has been able to do since version 6 (and, I believe, earlier at some fundamental Linux level). One thing that's still missing is clearing an app's storage. It's 2021 and you still can't do this unless the devs hard-code it in, like Twitter does.
In the comparison screenshot, the settings fields are one area you should pay attention to. These fields are custom and can vary from app to app. Some may expose the app's endpoints, so be sure to use them alongside your decompiler when testing for vulnerabilities.
Now that the feature set is identical, you may think there's not much more to say. Well, there is one more thing with respect to iOS app development, and that's exposed settings. This is where our security issues lie.
This is a fundamental development concept: use different keys for different builds to prevent the wrong build from being accidentally published. Also don’t include those in the app. Well, whoops!
This is a case of data exposure, and it doesn't happen where good practices are followed. Finding an app in the wild doing this SHOULD be hard, given how most CI/CD pipelines and git secret-scanning tools flag these values. Still, be aware of the apps published out there; common weaknesses are a real problem.
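If you want to hunt for this yourself, an .ipa is just a zip: unzip it and sweep the bundle for key-shaped strings. A rough Python sketch follows; the two regexes are illustrative stand-ins for the much larger rule sets real secret scanners ship with.

```python
import re
from pathlib import Path

# Illustrative patterns only; real scanners use far more rules.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic_api_key": re.compile(
        r"(?i)(api[_-]?key|secret)['\"]?\s*[:=]\s*['\"][A-Za-z0-9_\-]{16,}['\"]"
    ),
}

def scan_text(text, source="<memory>"):
    """Return (source, rule, match) for every secret-looking string."""
    hits = []
    for rule, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((source, rule, match.group(0)))
    return hits

def scan_bundle(root: Path):
    """Walk an unzipped .ipa and scan config-like files for secrets."""
    hits = []
    for path in root.rglob("*"):
        if path.suffix in {".plist", ".json", ".strings", ".xcconfig"}:
            try:
                hits += scan_text(path.read_text(errors="ignore"), str(path))
            except OSError:
                continue
    return hits
```

Point `scan_bundle` at the extracted `Payload/` directory; anything it flags deserves a closer look in your decompiler.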
Keyboards, custom keyboards specifically. Swype wasn’t always a thing, but it became a thing when we needed to use one hand to type on a touchscreen. Android introduced this first, and hundreds of other keyboard makers followed suit. Apple picked up the slack (after a jailbroken version by ‘Sea Comet‘) with iOS 8, which introduced custom keyboards through downloaded apps.
What's the big security deal? Full Access. This means a keyboard app can read what you type, what's in your clipboard, and so on. Depending on the app, it might even capture a passcode you type into a website. While Apple vets keyboards in the App Store, that process may not be perfect, so keeping an eye on this permission becomes a user issue.
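To make the Full Access risk concrete: a malicious keyboard doesn't need to understand what you typed, because even a crude statistical filter can fish credential-looking strings out of a keystroke or clipboard stream. A hedged Python sketch (the 3.5-bit threshold is an arbitrary illustration, not a tuned detector):

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Bits per character; random passwords score high, prose scores low."""
    counts = Counter(s)
    total = len(s)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def looks_like_credential(s: str) -> bool:
    """Crude filter: no spaces, password-ish length, high entropy."""
    return " " not in s and 8 <= len(s) <= 64 and shannon_entropy(s) > 3.5
```

A random 16-character password clears the threshold easily, while ordinary sentences and repeated characters don't, which is all a data-harvesting keyboard would need to prioritize what it exfiltrates.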
Here's another fun one. You just bought a new video game controller and want to start using it. You know it supports the two big players in the format wars: DInput and XInput. You fire up Apple Arcade on your iPhone and attempt to connect the controller over Bluetooth, but it's not on Apple's pre-approved list of devices. Bummer, now you've got to return it, right?
Well, not exactly. There's another way to leverage Apple's features to get that device working. The source? Accessibility options.
Accessibility features are standard on any computer anywhere, and yet time and time again they're the source of numerous security issues. This often stems from a lack of testing on the developer's part, which makes accessibility a great place to look for unexpected bugs and bypasses around the classic walled garden model.
Indeed, accessibility has its own Bluetooth syncing protocols, using a different configuration method as well. By relying on an older standard, it's blown wide open to Bluetooth security risks. On the flip side, though, Apple Arcade is great. If you want to try a game out with your 3rd-party Sega Genesis controller, might I recommend World of Monsters? That'll keep your mind off the glaring hole Apple left by ignoring a practical fix to its controller whitelist.
Of course, what if you were a developer with a radical new idea: expanding your app's footprint with a second app that needs to connect to your current one? iOS accomplishes this through an IPC technology called app groups. [https://www.appcoda.com/app-group-macos-ios-communication/]
App groups let apps share container spaces (basically a blocked-off chunk of the OS, similar to a sandbox) to exchange things like login tokens, settings, and so on. By sharing these things between apps, yet another spot becomes available to plunk a piece of code into and read what's going on in your system. Google is a well-known user of this mechanism.
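To see why that shared spot matters, it helps to know the shared state is often just a plist file sitting in the group container, readable by every member app, and by anyone with filesystem access via a jailbreak or a backup. A small Python sketch simulating that round trip; the group ID, keys, and directory layout here are hypothetical.

```python
import plistlib
from pathlib import Path

# Hypothetical group ID; on a real device the shared container lives
# under a per-group UUID directory reachable via jailbreak or backup.
GROUP_ID = "group.example.shared"

def write_shared_defaults(container: Path, values: dict) -> Path:
    """Mimic one app in the group persisting shared defaults to a plist."""
    prefs = container / "Library" / "Preferences"
    prefs.mkdir(parents=True, exist_ok=True)
    plist_path = prefs / f"{GROUP_ID}.plist"
    with plist_path.open("wb") as fh:
        plistlib.dump(values, fh)
    return plist_path

def read_shared_defaults(plist_path: Path) -> dict:
    """What any other member app (or an auditor) can read back."""
    with plist_path.open("rb") as fh:
        return plistlib.load(fh)
```

If a login token lands in that plist, every app in the group can read it, which is exactly the audit point worth checking when you pull a container off a device.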
As this is a relatively new space, I recommend exploring containers and sandboxes before jumping in here. Mess around with Docker or Chrome for a bit and see what types of escape methods are present in those tools. From there, you'll have enough of an understanding to jump into the nonsense that is IPC calls!
On that note, the question is "who gets their apps on the App Store?" The Verge has an article answering how it's not only end users, but also users paying for Apple's enterprise program.
The examples in that article are particularly egregious. This offers an alternative to jailbreaking for installing unscrupulous apps onto one's device: apps intended to go through enterprise development vetting and to be deployed through specific MDM software. Anything can happen with these apps. The Verge did a great writeup on the issues with enterprise cert abuse on iOS.
This is actually a very real thing in both the legal Apple world and on Android. Android has had 3rd-party app stores for years, starting with 1.6 even. What you get from these is equally unvetted unless you trust the store's source. Device-compromising malware is a real threat in this space.
File access starts off the next section. Beyond folders, the storage of your iOS device from 11 onward mirrors what has existed since the beginning of Android. Access to local storage comes with a plethora of development flexibility, but also security risks, as what you have access to becomes the question. Local storage is limited in most circumstances, but can be expanded depending on whether the device is rooted.
While the benefits of users accessing their own files are clear, so are the risks. One such risk is building an app with Apple's own Shortcuts engine, which can perform all sorts of functions. Heck, how about native development environments right on your iOS device to prod around with to your heart's content?
Shortcuts, introduced in iOS 12, and the 3rd-party app Scriptable allow system-level and JSON-scripted functions to run on your device. I'd go so far as to call this a terminal with code-less GUI features placed around it. Android has had access to terminal services through Bash since its inception as a Linux distro. Having access to this, as well as 3rd-party IDEs, fundamentally changed what is allowed on the iOS platform.
One of my favorite Scriptable apps was a YouTube picture-in-picture script, which pulled the video stream data from the YouTube app, put it in a windowed format (a feature that also started as an Android/jailbroken feature), and let the video play outside the application, a feature YouTube hides behind a monetization wall. At this point, cutting into a revenue stream is becoming a real threat to iOS's environment.
The 2nd-to-last thing to cover is Apple's coveted Apple Pay system. With its own PCI entry in the Apple Card, you would think user transparency, privacy, and security are at the forefront of the offering. Unfortunately, you'd be wrong on multiple counts. Researchers at PenTestPartners found a way to discreetly hide transactions from end users of the iOS Wallet app.
Somewhat effortlessly, I've been able to replicate this in my home lab. A $0 transaction was shown, while in fact my receipt, credit card bill, and the final checkout screen all showed a $1 tip added. While this impacts small transaction amounts, it could be the start of a resurgence of old-school distance subscription scams in a new form.
Local access to your own hardware is nice and is usually tied together with developer tools and resources. IDE tools are readily available these days, and they're complex in most cases; Apple provides Configurator 2 and Xcode as its Mac offerings, while Windows has to rely on third-party tools. They can also be simple apps targeted at users looking to speed up their phones, reset settings, read log files, and yes, root their devices.
Tools for Windows are numerous in purpose and scope. They include legitimate ones such as iMazing, which can pull off log analysis as well, with local access to the device as its only requirement. For QA testing this tool is often necessary. [https://imazing.com/] They also include very real, but sometimes very fake, jailbreak and cleanup tools.
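As a taste of what off-device log analysis looks like, here's a toy Python parser that flags crash and sandbox-denial lines in a syslog dump pulled with a tool like iMazing. The line format below is a simplification of real iOS syslog output, so treat the regex as an assumption to adapt against your own dumps.

```python
import re

# Assumed line shape: "Mon DD HH:MM:SS host process[pid] <Level>: message"
LOG_LINE = re.compile(
    r"^(?P<ts>\w{3}\s+\d+\s[\d:]+)\s+\S+\s+(?P<proc>[\w.]+)\[\d+\].*?:\s(?P<msg>.*)$"
)

def processes_of_interest(log_text, keywords=("crash", "denied", "sandbox")):
    """Return (process, message) pairs for lines hinting at crashes
    or sandbox violations."""
    hits = []
    for line in log_text.splitlines():
        m = LOG_LINE.match(line)
        if m and any(k in m.group("msg").lower() for k in keywords):
            hits.append((m.group("proc"), m.group("msg")))
    return hits
```

Feed it a full dump and you get a quick shortlist of processes worth digging into before opening the raw log.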
The risk of such an open environment is, once again, exposure to utter nonsense online. Users are exposed to applications and developer intent they would not otherwise be, and they become more willing to inadvertently install malicious software on their PCs or Macs. This risk is very tangible and is a big strike against Apple's walled garden approach. Unless you're a Mac user, then you're fine.
A pretty dry writeup this month, but at the same time I’m glad to go over the modern features and threat landscape facing iOS today. Hopefully some of the resources are enough to get folks looking into some weird new apps.