Apple built its empire on a promise: Your iPhone is safe. The company’s famously walled garden – with tightly controlled hardware, software, and App Store curation – is supposed to be the envy of the digital world. It’s why Apple users sleep a little easier, why regulators give the company more breathing room, and why Apple can credibly claim to “think different.”
But what if the walled garden has holes?
New research conducted by my team at Cybernews reveals a massive security oversight at the heart of Apple’s App Store: over 110,000 iOS apps – roughly 7 out of every 10 we analysed – leak “hardcoded secrets,” including API keys, authentication tokens, and cloud storage credentials. Many of these secrets unlock access to sensitive user data. Some could allow full account takeovers. Others – like those found in fetish dating apps – have exposed private photos sent in confidence.
It’s a systemic failure, and Apple, with all its resources and security rhetoric, should be held accountable.
Secrets Hidden in Plain Sight
Let’s be clear about what we found. Our researchers downloaded 156,000 iOS apps, about 8% of the App Store. We used automated analysis and reverse engineering – the same techniques attackers use – to scan for secrets embedded directly in each app’s code, the kind of credentials developers should never store there.
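To make that concrete: a hardcoded secret is a credential written directly into the source and shipped inside the binary, where anyone who downloads the app can recover it. The snippet below is a minimal, hypothetical Swift sketch of the anti-pattern – none of these identifiers or values come from a real app:

```swift
import Foundation

// Anti-pattern: credentials compiled straight into the app binary.
// Anyone can recover these strings from the shipped app with the
// `strings` utility, a disassembler, or an automated scanner.
// All names and values here are invented for illustration.
struct BackendConfig {
    static let firebaseURL   = "https://example-project.firebaseio.com"
    static let storageBucket = "example-user-uploads"             // cloud storage bucket
    static let apiKey        = "AIzaSyA-EXAMPLE-EXAMPLE-EXAMPLE"  // Google-style API key
    static let authToken     = "eyJhbGciOiJIUzI1NiJ9.EXAMPLE"     // long-lived bearer token
}
```

Compiling a string into an app does nothing to hide it. Once the binary ships, those credentials are public in every practical sense.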
Among the more than 816,000 exposed secrets, we discovered:
- 94,240 hardcoded storage buckets, of which 836 (0.89%) lacked authentication. These open instances exposed over 76 billion files, leaking 406TB of data.
- 51,098 Firebase URLs, of which 2,218 (4.34%) lacked authentication. These open instances exposed 19.8 million records, leaking 33GB of data, including user session tokens and backend analytics. Almost all of these instances were hosted in the US.
- 8,439 Fabric API keys. Fabric, an order management system, uses these keys to manage, track, and fulfil orders.
- 3,343 live Branch keys. Branch.io is a marketing platform used to track campaigns and enable advanced deep linking.
In the case of five niche dating apps – catering to LGBTQ+ users and kink communities – the leaks were especially troubling. Because their developers embedded Google Cloud credentials into their iOS app code, we found 1.5 million private user images sitting in unprotected cloud buckets: intimate photos, identity verification selfies, even images flagged for violating platform rules. All publicly accessible.
This is the kind of leak that can ruin lives – especially in countries where same-sex relationships are criminalised. Yet these apps passed Apple’s review process and remain live in the App Store.
The Myth of the Secure App Store
Apple’s defenders might point to its App Store Review Guidelines. They’re robust – at least on paper – and cover safety, performance, and legal compliance. But nowhere do they mention scanning for hardcoded secrets. If Apple does check for these weak spots behind the scenes, our findings suggest it’s doing a very poor job.
In contrast, major tech companies like GitHub, Google, and AWS all have automated detection systems to catch exposed secrets in code. Apple, with its trillion-dollar valuation, could easily implement the same – but hasn’t.
Why not?
One reason may be speed. Apple’s app approval pipeline is enormous, and slowing it down to add deep security scanning might cut into App Store revenue – especially from free apps running on ad-driven models. But another reason may be philosophical. Apple prefers to position itself as a hardware company with privacy baked in. What happens inside apps, it implies, is the developers’ responsibility.
This distinction might have worked in 2010. It doesn’t hold up today.
The Cost of Convenience
Most developers aren’t malicious. They’re just under pressure. Hardcoding secrets is faster than building a secure authentication flow (sketched below). Updating an app to fix a leaked secret can be risky and time-consuming. Many developers simply hope no one notices.
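For comparison, here is roughly what that slower, safer flow looks like: the app ships with no secrets at all and, once the user signs in, asks the developer’s own backend for a short-lived, narrowly scoped token. This is a simplified Swift sketch of the pattern; the endpoint and response shape are hypothetical:

```swift
import Foundation

// Safer pattern: no secrets ship in the binary. After sign-in, the app
// requests a short-lived, narrowly scoped token from its own backend.
// `api.example.com` and the JSON shape are hypothetical.
struct ScopedToken: Decodable {
    let value: String
    let expiresAt: Date
}

func fetchScopedToken(session: URLSession = .shared,
                      userSessionID: String) async throws -> ScopedToken {
    var request = URLRequest(url: URL(string: "https://api.example.com/v1/token")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(userSessionID)", forHTTPHeaderField: "Authorization")

    let (data, _) = try await session.data(for: request)
    let decoder = JSONDecoder()
    decoder.dateDecodingStrategy = .iso8601
    return try decoder.decode(ScopedToken.self, from: data)
}
```

The extra round trip is the point: a leaked token expires quickly and unlocks one user’s data, while a hardcoded key often unlocks everything, for every user, indefinitely.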
But attackers do notice.
In 2016, Uber was breached after hackers found hardcoded AWS credentials in a code repository. In 2022, Toyota disclosed that an access key had sat exposed in a public GitHub repository for five years. These weren’t amateur operations – they were failures by major companies.
If these mistakes can happen at Uber and Toyota, imagine the risks among hundreds of thousands of apps built by small firms or freelance developers.
At a time when 78% of people use mobile devices for sensitive financial and healthcare tasks, and 71% of employees use their phones for work, the stakes couldn’t be higher. One compromised API key could allow a threat actor to read your medical history, hijack your crypto wallet, or impersonate you in a phishing attack.
Apple Has the Power. It Should Use It
Apple often casts itself as the privacy champion in a dangerous digital world. It markets encryption, app tracking transparency, and on-device processing. It draws a sharp contrast with Android, where data collection and security gaps are more openly discussed.
But security doesn’t end at the lock screen. And for all its technical prowess, Apple still hasn’t built the safeguards needed to prevent insecure apps from leaking user data – or even user dignity.
The tools to fix this are readily available:
- Static analysis tools can detect hardcoded secrets automatically (see the sketch below).
- A credential-scanning requirement could be added to App Store review.
- Apple could revoke vulnerable secrets in coordination with developers.
These aren’t radical measures. They’re standard practices at companies far smaller than Apple.
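To show how modest the engineering lift is, here is a toy version of such a static scan in Swift. Real tools like truffleHog and gitleaks add entropy analysis and hundreds of credential patterns; the two regexes below are deliberately simplified:

```swift
import Foundation

// Toy secret scanner: checks extracted app files for strings matching
// well-known credential shapes. The patterns are simplified examples.
let patterns: [(name: String, regex: String)] = [
    ("Google API key", #"AIza[0-9A-Za-z\-_]{35}"#),
    ("Firebase URL",   #"https://[a-z0-9-]+\.firebaseio\.com"#),
]

func scan(_ contents: String, path: String) {
    for (name, pattern) in patterns {
        guard let regex = try? NSRegularExpression(pattern: pattern) else { continue }
        let whole = NSRange(contents.startIndex..., in: contents)
        for match in regex.matches(in: contents, range: whole) {
            if let range = Range(match.range, in: contents) {
                print("\(path): possible \(name): \(contents[range])")
            }
        }
    }
}

// Example: scan one file pulled out of an unpacked app bundle.
let path = "Payload/Example.app/config.json"   // hypothetical path
if let text = try? String(contentsOfFile: path, encoding: .utf8) {
    scan(text, path: path)
}
```

A script like this can sweep the files extracted from an app in seconds. Apple, which already runs every submission through automated review, could do far better.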
Where Is Apple’s Accountability?
We shouldn’t mistake slick marketing for security. And we shouldn’t let Apple off the hook simply because the alternative might be worse. Apple’s tight control over its ecosystem gives it enormous power – but with that comes responsibility.
Apple already decides which apps can run on its devices, how payments are processed, and which APIs are accessible. It should also ensure that the apps it approves don’t recklessly expose private user data to the internet.
Until then, the walled garden may look pristine – but it’s full of weeds.
About the author of this editorial
Vincentas Baubonis is an expert in full-stack software development and web application security, specialising in identifying and mitigating critical vulnerabilities, IoT and hardware hacking, and organisational penetration testing.
As Head of Security Research at Cybernews, he leads a team that has uncovered significant privacy and security issues affecting high-profile organisations and platforms such as NASA, Google Play, and PayPal.
Under his leadership, the Cybernews team conducts over 7,000 pieces of research annually, publishing more than 600 studies each year that provide consumers and businesses with actionable insights on data security risks.