Apple will pay up to $1M to anyone who hacks its AI cloud

Apple is challenging security researchers to try to hack its AI cloud, and there's a huge bounty up for grabs.

Apple just made an announcement that shows it means business when it comes to keeping Apple Intelligence secure. The company is offering a massive bug bounty of up to $1 million to anyone who is able to hack its AI cloud, referred to as Private Cloud Compute (PCC). These servers will take over Apple Intelligence tasks when the on-device AI capabilities just aren’t good enough — but there are downsides, which is why Apple’s bug-squashing mission seems like a good idea.

According to a recent Apple Security blog post, Apple has created a virtual research environment and opened the doors to the public, letting everyone examine the code and judge its security. PCC was initially only available to a select group of security researchers and auditors, but now anyone can take a shot at hacking Apple's AI cloud.

Many Apple Intelligence tasks are said to be handled on-device, but for more complex requests, PCC steps in. Apple offers end-to-end encryption and makes the data available only to the user, to ensure that your private requests remain just that — private. Still, given the sensitive data AI might handle, be it on Macs or iPhones, users are right to be concerned about that data leaving their devices and ending up in the wrong hands.

Apple's Craig Federighi discussing Apple Intelligence at the Worldwide Developers Conference (WWDC) 2024. Apple

That’s presumably part of why Apple is now extending this lucrative offer to anyone who’s interested. The company provides access to the source code for some of the most important parts of PCC, making it possible for researchers to probe it for flaws.

The $1 million bounty is not universal. That’s the top reward, reserved for the person or team that manages to run malicious code on the PCC servers. The next-highest bounty sits at $250,000 and covers exploits that could allow hackers to extract user data from Apple’s AI cloud. There are also smaller rewards, starting at $150,000, for anyone who accesses user data from a “privileged network position.”

Apple’s bug bounty program has previously helped it spot exploits ahead of time while rewarding the researchers involved. A couple of years ago, Apple paid a student $100,000 for successfully hacking a Mac. Let’s hope that if there are any bugs to be found in Apple’s AI cloud, they’ll be spotted before Apple Intelligence becomes widely available.