Apple has offered a reward of up to $1 million to anyone who can complete an incredibly difficult task.
The challenge comes as Apple Intelligence is set to launch on compatible iPhones next week (October 28).
iOS 18.1 will also become available, bringing the iPhone's AI features for the very first time and allowing users to play around with an enhanced version of its voice assistant, Siri.
Amid an ongoing battle over which brand is the most secure, specifically when it comes to private AI options, Apple appears to have pulled ahead of the likes of Google and Samsung.
It's all thanks to Apple Intelligence, which processes as much data as possible directly on the device.
However, to test whether the system is hardened and ready for the public, Apple has asked researchers to probe the security of 'Private Cloud Compute' (PCC).
Private Cloud Compute explained
These are the servers that receive and process Apple Intelligence requests when a task is too complex to handle on the device itself.
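For the technically curious, the split described above, keeping simple jobs on the phone and handing heavier ones to PCC, can be pictured with a short and purely hypothetical Swift sketch. None of the types, names or the complexity threshold below are real Apple APIs; they only illustrate the routing idea.

```swift
import Foundation

// Hypothetical illustration of the routing concept: simple requests stay
// on-device, while complex ones are sent to a Private Cloud Compute node.
enum ProcessingTarget {
    case onDevice
    case privateCloudCompute
}

struct AIRequest {
    let prompt: String
    let estimatedComplexity: Int   // assumed 1-10 score, for illustration only
}

func route(_ request: AIRequest) -> ProcessingTarget {
    // Assumption: anything above an arbitrary complexity threshold is offloaded.
    request.estimatedComplexity > 6 ? .privateCloudCompute : .onDevice
}

let summarise = AIRequest(prompt: "Summarise this email", estimatedComplexity: 3)
let generate = AIRequest(prompt: "Draft a long report from these notes", estimatedComplexity: 9)

print(route(summarise))  // onDevice
print(route(generate))   // privateCloudCompute
```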
To address privacy concerns, Apple designed the Private Cloud Compute servers to delete a user's request once the task has been completed.
The system also features end-to-end encryption, meaning Apple can't peek at any user requests made through Apple Intelligence, even though the company built and operates the servers.
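To show the general idea behind end-to-end encryption, here is a minimal Swift sketch using Apple's CryptoKit framework. It is not Apple's actual PCC protocol, which also relies on node attestation and verifiable software images; the request text and shared key below are made up for the example.

```swift
import Foundation
import CryptoKit

// Toy end-to-end encryption round trip: only the two endpoints hold the key,
// so infrastructure in the middle sees nothing but ciphertext.
do {
    let requestKey = SymmetricKey(size: .bits256)   // shared only by the two endpoints
    let plaintext = Data("Summarise my unread emails".utf8)

    // Sending side: encrypt the request before it leaves the device.
    let sealed = try ChaChaPoly.seal(plaintext, using: requestKey)
    let ciphertext = sealed.combined                // all an intermediary would see

    // Receiving side (the only party with the key): decrypt and process.
    let box = try ChaChaPoly.SealedBox(combined: ciphertext)
    let opened = try ChaChaPoly.open(box, using: requestKey)
    print(String(decoding: opened, as: UTF8.self))  // "Summarise my unread emails"
} catch {
    print("Encryption round trip failed: \(error)")
}
```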
As privacy becomes more and more important to the public, the company has decided to show off just how private Private Cloud Compute really is.
While Apple originally asked a select group of researchers to try to hack into the system, it has now opened the floodgates to anyone willing to have a go.
To help, Apple is granting access to the source code for key components of Private Cloud Compute, so that researchers can analyze the software before trying to hack it.
Apple has also created a Virtual Research Environment for macOS that can run the Private Cloud Compute software, and it has published a security guide explaining more about the company's server system for Apple Intelligence.
Monetary rewards
The company stated: “To further encourage your research in Private Cloud Compute, we’re expanding Apple Security Bounty to include rewards for vulnerabilities that demonstrate a compromise of the fundamental security and privacy guarantees of PCC.”
One of the rewards on offer is $250,000 for remotely hacking Private Cloud Compute and exposing a user's request data.
The company will also hand out $1 million to anyone who can remotely attack the servers and execute rogue code with privileges.
Apple explained that it may even reward researchers who report vulnerabilities 'even if it doesn't match a published category'.
It added: “We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time.”
Topics: Apple, Technology, iPhone