Apple may have found a way to process AI data while retaining a measure of user privacy.
While the company will use WWDC to unveil its AI strategy across its operating systems, The Information reports that Apple intends to employ confidential computing techniques that will enable “black box” processing.
Typically, cloud services encrypt data only at rest, while it is stored on disk. To be processed or transformed on the server, however, the data must be decrypted into memory, where it is exposed as plaintext.
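That gap can be sketched with a toy example. The XOR “cipher” below is purely illustrative, not a real encryption scheme, and the variable names are hypothetical; the point is that the server holds only ciphertext at rest but must recover the plaintext in memory before it can do any work on it:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher for illustration only -- NOT secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
user_data = b"sensitive user prompt"

# At rest: the server stores only ciphertext on disk.
stored = xor_cipher(user_data, key)
assert stored != user_data

# To process the request, the server must first decrypt the data
# into memory -- at that moment the plaintext is exposed on the host.
in_memory = xor_cipher(stored, key)
assert in_memory == user_data
```

Confidential computing approaches aim to close exactly this window, keeping the decrypted data shielded from the host even while it is being processed.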
Per the article, Apple may have found a way to process user data such that it remains private throughout. Apple may have scaled up its Secure Enclave designs to enable such a programming model. Bloomberg previously noted the connection between the Secure Enclave and the Apple Chips in Data Centers (ACDC) project.
The article states that a potential weakness could remain if hackers gained physical access to Apple’s server hardware. Still, the approach appears far more secure than anything Apple’s rivals are doing in the AI space. In some cases, Apple may be able to tell law enforcement that it does not have access to the information and cannot provide any user data in response to subpoenas or government inquiries.
It’s thought that Apple’s work on this confidential computing initiative predates the current AI boom, and that the company may have been at it for at least three years. The report also notes that in the future Apple could create lightweight wearable devices that don’t require powerful chips, as they could offload their processing to Apple’s backend.
The exact details of this technology are still murky, and The Information says it remains to be seen how Apple will preserve the security model when a single chip in a data center is handling requests from many users simultaneously.
Stay tuned for additional details as they become available.
Via 9to5Mac, Bloomberg, and The Information