
Apple offers Private Cloud Compute up for a security probe

The virtual environment for testing Private Cloud Compute – Image credit: Apple

Apple said at launch that Private Cloud Compute's security would be inspectable by third parties. On Thursday, it fulfilled that promise.

In June, Apple introduced Apple Intelligence and its cloud-based processing facility, Private Cloud Compute. It was pitched as a secure and private way to handle in-cloud processing of Siri queries under Apple Intelligence.

Along with saying that PCC used cryptography and didn't store user data, Apple promised that the system could be inspected by independent experts. On October 24, it offered an update on that plan.

In a Security Research blog post titled "Security research in Private Cloud Compute," Apple explains that it provided third-party auditors and select security researchers with early access. This included access to resources created for the project, including the PCC Virtual Research Environment (VRE).

The post also says that the same resources are being made publicly available from Thursday. Apple says this allows all security and privacy researchers, "or anyone with interest and a technical curiosity," to learn about Private Cloud Compute's workings and to verify them independently.

Resources

The release includes a new Private Cloud Compute Security Guide, which explains how the architecture is designed to meet Apple's core requirements for the project. It includes technical details of PCC components and their workings, how requests are authenticated and routed, and how the security holds up against various forms of attack.

The VRE is the first such environment Apple has offered for any of its platforms. It consists of tools for running the PCC node software in a virtual machine.

This isn't exactly the same code as runs on Apple's servers, as "minor modifications" were needed for it to work locally. Apple says the software otherwise runs identically to a PCC node, with changes limited to the boot process and the kernel.
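One way to picture that claim: if each build is described by per-component digests, diffing the server release against the VRE release should show differences only in the boot path and kernel. A minimal, purely illustrative sketch (component names and contents here are invented, not Apple's actual release format):

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest of a component's contents, as hex."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical model: a release as a mapping of component name -> digest.
server_release = {
    "bootloader": digest(b"server-boot"),
    "kernel": digest(b"server-kernel"),
    "pcc-node": digest(b"pcc-node-software"),
}
vre_release = {
    "bootloader": digest(b"vre-boot"),         # modified to boot in a VM
    "kernel": digest(b"vre-kernel"),           # modified for virtualization
    "pcc-node": digest(b"pcc-node-software"),  # identical to production
}

# Components whose digests differ between the two builds.
changed = sorted(
    name for name in server_release
    if server_release[name] != vre_release[name]
)
print(changed)  # → ['bootloader', 'kernel']
```

In this toy model, the PCC node software itself hashes identically in both builds, matching Apple's statement that only the boot process and kernel were changed.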


A diagram showing how elements of Private Cloud Compute interact with the new virtual research environment – Image credit: Apple

The VRE also includes a virtual Secure Enclave Processor, and takes advantage of the built-in macOS support for paravirtualized graphics.

Apple is also making the source code for some key components available for inspection. Offered under a limited-use license intended for analysis, the source code includes the CloudAttestation project for constructing and validating PCC node attestations.

There's also the Thimble project, which includes a daemon that runs on a user's device and works with CloudAttestation to enforce verifiable transparency.
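The idea behind that flow can be sketched conceptually: before trusting a node, the client checks that the software measurement in the node's attestation appears in a public transparency log. This is a heavily simplified, hypothetical model; all names are invented, and the real CloudAttestation/Thimble protocol is far more involved:

```python
from dataclasses import dataclass
import hashlib

def measurement(release_image: bytes) -> str:
    """Digest standing in for a cryptographic software measurement."""
    return hashlib.sha256(release_image).hexdigest()

@dataclass
class Attestation:
    node_id: str
    software_measurement: str  # what the node claims to be running

def verify(attestation: Attestation, transparency_log: set[str]) -> bool:
    """Accept a node only if its measurement was publicly logged."""
    return attestation.software_measurement in transparency_log

# Usage: a log containing one published release, one honest node,
# and one node attesting to an unpublished (tampered) build.
log = {measurement(b"pcc-release-1")}
honest = Attestation("node-a", measurement(b"pcc-release-1"))
rogue = Attestation("node-b", measurement(b"tampered-release"))

print(verify(honest, log))  # → True
print(verify(rogue, log))   # → False
```

The point of the transparency log in this model is that a node cannot run unpublished software without its attestation failing the membership check, which is the property researchers can now probe with the released source.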

PCC bug bounty

Furthermore, Apple is expanding its Apple Security Bounty. It promises “significant rewards” for reports of issues with security and privacy in Private Cloud Compute.

The new categories in the bounty directly align with the critical threats described in the Security Guide. They include accidental data disclosure, external compromise via user requests, and physical or internal access vulnerabilities.

The prize scale starts at $50,000 for the accidental or unexpected disclosure of data due to a deployment or configuration issue. At the top end, demonstrating arbitrary code execution with arbitrary entitlements can earn participants up to $1 million.

Apple adds that it will consider any security issue that has a "significant impact" on PCC for a potential award, even if it doesn't line up with one of the defined categories.

“We hope that you’ll dive deeper into PCC’s design with our Security Guide, explore the code yourself with the Virtual Research Environment, and report any issues you find through Apple Security Bounty,” the post states.

In closing, Apple says it designed PCC “to take an extraordinary step forward for privacy in AI,” including verifiable transparency.

The post concludes, "We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time."
