Apple’s AI ethics doubted by scaremongering shareholder proposal

Apple Intelligence and Siri

Apple could be forced to detail more of its AI activity, after a shareholder proposal asked investors to vote on whether Apple is truly working ethically when training Apple Intelligence.

As a company well known for its stance on user privacy and security, Apple knows it has to be above board when it comes to hot-button topics. With the rise of AI and the potential for abuse, it could end up being tested by shareholders.

In a Securities and Exchange Commission filing, the National Legal and Policy Center has proposed an item on the 2025 Proxy Ballot. The proposal is to be voted on by shareholders as part of Apple’s Annual Shareholder Meeting, taking place on February 25.

The filing discusses Proposal 4 on the ballot, titled "Report on Ethical AI Data Acquisition and Usage." It requests that Apple prepare a report, "omitting proprietary or legally privileged information," on the risks of AI development.

Specifically, it is a report to “assess the risks to the Company’s operations and finances, and to the greater public health, safety and welfare, presented by Apple’s unethical or improper usage of external data in the development and training of its artificial intelligence projects and implementation,” reads the filing.

The report must also state what steps Apple has taken to mitigate those risks, and how the effectiveness of those steps will be measured.

The report is to be published within a year of the meeting, and to be updated on an annual basis. That is, if shareholders vote for it.

Questionable activity

The NLPC’s filing starts off by highlighting the public awareness of AI and data ethics, including how data is obtained to train models. This includes “data scraping, unauthorized collection, and the use of proprietary or copyrighted content without permission,” which some companies have already been accused of.

Due to the risks involved, and to “increase shareholder value,” the NLPC insists Apple should improve its disclosure of strategies for the ethical usage of user data in AI development.

“This report seeks to encourage Apple to adopt a more ambitious pro-privacy stance, which may provide the Company a strong competitive advantage,” the NLPC insists.

One early factor pointed out by the NLPC is how Apple was offered an observer seat on OpenAI's board, an arrangement that was dropped "after antitrust concerns were raised." With OpenAI accused of unethical data collection practices, the NLPC argues that Apple's association with the company raises further issues.

Though Apple has protected user privacy in the past to “great success,” the filing adds that the “monetization potential of its massive userbase is too high to pass up.”

Apple’s longstanding partnership with Alphabet on search is mentioned, including how it is an opportunity for Alphabet to collect “massive amounts of data on Apple users.” Apple also apparently “explored a partnership with Meta,” which the NLPC deems “another serial privacy violator.”

“In effect, Apple has outsourced its unethical activities to Alphabet while collecting substantial sums in the process,” the report claims.

Apple’s own algorithms are kept secret, and the lack of transparency on their workings “poses significant ethical risks,” it continues. “It raises fundamental questions about accountability, trust, and fairness.”

Avoidable doubts

While the proposal is damning, its attack on Apple's AI work and the company's admittedly opaque policies may not have much effect beyond the filing itself.

As well as taking proposals for items to be included in the shareholder ballot, Apple also advises which way shareholders should vote. In this case, Apple is likely to advise against the measure.

Since shareholders tend to follow Apple's recommendations, it's more than likely that the proposal won't be approved at all.

That said, the general thrust of the filing may still be felt by Apple, as it continues to keep itself operating as ethically as possible.

For example, while Apple prefers on-device processing that doesn't feed back user data, it does still rely on cloud servers for some queries. In those instances, Private Cloud Compute is designed to be as private as possible, with Apple unable to access user queries thanks to extensive use of encryption.

Apple has also tried to answer the problem of copyrighted data collection, by seeking out archives and requesting to pay for access. By contrast, other tech companies have been sued for allegedly scraping accessible data without much thought to copyright.

The proposal may not result in a new AI ethics report, but it could still pressure Apple into staying on the straight and narrow in this emerging field.
