
Apple confronted by victims over withdrawn CSAM plan

Apple has retained nudity detection in images, but dropped some CSAM protection features in 2022.

A victim of childhood sexual abuse is suing Apple over its 2022 decision to drop a previously announced plan to scan images stored in iCloud for child sexual abuse material.

Apple originally introduced a plan in late 2021 to protect users from child sexual abuse material (CSAM) by scanning uploaded images on-device using a hash-matching system. It would also warn users before they sent or received photos containing algorithmically detected nudity.
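As a rough illustration of how on-device hash matching of this kind can work (this is not Apple’s actual NeuralHash system, which used a perceptual hash checked against an encrypted database; the names and the use of SHA-256 below are hypothetical simplifications), a sketch in Swift might look like this:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: Apple’s withdrawn system relied on a perceptual
// hash (NeuralHash) matched against an encrypted database of known CSAM
// hashes. SHA-256 is used here purely to illustrate the matching flow.
struct KnownHashDatabase {
    private let knownHashes: Set<String>

    init(hashes: Set<String>) {
        self.knownHashes = hashes
    }

    func contains(_ digest: String) -> Bool {
        knownHashes.contains(digest)
    }
}

// Compute a hex digest of the raw image bytes on-device.
func hexDigest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Before an image is uploaded, check its digest against the known-hash set.
func shouldFlagBeforeUpload(_ imageData: Data, against database: KnownHashDatabase) -> Bool {
    database.contains(hexDigest(of: imageData))
}
```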

The nudity-detection feature, called Communication Safety, is still in place today. However, Apple dropped its plan for CSAM detection after backlash from privacy experts, child safety groups, and governments.

A 27-year-old woman who was sexually abused as a child by a relative is suing Apple, under a court-approved pseudonym, over its decision to drop the CSAM-detection feature. She says that while the feature was still active, she received a law-enforcement notice that images of her abuse were being stored in iCloud via a MacBook seized in Vermont.

In her lawsuit, she says Apple broke its promise to protect victims like her when it eliminated the CSAM-scanning feature from iCloud. By doing so, she says that Apple has allowed that material to be shared extensively.

As a result, the suit argues, Apple is selling “defective products that harmed a class of customers” like herself.

More victims join lawsuit

The woman’s lawsuit demands changes to Apple’s practices and potential compensation for a group of up to 2,680 other eligible victims, according to one of her lawyers. The lawsuit notes that the CSAM-scanning features used by Google and Meta’s Facebook catch far more illegal material than Apple’s anti-nudity feature does.

Under current law, victims of child sexual abuse are entitled to a minimum of $150,000 in damages. If all of the potential plaintiffs in the woman’s lawsuit were to prevail, Apple’s liability could exceed $1.2 billion.

In a related case, attorneys acting on behalf of a nine-year-old CSAM victim sued Apple in a North Carolina court in August. In that case, the girl says strangers sent her CSAM videos through iCloud links, and “encouraged her to film and upload” similar videos, according to The New York Times, which reported on both cases.

Apple filed a motion to dismiss the North Carolina case, arguing that Section 230 of the Communications Decency Act protects it from liability for material uploaded to iCloud by its users. It also said that it is protected from product liability claims because iCloud isn’t a standalone product.

Court rulings soften Section 230 protection

Recent court rulings, however, could undercut Apple’s bid to avoid liability. The US Court of Appeals for the Ninth Circuit has determined that Section 230 defenses apply only to active content moderation, rather than serving as blanket protection from liability.

Apple spokesman Fred Sainz said in response to the new lawsuit that Apple believes “child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk.”

Sainz added that “we are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”

He pointed to the expansion of the nudity-detecting features to its Messages app, along with the ability for users to report harmful material to Apple.

The woman behind the lawsuit and her lawyer, Margaret Mabie, do not agree that Apple has done enough. In preparation for the case, Mabie dug through law enforcement reports and other documents to find cases related to her clients’ images and Apple’s products.

Mabie eventually built a list of more than 80 examples of the images being shared. One of the people sharing the images was a Bay Area man who was caught with more than 2,000 illegal images and videos stored in iCloud, the Times noted.
