Apple Faces $1.2 Billion Lawsuit Over Alleged Failure to Curb Child Abuse Material
Victims of child sexual abuse in the US are suing Apple for $1.2 billion in damages, claiming the company failed to prevent the distribution of illegal material on its platforms.
Lawsuit Sparked by Disturbing Experiences
The lawsuit is led by a 27-year-old woman who was abused as an infant, with photos of the crime shared online by a family member. While the perpetrator was later convicted and imprisoned, the victim and her mother have continued to receive notifications from authorities about new charges involving the images.
“It was hard to believe there were so many out there,” the victim told The New York Times, speaking anonymously.
In late 2021, she learned that the images had been discovered on a Vermont man’s MacBook and stored in Apple’s iCloud. Feeling that Apple failed in its duty to protect victims, she decided to take legal action.
Claims of Negligence and Defective Products
The lawsuit accuses Apple of negligence, citing its 2021 unveiling of NeuralHash, a tool designed to detect child sexual abuse material (CSAM) in iCloud photos. NeuralHash was designed to compare digital signatures (known as hashes) of photos uploaded to iCloud against a database of hashes of known illegal material, flagging matches to the authorities.
However, due to concerns over privacy and potential misuse, Apple shelved NeuralHash before its implementation. Critics argue this decision left victims vulnerable, and the lawsuit characterises it as the sale of defective products that harmed users.
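For readers unfamiliar with hash matching, the sketch below illustrates the general idea in Python: compute a digital signature for each uploaded file and check it against a database of signatures of known material. This is a simplified illustration only, using an ordinary cryptographic hash and a made-up example value; Apple's shelved NeuralHash relied on perceptual hashing designed to match visually similar images, and its actual design is not reproduced here.

```python
# Illustrative sketch only: generic hash matching, not Apple's NeuralHash.
# NeuralHash used perceptual hashes so visually similar images still match;
# this example uses a plain SHA-256 over file bytes to show the basic idea
# of comparing uploads against a database of signatures of known material.
import hashlib
from pathlib import Path

# Hypothetical database of hashes of previously identified material
# (in practice maintained by bodies such as NCMEC, not built locally).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_uploads(upload_dir: Path) -> list[Path]:
    """Return uploaded files whose hashes match the known-material database."""
    return [p for p in upload_dir.iterdir()
            if p.is_file() and file_hash(p) in KNOWN_HASHES]
```

A cryptographic hash like the one above only matches byte-identical files, which is why real-world systems use perceptual hashing instead; the trade-offs of that approach are precisely what drove the privacy debate described above.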
The Scope of the Lawsuit
The case, filed in Northern California, may involve up to 2,689 victims eligible for compensation. Under US law, victims of child sexual abuse are entitled to a minimum of $150,000 each, which could lead to Apple facing damages exceeding $1.2 billion if held liable.
The complainants are also demanding that Apple reform its practices to better detect and remove illegal content.
Apple’s Response and Growing Scrutiny
Apple spokesperson Fred Sainz condemned the material as “abhorrent” and stated the company is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”
The lawsuit follows increasing criticism of Apple’s efforts to address child abuse material. In 2020, an Apple executive admitted in an internal message that the company prioritised privacy over user trust and safety, labelling Apple “the greatest platform for distributing child porn.”
A 2019 New York Times investigation highlighted tech companies' failures in tackling CSAM, revealing that Apple had reported only 267 cases of suspected material to the National Center for Missing & Exploited Children (NCMEC) in one year. In comparison, Facebook and Google each filed over one million reports.
Global Concerns Persist
In the UK, the National Society for the Prevention of Cruelty to Children (NSPCC) accused Apple of vastly underreporting CSAM. Between April 2022 and March 2023, Apple products were linked to 337 recorded child abuse image offences in England and Wales.
“There is a concerning discrepancy between the number of crimes taking place on Apple’s services and the negligible number of global reports they make to authorities,” said Richard Collard, NSPCC’s head of child safety online policy.
Apple remains under pressure to balance privacy with safety while addressing these significant allegations.