A truck from the child advocacy organization Heat Initiative, calling on Apple to do more to police child sex abuse material on iCloud, is parked outside the Apple store as people line up to get the new iPhone 15 in Boston, Massachusetts, U.S., September 22, 2023. (Reuters/Brian Snyder/File Photo)
- West Virginia’s attorney general sued Apple, accusing the company of allowing child sexual abuse material to proliferate on its iCloud platform.
- The lawsuit alleges Apple prioritized user privacy through end-to-end encryption while failing to implement effective detection measures.
- Apple denies wrongdoing, saying it has built industry-leading safety features and continues working to combat child exploitation while protecting user privacy.
Feb 19 – West Virginia’s attorney general sued Apple on Thursday, accusing the iPhone maker of allowing its iCloud service to become what the company’s own internal communications called the “greatest platform for distributing child porn.”
Attorney General JB McCuskey, a Republican, accused Apple of prioritizing user privacy over child safety. His office called the case the first of its kind by a government agency over the distribution of child sexual abuse material on Apple’s data storage platform.
“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” McCuskey said in the statement. “This conduct is despicable, and Apple’s inaction is inexcusable.”
Apple in a statement said it has implemented features that prevent children from uploading or receiving nude images and was “innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.”
“All of our industry-leading parental controls and features, like Communication Safety — which automatically intervenes on kids’ devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls — are designed with the safety, security, and privacy of our users at their core,” Apple said.
The U.S. has seen a growing national reckoning over how smartphones and social media harm children. So far, the wave of litigation and public pressure has mostly targeted companies like Meta, Snap, and Google’s YouTube, with Apple largely insulated from scrutiny.
End-to-End Encryption
West Virginia’s lawsuit focuses on Apple’s move toward end-to-end encryption, which puts digital files beyond the reach of both Apple and law enforcement officials. The state alleges that Apple’s use of the technology has allowed child abuse material to proliferate on its platform.
For decades, privacy advocates and governments have sparred over end-to-end encryption. Advocates call it vital to ensuring privacy and preventing widespread digital eavesdropping; governments insist it hinders criminal investigations.
Apple considered scanning images for abuse material but abandoned the approach over concerns about user privacy and security, including worries that governments could exploit such a system to hunt for other material for censorship or arrests, Reuters has reported.
McCuskey’s office cited a text message that Apple’s then-anti-fraud chief sent in 2020 stating that, because of Apple’s priorities, it was “the greatest platform for distributing child porn.”
The lawsuit in Mason County Circuit Court seeks statutory and punitive damages and requests that a judge force Apple to implement safer product designs, including effective measures to detect abusive material.
Alphabet’s Google, Microsoft and other platform providers check uploaded photos or emailed attachments against a database of identifiers of known child sex abuse material provided by the National Center for Missing and Exploited Children and other clearinghouses.
Until 2022, Apple took a different approach. It did not scan all files uploaded to its iCloud storage offerings, and the data was not end-to-end encrypted, meaning law enforcement officials could access it with a warrant.
Reuters in 2020 reported that Apple planned end-to-end encryption for iCloud, which would have put data into a form unusable by law enforcement officials. It abandoned the plan after the FBI complained it would harm investigations.
NeuralHash
In August 2021, Apple announced NeuralHash, which it designed to balance the detection of child abuse material with privacy by scanning images on users’ devices before upload.
Security researchers criticized the system, warning it could yield false reports of abuse material, and privacy advocates objected that it could be expanded to enable government surveillance.
A month later, Apple delayed introduction of NeuralHash, then canceled it in December 2022, the state said in its lawsuit. That same month, Apple launched an option for end-to-end encryption for iCloud data.
The state said Apple engaged in unfair or deceptive practices prohibited by state law through its promotion of NeuralHash, which West Virginia called inferior to other tools and easy to evade. The state contended that Apple broke its promise to combat child sex abuse material when it quietly abandoned the program.
The lawsuit also accuses Apple of creating a public nuisance by designing its products in ways that allow users to collect, store and spread child sexual abuse material. The state argues that Apple’s design choices have allowed illegal content to persist and evade law enforcement, causing widespread harm to West Virginia’s public health and child-protection systems.
While Apple did not go through with the effort to scan images being uploaded to iCloud, it did implement a feature called Communication Safety that blurs nudity and other sensitive content being sent to or from a child’s device.
Federal law requires U.S.-based technology companies to report abuse material to the National Center for Missing and Exploited Children. Apple in 2023 made 267 reports, far fewer than the 1.47 million by Google and 30.6 million by Meta Platforms, the state said.
The state’s claims mirror allegations in a proposed class action lawsuit filed against Apple in late 2024 in federal court in California by individuals depicted in such images.
Apple has moved to dismiss that lawsuit, saying Section 230 of the Communications Decency Act shields it from liability. The law provides broad protections to internet companies from lawsuits over content generated by users.
—
(Reporting by Nate Raymond in Boston and Stephen Nellis in San Francisco; Editing by Alexia Garamfalvi, Christopher Cushing, Franklin Paul and David Gregorio)