Apple is being sued by child sexual abuse victims over its failure to implement its planned iCloud scanning system for child sexual abuse material (CSAM), the New York Times reported. In 2021, Apple announced that it was working on a tool to detect CSAM that would flag images showing such abuse and notify the National Center for Missing and Exploited Children. But the company faced immediate backlash over the technology’s privacy implications and ultimately abandoned the plan.
The lawsuit, filed Saturday in Northern California, seeks more than $1.2 billion in damages for a group of 2,680 potential victims, according to the NYT. It alleges that after Apple showed off its planned child safety tools, the company failed to “implement those designs or take steps to detect and restrict” CSAM on its devices, allowing the images to continue circulating and causing harm to the victims. Engadget has reached out to Apple for comment.
In a statement to the New York Times about the lawsuit, Apple spokesperson Fred Sainz said: “Child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.” The lawsuit comes just months after Apple was accused by Britain’s National Society for the Prevention of Cruelty to Children (NSPCC) of under-reporting CSAM.
