An individual filing a lawsuit has accused Apple of inadequately addressing the issue of child sexual abuse material (CSAM) on its iCloud and iMessage platforms.
The plaintiff, a 9-year-old minor represented by a guardian, describes a distressing incident that took place between December 2023 and January 2024. The minor received friend requests from two unknown Snapchat users, who then asked for the minor’s iCloud ID (iCloud is Apple’s cloud storage service). The individuals subsequently sent the minor, via iMessage, five videos depicting young children engaged in sexual intercourse, and solicited explicit videos from the minor through the same messaging service.
As a result of this traumatic interaction, the plaintiff has suffered severe mental and physical harm, leading to a need for psychotherapy and mental health care, as stated in the lawsuit.
The Epoch Times reached out to Apple for a comment, but no response has been received thus far.
The lawsuit, proposed as a class action, accuses Apple of using its stated commitment to privacy as a pretense for ignoring the proliferation of child sexual abuse material on iCloud, pointing to Apple’s abandonment of its NeuralHash CSAM-scanning tool as evidence.
Apple had cautioned against such scanning initiatives, warning that they could enable bulk surveillance and leave users fearing they were being screened for political or religious viewpoints, with a chilling effect on free speech.
The complaint accuses Apple of engaging in “privacy-washing,” a deceptive marketing tactic where the company promotes its dedication to consumer privacy without effectively translating those principles into practice.
The complaint also claims that Apple has consistently underreported CSAM instances to agencies like the National Center for Missing & Exploited Children (NCMEC).
The complaint alleges that, despite its significant resources, Apple chooses not to adopt industry standards for CSAM detection and instead shifts the burden and cost of creating a safe user experience onto children and their families, as exemplified by its decision to abandon its CSAM detection tool entirely.
In another lawsuit involving Apple, an employee stated in a chat conversation cited in that case that Apple’s strong focus on privacy made it a prime platform for distributing child pornography.
According to the complaint, this action effectively violated the privacy of anonymous Chinese citizens in a country internationally recognized for its persecution of dissidents.
Apple, Microsoft, and Meta are among the most influential companies identified as contributing to the child sexual abuse crisis and enabling the proliferation of image-based sexual abuse.
Lina Nealon, Vice President and Director of Corporate Advocacy at the National Center on Sexual Exploitation (NCOSE), stated, “Their products and policies are also undoubtedly fueling the child sexual abuse crisis and enabling the proliferation of image-based sexual abuse.”
Instead of adequately allocating resources to prevent exploitation of both children and adults, these tech companies prioritize profitability and engage in an AI arms race.
The Act aims to hold tech firms accountable for CSAM by granting victims of child sexual exploitation the ability to bring a civil cause of action against the companies that host, store, or make available such material.
The bill also allows for restitution for victims of child exploitation and empowers them to request the removal of CSAM content from tech platforms. Failure to comply with removal requests could result in administrative penalties for the platforms.