New Mexico’s attorney general has filed a lawsuit against Snap, accusing the company of failing to protect children from sextortion, sexual exploitation and other harms on Snapchat. The suit contends that Snapchat’s features “foster the sharing of child sexual abuse material (CSAM) and facilitate child sexual exploitation.”
The state’s Department of Justice carried out a months-long investigation into Snapchat and discovered a “vast network of dark web sites dedicated to sharing stolen, non-consensual sexual images from Snap.” It claims to have found more than 10,000 records related to Snap and child sexual abuse material “in the last year alone,” and says Snapchat was “by far” the biggest source of images and videos on the dark web sites that it examined.
In its complaint, the agency accused the app of being “a breeding ground for predators to collect sexually explicit images of children and to find, groom and extort them.” It states that “criminals circulate sextortion scripts” containing instructions on how to victimize minors, and claims that these documents are publicly available and actively being used against victims, yet “have not yet been blacklisted by . . . Snapchat.”
Furthermore, investigators determined that many accounts openly sharing and selling CSAM on Snapchat are linked to one another through the app’s recommendation algorithm. The suit also claims that “Snap designed its platform specifically to make it addicting to young people, which has led some of its users to depression, anxiety, sleep deprivation, body dysmorphia and other mental health issues.”
The Snapchat complaint follows a similar child safety suit that the state filed against Meta. Engadget has contacted Snap for comment.
“Our undercover investigation revealed that Snapchat’s harmful design features create an environment where predators can easily target children through sextortion schemes and other forms of sexual abuse,” Attorney General Raúl Torrez said in a statement. “Snap has misled users into believing that photos and videos sent on their platform will disappear, but predators can permanently capture this content and they have created a virtual yearbook of child sexual images that are traded, sold and stored indefinitely. Through our litigation against Meta and Snap, the New Mexico Department of Justice will continue to hold these platforms accountable for prioritizing profits over children’s safety.”